Dec 16 13:09:49.798912 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Dec 12 15:21:28 -00 2025 Dec 16 13:09:49.798944 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022 Dec 16 13:09:49.798956 kernel: BIOS-provided physical RAM map: Dec 16 13:09:49.798962 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Dec 16 13:09:49.798968 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable Dec 16 13:09:49.798973 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Dec 16 13:09:49.798980 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable Dec 16 13:09:49.798986 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Dec 16 13:09:49.798992 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable Dec 16 13:09:49.799000 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Dec 16 13:09:49.799006 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e73efff] usable Dec 16 13:09:49.799012 kernel: BIOS-e820: [mem 0x000000007e73f000-0x000000007e7fffff] reserved Dec 16 13:09:49.799017 kernel: BIOS-e820: [mem 0x000000007e800000-0x000000007ea70fff] usable Dec 16 13:09:49.799024 kernel: BIOS-e820: [mem 0x000000007ea71000-0x000000007eb84fff] reserved Dec 16 13:09:49.799031 kernel: BIOS-e820: [mem 0x000000007eb85000-0x000000007f6ecfff] usable Dec 16 13:09:49.799039 kernel: BIOS-e820: [mem 0x000000007f6ed000-0x000000007f96cfff] reserved Dec 16 13:09:49.799045 kernel: BIOS-e820: [mem 0x000000007f96d000-0x000000007f97efff] ACPI data Dec 16 13:09:49.799051 kernel: BIOS-e820: [mem 0x000000007f97f000-0x000000007f9fefff] ACPI NVS Dec 16 13:09:49.799068 kernel: BIOS-e820: [mem 0x000000007f9ff000-0x000000007fe4efff] usable Dec 16 13:09:49.799074 kernel: BIOS-e820: [mem 0x000000007fe4f000-0x000000007fe52fff] reserved Dec 16 13:09:49.799080 kernel: BIOS-e820: [mem 0x000000007fe53000-0x000000007fe54fff] ACPI NVS Dec 16 13:09:49.799086 kernel: BIOS-e820: [mem 0x000000007fe55000-0x000000007febbfff] usable Dec 16 13:09:49.799092 kernel: BIOS-e820: [mem 0x000000007febc000-0x000000007ff3ffff] reserved Dec 16 13:09:49.799098 kernel: BIOS-e820: [mem 0x000000007ff40000-0x000000007fffffff] ACPI NVS Dec 16 13:09:49.799104 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Dec 16 13:09:49.799112 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Dec 16 13:09:49.799118 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Dec 16 13:09:49.799124 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000047fffffff] usable Dec 16 13:09:49.799130 kernel: NX (Execute Disable) protection: active Dec 16 13:09:49.799136 kernel: APIC: Static calls initialized Dec 16 13:09:49.799142 kernel: e820: update [mem 0x7dd4e018-0x7dd57a57] usable ==> usable Dec 16 13:09:49.799149 kernel: e820: update [mem 0x7dd26018-0x7dd4d457] usable ==> usable Dec 16 13:09:49.799155 kernel: extended physical RAM map: Dec 16 13:09:49.799161 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Dec 16 13:09:49.799167 
kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable Dec 16 13:09:49.799173 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Dec 16 13:09:49.799181 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable Dec 16 13:09:49.799187 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Dec 16 13:09:49.799193 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable Dec 16 13:09:49.799200 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Dec 16 13:09:49.799209 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007dd26017] usable Dec 16 13:09:49.799216 kernel: reserve setup_data: [mem 0x000000007dd26018-0x000000007dd4d457] usable Dec 16 13:09:49.799222 kernel: reserve setup_data: [mem 0x000000007dd4d458-0x000000007dd4e017] usable Dec 16 13:09:49.799231 kernel: reserve setup_data: [mem 0x000000007dd4e018-0x000000007dd57a57] usable Dec 16 13:09:49.799237 kernel: reserve setup_data: [mem 0x000000007dd57a58-0x000000007e73efff] usable Dec 16 13:09:49.799243 kernel: reserve setup_data: [mem 0x000000007e73f000-0x000000007e7fffff] reserved Dec 16 13:09:49.799250 kernel: reserve setup_data: [mem 0x000000007e800000-0x000000007ea70fff] usable Dec 16 13:09:49.799256 kernel: reserve setup_data: [mem 0x000000007ea71000-0x000000007eb84fff] reserved Dec 16 13:09:49.799262 kernel: reserve setup_data: [mem 0x000000007eb85000-0x000000007f6ecfff] usable Dec 16 13:09:49.799269 kernel: reserve setup_data: [mem 0x000000007f6ed000-0x000000007f96cfff] reserved Dec 16 13:09:49.799275 kernel: reserve setup_data: [mem 0x000000007f96d000-0x000000007f97efff] ACPI data Dec 16 13:09:49.799281 kernel: reserve setup_data: [mem 0x000000007f97f000-0x000000007f9fefff] ACPI NVS Dec 16 13:09:49.799290 kernel: reserve setup_data: [mem 0x000000007f9ff000-0x000000007fe4efff] usable Dec 16 13:09:49.799304 kernel: reserve setup_data: [mem 0x000000007fe4f000-0x000000007fe52fff] reserved Dec 16 13:09:49.799310 kernel: reserve setup_data: [mem 0x000000007fe53000-0x000000007fe54fff] ACPI NVS Dec 16 13:09:49.799317 kernel: reserve setup_data: [mem 0x000000007fe55000-0x000000007febbfff] usable Dec 16 13:09:49.799323 kernel: reserve setup_data: [mem 0x000000007febc000-0x000000007ff3ffff] reserved Dec 16 13:09:49.799330 kernel: reserve setup_data: [mem 0x000000007ff40000-0x000000007fffffff] ACPI NVS Dec 16 13:09:49.799336 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Dec 16 13:09:49.799343 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Dec 16 13:09:49.799349 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Dec 16 13:09:49.799356 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000047fffffff] usable Dec 16 13:09:49.799362 kernel: efi: EFI v2.7 by EDK II Dec 16 13:09:49.799371 kernel: efi: SMBIOS=0x7f772000 ACPI=0x7f97e000 ACPI 2.0=0x7f97e014 MEMATTR=0x7e282018 RNG=0x7f972018 Dec 16 13:09:49.799378 kernel: random: crng init done Dec 16 13:09:49.799384 kernel: efi: Remove mem152: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Dec 16 13:09:49.799391 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Dec 16 13:09:49.799397 kernel: secureboot: Secure boot disabled Dec 16 13:09:49.799403 kernel: SMBIOS 2.8 present. 
Dec 16 13:09:49.799410 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022 Dec 16 13:09:49.799416 kernel: DMI: Memory slots populated: 1/1 Dec 16 13:09:49.799422 kernel: Hypervisor detected: KVM Dec 16 13:09:49.799429 kernel: last_pfn = 0x7febc max_arch_pfn = 0x10000000000 Dec 16 13:09:49.799435 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Dec 16 13:09:49.799442 kernel: kvm-clock: using sched offset of 7014589615 cycles Dec 16 13:09:49.799450 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Dec 16 13:09:49.799458 kernel: tsc: Detected 2294.608 MHz processor Dec 16 13:09:49.799465 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Dec 16 13:09:49.799471 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Dec 16 13:09:49.799478 kernel: last_pfn = 0x480000 max_arch_pfn = 0x10000000000 Dec 16 13:09:49.799485 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Dec 16 13:09:49.799492 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Dec 16 13:09:49.799498 kernel: last_pfn = 0x7febc max_arch_pfn = 0x10000000000 Dec 16 13:09:49.799505 kernel: Using GB pages for direct mapping Dec 16 13:09:49.799514 kernel: ACPI: Early table checksum verification disabled Dec 16 13:09:49.799520 kernel: ACPI: RSDP 0x000000007F97E014 000024 (v02 BOCHS ) Dec 16 13:09:49.799527 kernel: ACPI: XSDT 0x000000007F97D0E8 00004C (v01 BOCHS BXPC 00000001 01000013) Dec 16 13:09:49.799534 kernel: ACPI: FACP 0x000000007F977000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 13:09:49.799540 kernel: ACPI: DSDT 0x000000007F978000 004441 (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 13:09:49.799547 kernel: ACPI: FACS 0x000000007F9DD000 000040 Dec 16 13:09:49.799553 kernel: ACPI: APIC 0x000000007F976000 0000B0 (v03 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 13:09:49.799560 kernel: ACPI: MCFG 0x000000007F975000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 13:09:49.799566 kernel: ACPI: WAET 0x000000007F974000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 13:09:49.799575 kernel: ACPI: BGRT 0x000000007F973000 000038 (v01 INTEL EDK2 00000002 01000013) Dec 16 13:09:49.799581 kernel: ACPI: Reserving FACP table memory at [mem 0x7f977000-0x7f9770f3] Dec 16 13:09:49.799588 kernel: ACPI: Reserving DSDT table memory at [mem 0x7f978000-0x7f97c440] Dec 16 13:09:49.799595 kernel: ACPI: Reserving FACS table memory at [mem 0x7f9dd000-0x7f9dd03f] Dec 16 13:09:49.799601 kernel: ACPI: Reserving APIC table memory at [mem 0x7f976000-0x7f9760af] Dec 16 13:09:49.799608 kernel: ACPI: Reserving MCFG table memory at [mem 0x7f975000-0x7f97503b] Dec 16 13:09:49.799614 kernel: ACPI: Reserving WAET table memory at [mem 0x7f974000-0x7f974027] Dec 16 13:09:49.799621 kernel: ACPI: Reserving BGRT table memory at [mem 0x7f973000-0x7f973037] Dec 16 13:09:49.799627 kernel: No NUMA configuration found Dec 16 13:09:49.799636 kernel: Faking a node at [mem 0x0000000000000000-0x000000047fffffff] Dec 16 13:09:49.799642 kernel: NODE_DATA(0) allocated [mem 0x47fff8dc0-0x47fffffff] Dec 16 13:09:49.799649 kernel: Zone ranges: Dec 16 13:09:49.799656 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Dec 16 13:09:49.799662 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Dec 16 13:09:49.799669 kernel: Normal [mem 0x0000000100000000-0x000000047fffffff] Dec 16 13:09:49.799675 kernel: Device empty Dec 16 13:09:49.799682 kernel: Movable zone start for each node Dec 
16 13:09:49.799691 kernel: Early memory node ranges Dec 16 13:09:49.799697 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Dec 16 13:09:49.799707 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff] Dec 16 13:09:49.799716 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff] Dec 16 13:09:49.799726 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff] Dec 16 13:09:49.799735 kernel: node 0: [mem 0x0000000000900000-0x000000007e73efff] Dec 16 13:09:49.799743 kernel: node 0: [mem 0x000000007e800000-0x000000007ea70fff] Dec 16 13:09:49.799752 kernel: node 0: [mem 0x000000007eb85000-0x000000007f6ecfff] Dec 16 13:09:49.799765 kernel: node 0: [mem 0x000000007f9ff000-0x000000007fe4efff] Dec 16 13:09:49.799774 kernel: node 0: [mem 0x000000007fe55000-0x000000007febbfff] Dec 16 13:09:49.799781 kernel: node 0: [mem 0x0000000100000000-0x000000047fffffff] Dec 16 13:09:49.799788 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000047fffffff] Dec 16 13:09:49.799796 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Dec 16 13:09:49.799803 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Dec 16 13:09:49.799812 kernel: On node 0, zone DMA: 8 pages in unavailable ranges Dec 16 13:09:49.799819 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Dec 16 13:09:49.799826 kernel: On node 0, zone DMA: 239 pages in unavailable ranges Dec 16 13:09:49.799834 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges Dec 16 13:09:49.799841 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges Dec 16 13:09:49.799850 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Dec 16 13:09:49.799857 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges Dec 16 13:09:49.799865 kernel: On node 0, zone Normal: 324 pages in unavailable ranges Dec 16 13:09:49.799872 kernel: ACPI: PM-Timer IO Port: 0x608 Dec 16 13:09:49.799879 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Dec 16 13:09:49.799886 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Dec 16 13:09:49.799893 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Dec 16 13:09:49.799901 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Dec 16 13:09:49.799908 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Dec 16 13:09:49.799917 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Dec 16 13:09:49.799924 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Dec 16 13:09:49.799932 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Dec 16 13:09:49.799939 kernel: TSC deadline timer available Dec 16 13:09:49.799946 kernel: CPU topo: Max. logical packages: 8 Dec 16 13:09:49.799953 kernel: CPU topo: Max. logical dies: 8 Dec 16 13:09:49.799960 kernel: CPU topo: Max. dies per package: 1 Dec 16 13:09:49.799967 kernel: CPU topo: Max. threads per core: 1 Dec 16 13:09:49.799975 kernel: CPU topo: Num. cores per package: 1 Dec 16 13:09:49.799983 kernel: CPU topo: Num. 
threads per package: 1 Dec 16 13:09:49.799990 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs Dec 16 13:09:49.799998 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Dec 16 13:09:49.800005 kernel: kvm-guest: KVM setup pv remote TLB flush Dec 16 13:09:49.800012 kernel: kvm-guest: setup PV sched yield Dec 16 13:09:49.800019 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices Dec 16 13:09:49.800026 kernel: Booting paravirtualized kernel on KVM Dec 16 13:09:49.800036 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Dec 16 13:09:49.800044 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1 Dec 16 13:09:49.800053 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Dec 16 13:09:49.800060 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Dec 16 13:09:49.800067 kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 Dec 16 13:09:49.800074 kernel: kvm-guest: PV spinlocks enabled Dec 16 13:09:49.800082 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Dec 16 13:09:49.800090 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022 Dec 16 13:09:49.800097 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Dec 16 13:09:49.800105 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear) Dec 16 13:09:49.800114 kernel: Fallback order for Node 0: 0 Dec 16 13:09:49.800121 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4192374 Dec 16 13:09:49.800129 kernel: Policy zone: Normal Dec 16 13:09:49.800136 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Dec 16 13:09:49.800143 kernel: software IO TLB: area num 8. Dec 16 13:09:49.800150 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1 Dec 16 13:09:49.800157 kernel: ftrace: allocating 40103 entries in 157 pages Dec 16 13:09:49.800165 kernel: ftrace: allocated 157 pages with 5 groups Dec 16 13:09:49.800172 kernel: Dynamic Preempt: voluntary Dec 16 13:09:49.800181 kernel: rcu: Preemptible hierarchical RCU implementation. Dec 16 13:09:49.800189 kernel: rcu: RCU event tracing is enabled. Dec 16 13:09:49.800197 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=8. Dec 16 13:09:49.800204 kernel: Trampoline variant of Tasks RCU enabled. Dec 16 13:09:49.800212 kernel: Rude variant of Tasks RCU enabled. Dec 16 13:09:49.800219 kernel: Tracing variant of Tasks RCU enabled. Dec 16 13:09:49.800226 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Dec 16 13:09:49.800234 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8 Dec 16 13:09:49.800241 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8. Dec 16 13:09:49.800248 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8. Dec 16 13:09:49.800258 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8. 
Dec 16 13:09:49.800265 kernel: NR_IRQS: 33024, nr_irqs: 488, preallocated irqs: 16 Dec 16 13:09:49.800272 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Dec 16 13:09:49.800279 kernel: Console: colour dummy device 80x25 Dec 16 13:09:49.800287 kernel: printk: legacy console [tty0] enabled Dec 16 13:09:49.800294 kernel: printk: legacy console [ttyS0] enabled Dec 16 13:09:49.800308 kernel: ACPI: Core revision 20240827 Dec 16 13:09:49.800315 kernel: APIC: Switch to symmetric I/O mode setup Dec 16 13:09:49.800323 kernel: x2apic enabled Dec 16 13:09:49.800332 kernel: APIC: Switched APIC routing to: physical x2apic Dec 16 13:09:49.800340 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Dec 16 13:09:49.800347 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Dec 16 13:09:49.800354 kernel: kvm-guest: setup PV IPIs Dec 16 13:09:49.800361 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Dec 16 13:09:49.800368 kernel: Calibrating delay loop (skipped) preset value.. 4589.21 BogoMIPS (lpj=2294608) Dec 16 13:09:49.800376 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Dec 16 13:09:49.800383 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Dec 16 13:09:49.800390 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Dec 16 13:09:49.800399 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Dec 16 13:09:49.800406 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Dec 16 13:09:49.800413 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Dec 16 13:09:49.800420 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Dec 16 13:09:49.800428 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Dec 16 13:09:49.800435 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Dec 16 13:09:49.800442 kernel: TAA: Mitigation: Clear CPU buffers Dec 16 13:09:49.800449 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers Dec 16 13:09:49.800456 kernel: active return thunk: its_return_thunk Dec 16 13:09:49.800463 kernel: ITS: Mitigation: Aligned branch/return thunks Dec 16 13:09:49.800470 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Dec 16 13:09:49.800479 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Dec 16 13:09:49.800486 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Dec 16 13:09:49.800493 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Dec 16 13:09:49.800500 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Dec 16 13:09:49.800506 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Dec 16 13:09:49.800513 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers' Dec 16 13:09:49.800520 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Dec 16 13:09:49.800531 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Dec 16 13:09:49.800538 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Dec 16 13:09:49.800545 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Dec 16 13:09:49.800556 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8 Dec 16 13:09:49.800565 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format. 
Dec 16 13:09:49.800572 kernel: Freeing SMP alternatives memory: 32K Dec 16 13:09:49.800579 kernel: pid_max: default: 32768 minimum: 301 Dec 16 13:09:49.800586 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Dec 16 13:09:49.800593 kernel: landlock: Up and running. Dec 16 13:09:49.800600 kernel: SELinux: Initializing. Dec 16 13:09:49.800607 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear) Dec 16 13:09:49.800614 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear) Dec 16 13:09:49.800622 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6) Dec 16 13:09:49.800629 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver. Dec 16 13:09:49.800636 kernel: ... version: 2 Dec 16 13:09:49.800645 kernel: ... bit width: 48 Dec 16 13:09:49.800653 kernel: ... generic registers: 8 Dec 16 13:09:49.800660 kernel: ... value mask: 0000ffffffffffff Dec 16 13:09:49.800668 kernel: ... max period: 00007fffffffffff Dec 16 13:09:49.800675 kernel: ... fixed-purpose events: 3 Dec 16 13:09:49.800682 kernel: ... event mask: 00000007000000ff Dec 16 13:09:49.800689 kernel: signal: max sigframe size: 3632 Dec 16 13:09:49.800697 kernel: rcu: Hierarchical SRCU implementation. Dec 16 13:09:49.800704 kernel: rcu: Max phase no-delay instances is 400. Dec 16 13:09:49.800713 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Dec 16 13:09:49.800720 kernel: smp: Bringing up secondary CPUs ... Dec 16 13:09:49.800728 kernel: smpboot: x86: Booting SMP configuration: Dec 16 13:09:49.800735 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 Dec 16 13:09:49.800742 kernel: smp: Brought up 1 node, 8 CPUs Dec 16 13:09:49.800750 kernel: smpboot: Total of 8 processors activated (36713.72 BogoMIPS) Dec 16 13:09:49.800758 kernel: Memory: 16308700K/16769496K available (14336K kernel code, 2444K rwdata, 26064K rodata, 46188K init, 2572K bss, 453240K reserved, 0K cma-reserved) Dec 16 13:09:49.800765 kernel: devtmpfs: initialized Dec 16 13:09:49.800773 kernel: x86/mm: Memory block size: 128MB Dec 16 13:09:49.800782 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes) Dec 16 13:09:49.800790 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes) Dec 16 13:09:49.800797 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes) Dec 16 13:09:49.800804 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7f97f000-0x7f9fefff] (524288 bytes) Dec 16 13:09:49.800812 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fe53000-0x7fe54fff] (8192 bytes) Dec 16 13:09:49.800819 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff40000-0x7fffffff] (786432 bytes) Dec 16 13:09:49.800826 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Dec 16 13:09:49.800834 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear) Dec 16 13:09:49.800841 kernel: pinctrl core: initialized pinctrl subsystem Dec 16 13:09:49.800850 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Dec 16 13:09:49.800857 kernel: audit: initializing netlink subsys (disabled) Dec 16 13:09:49.800865 kernel: audit: type=2000 audit(1765890585.771:1): state=initialized audit_enabled=0 res=1 Dec 16 13:09:49.800872 kernel: thermal_sys: Registered thermal governor 'step_wise' Dec 16 13:09:49.800879 kernel: thermal_sys: Registered 
thermal governor 'user_space' Dec 16 13:09:49.800886 kernel: cpuidle: using governor menu Dec 16 13:09:49.800893 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Dec 16 13:09:49.800901 kernel: dca service started, version 1.12.1 Dec 16 13:09:49.800908 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Dec 16 13:09:49.800918 kernel: PCI: Using configuration type 1 for base access Dec 16 13:09:49.800925 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Dec 16 13:09:49.800933 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Dec 16 13:09:49.800940 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Dec 16 13:09:49.800947 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Dec 16 13:09:49.800955 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Dec 16 13:09:49.800962 kernel: ACPI: Added _OSI(Module Device) Dec 16 13:09:49.800970 kernel: ACPI: Added _OSI(Processor Device) Dec 16 13:09:49.800977 kernel: ACPI: Added _OSI(Processor Aggregator Device) Dec 16 13:09:49.800986 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Dec 16 13:09:49.800993 kernel: ACPI: Interpreter enabled Dec 16 13:09:49.801001 kernel: ACPI: PM: (supports S0 S3 S5) Dec 16 13:09:49.801008 kernel: ACPI: Using IOAPIC for interrupt routing Dec 16 13:09:49.801015 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Dec 16 13:09:49.801023 kernel: PCI: Using E820 reservations for host bridge windows Dec 16 13:09:49.801030 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Dec 16 13:09:49.801037 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Dec 16 13:09:49.801168 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Dec 16 13:09:49.801246 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Dec 16 13:09:49.801323 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Dec 16 13:09:49.801333 kernel: PCI host bridge to bus 0000:00 Dec 16 13:09:49.801404 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Dec 16 13:09:49.801466 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Dec 16 13:09:49.801527 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Dec 16 13:09:49.801589 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window] Dec 16 13:09:49.801654 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Dec 16 13:09:49.801732 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window] Dec 16 13:09:49.801804 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Dec 16 13:09:49.801903 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Dec 16 13:09:49.801985 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint Dec 16 13:09:49.802059 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref] Dec 16 13:09:49.802129 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref] Dec 16 13:09:49.802197 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff] Dec 16 13:09:49.802270 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Dec 16 13:09:49.802347 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Dec 16 13:09:49.802426 kernel: pci 0000:00:02.0: 
[1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.802496 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff] Dec 16 13:09:49.802568 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Dec 16 13:09:49.802636 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Dec 16 13:09:49.802708 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Dec 16 13:09:49.802777 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:09:49.802852 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.802920 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff] Dec 16 13:09:49.802991 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Dec 16 13:09:49.803068 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Dec 16 13:09:49.803137 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Dec 16 13:09:49.803211 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.803279 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff] Dec 16 13:09:49.803354 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Dec 16 13:09:49.803422 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Dec 16 13:09:49.803489 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Dec 16 13:09:49.803566 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.803635 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff] Dec 16 13:09:49.803726 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Dec 16 13:09:49.803796 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Dec 16 13:09:49.803863 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Dec 16 13:09:49.803936 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.804004 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff] Dec 16 13:09:49.804075 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Dec 16 13:09:49.804143 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Dec 16 13:09:49.804210 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Dec 16 13:09:49.804284 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.804423 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff] Dec 16 13:09:49.804495 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Dec 16 13:09:49.804565 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Dec 16 13:09:49.804635 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Dec 16 13:09:49.804709 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.804778 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff] Dec 16 13:09:49.804846 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Dec 16 13:09:49.804914 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Dec 16 13:09:49.804981 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Dec 16 13:09:49.805057 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.805128 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff] Dec 16 13:09:49.805195 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Dec 16 13:09:49.805263 kernel: pci 0000:00:02.7: bridge window [mem 
0x83200000-0x833fffff] Dec 16 13:09:49.805340 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Dec 16 13:09:49.805415 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.805485 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff] Dec 16 13:09:49.805556 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Dec 16 13:09:49.805625 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Dec 16 13:09:49.805692 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Dec 16 13:09:49.805770 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.805854 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff] Dec 16 13:09:49.805925 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Dec 16 13:09:49.805992 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Dec 16 13:09:49.806063 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Dec 16 13:09:49.806135 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.806203 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff] Dec 16 13:09:49.806271 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Dec 16 13:09:49.806347 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Dec 16 13:09:49.806419 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Dec 16 13:09:49.806493 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.806568 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff] Dec 16 13:09:49.806638 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Dec 16 13:09:49.806706 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Dec 16 13:09:49.806774 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Dec 16 13:09:49.806846 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.806917 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff] Dec 16 13:09:49.806985 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Dec 16 13:09:49.807063 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Dec 16 13:09:49.807134 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Dec 16 13:09:49.807209 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.807278 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff] Dec 16 13:09:49.807357 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Dec 16 13:09:49.807432 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Dec 16 13:09:49.807504 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Dec 16 13:09:49.807578 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.807649 kernel: pci 0000:00:03.6: BAR 0 [mem 0x8438f000-0x8438ffff] Dec 16 13:09:49.807717 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Dec 16 13:09:49.807785 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Dec 16 13:09:49.807852 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Dec 16 13:09:49.807925 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.807997 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff] Dec 16 13:09:49.808065 kernel: pci 0000:00:03.7: PCI 
bridge to [bus 11] Dec 16 13:09:49.808133 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Dec 16 13:09:49.808200 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Dec 16 13:09:49.808274 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.808356 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff] Dec 16 13:09:49.808429 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Dec 16 13:09:49.808497 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Dec 16 13:09:49.808565 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Dec 16 13:09:49.808637 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.808708 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff] Dec 16 13:09:49.808775 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Dec 16 13:09:49.808843 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Dec 16 13:09:49.808911 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Dec 16 13:09:49.808992 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.809061 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff] Dec 16 13:09:49.809131 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Dec 16 13:09:49.809199 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Dec 16 13:09:49.809267 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Dec 16 13:09:49.809350 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.809421 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff] Dec 16 13:09:49.809491 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Dec 16 13:09:49.809559 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Dec 16 13:09:49.809627 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Dec 16 13:09:49.809700 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.809770 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff] Dec 16 13:09:49.809837 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Dec 16 13:09:49.809906 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Dec 16 13:09:49.809978 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Dec 16 13:09:49.810054 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.810124 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff] Dec 16 13:09:49.810191 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Dec 16 13:09:49.810261 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Dec 16 13:09:49.810340 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Dec 16 13:09:49.810417 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.810490 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff] Dec 16 13:09:49.810561 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Dec 16 13:09:49.810629 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Dec 16 13:09:49.810699 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Dec 16 13:09:49.810779 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.810848 kernel: pci 0000:00:04.7: 
BAR 0 [mem 0x84386000-0x84386fff] Dec 16 13:09:49.810915 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Dec 16 13:09:49.810983 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Dec 16 13:09:49.811050 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Dec 16 13:09:49.811136 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.811206 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff] Dec 16 13:09:49.811277 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Dec 16 13:09:49.811357 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Dec 16 13:09:49.811425 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Dec 16 13:09:49.811498 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.811568 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff] Dec 16 13:09:49.811637 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Dec 16 13:09:49.811704 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Dec 16 13:09:49.811775 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Dec 16 13:09:49.811848 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.811916 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff] Dec 16 13:09:49.811985 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Dec 16 13:09:49.812052 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Dec 16 13:09:49.812120 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Dec 16 13:09:49.812196 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.812269 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff] Dec 16 13:09:49.812346 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Dec 16 13:09:49.812414 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Dec 16 13:09:49.812482 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Dec 16 13:09:49.812556 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:09:49.812624 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff] Dec 16 13:09:49.812692 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Dec 16 13:09:49.812761 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Dec 16 13:09:49.812832 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Dec 16 13:09:49.812907 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Dec 16 13:09:49.812977 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Dec 16 13:09:49.813050 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Dec 16 13:09:49.813119 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f] Dec 16 13:09:49.813186 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff] Dec 16 13:09:49.813261 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Dec 16 13:09:49.813340 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f] Dec 16 13:09:49.813418 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Dec 16 13:09:49.813490 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit] Dec 16 13:09:49.813561 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Dec 16 13:09:49.813630 kernel: pci 
0000:01:00.0: bridge window [io 0x6000-0x6fff] Dec 16 13:09:49.813701 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Dec 16 13:09:49.813773 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:09:49.813844 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Dec 16 13:09:49.813923 kernel: pci_bus 0000:02: extended config space not accessible Dec 16 13:09:49.813934 kernel: acpiphp: Slot [1] registered Dec 16 13:09:49.813942 kernel: acpiphp: Slot [0] registered Dec 16 13:09:49.813950 kernel: acpiphp: Slot [2] registered Dec 16 13:09:49.813958 kernel: acpiphp: Slot [3] registered Dec 16 13:09:49.813965 kernel: acpiphp: Slot [4] registered Dec 16 13:09:49.813973 kernel: acpiphp: Slot [5] registered Dec 16 13:09:49.813982 kernel: acpiphp: Slot [6] registered Dec 16 13:09:49.813990 kernel: acpiphp: Slot [7] registered Dec 16 13:09:49.813998 kernel: acpiphp: Slot [8] registered Dec 16 13:09:49.814005 kernel: acpiphp: Slot [9] registered Dec 16 13:09:49.814013 kernel: acpiphp: Slot [10] registered Dec 16 13:09:49.814021 kernel: acpiphp: Slot [11] registered Dec 16 13:09:49.814028 kernel: acpiphp: Slot [12] registered Dec 16 13:09:49.814036 kernel: acpiphp: Slot [13] registered Dec 16 13:09:49.814043 kernel: acpiphp: Slot [14] registered Dec 16 13:09:49.814053 kernel: acpiphp: Slot [15] registered Dec 16 13:09:49.814061 kernel: acpiphp: Slot [16] registered Dec 16 13:09:49.814068 kernel: acpiphp: Slot [17] registered Dec 16 13:09:49.814075 kernel: acpiphp: Slot [18] registered Dec 16 13:09:49.814083 kernel: acpiphp: Slot [19] registered Dec 16 13:09:49.814091 kernel: acpiphp: Slot [20] registered Dec 16 13:09:49.814099 kernel: acpiphp: Slot [21] registered Dec 16 13:09:49.814108 kernel: acpiphp: Slot [22] registered Dec 16 13:09:49.814115 kernel: acpiphp: Slot [23] registered Dec 16 13:09:49.814123 kernel: acpiphp: Slot [24] registered Dec 16 13:09:49.814133 kernel: acpiphp: Slot [25] registered Dec 16 13:09:49.814140 kernel: acpiphp: Slot [26] registered Dec 16 13:09:49.814148 kernel: acpiphp: Slot [27] registered Dec 16 13:09:49.814155 kernel: acpiphp: Slot [28] registered Dec 16 13:09:49.814163 kernel: acpiphp: Slot [29] registered Dec 16 13:09:49.814170 kernel: acpiphp: Slot [30] registered Dec 16 13:09:49.814178 kernel: acpiphp: Slot [31] registered Dec 16 13:09:49.814254 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint Dec 16 13:09:49.814339 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f] Dec 16 13:09:49.814422 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Dec 16 13:09:49.814432 kernel: acpiphp: Slot [0-2] registered Dec 16 13:09:49.814518 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Dec 16 13:09:49.814590 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff] Dec 16 13:09:49.814671 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref] Dec 16 13:09:49.814742 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref] Dec 16 13:09:49.814812 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Dec 16 13:09:49.814826 kernel: acpiphp: Slot [0-3] registered Dec 16 13:09:49.814901 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Dec 16 13:09:49.814973 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff] Dec 16 13:09:49.815043 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref] Dec 16 13:09:49.815121 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Dec 16 13:09:49.815132 
kernel: acpiphp: Slot [0-4] registered Dec 16 13:09:49.815206 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Dec 16 13:09:49.815282 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref] Dec 16 13:09:49.815362 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Dec 16 13:09:49.815372 kernel: acpiphp: Slot [0-5] registered Dec 16 13:09:49.815446 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Dec 16 13:09:49.815518 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff] Dec 16 13:09:49.815589 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref] Dec 16 13:09:49.815659 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Dec 16 13:09:49.815672 kernel: acpiphp: Slot [0-6] registered Dec 16 13:09:49.815741 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Dec 16 13:09:49.815751 kernel: acpiphp: Slot [0-7] registered Dec 16 13:09:49.815819 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Dec 16 13:09:49.815829 kernel: acpiphp: Slot [0-8] registered Dec 16 13:09:49.815896 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Dec 16 13:09:49.815906 kernel: acpiphp: Slot [0-9] registered Dec 16 13:09:49.815974 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Dec 16 13:09:49.815987 kernel: acpiphp: Slot [0-10] registered Dec 16 13:09:49.816055 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Dec 16 13:09:49.816066 kernel: acpiphp: Slot [0-11] registered Dec 16 13:09:49.816132 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Dec 16 13:09:49.816143 kernel: acpiphp: Slot [0-12] registered Dec 16 13:09:49.816211 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Dec 16 13:09:49.816221 kernel: acpiphp: Slot [0-13] registered Dec 16 13:09:49.816288 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Dec 16 13:09:49.816313 kernel: acpiphp: Slot [0-14] registered Dec 16 13:09:49.816383 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Dec 16 13:09:49.816393 kernel: acpiphp: Slot [0-15] registered Dec 16 13:09:49.816461 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Dec 16 13:09:49.816472 kernel: acpiphp: Slot [0-16] registered Dec 16 13:09:49.816538 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Dec 16 13:09:49.816548 kernel: acpiphp: Slot [0-17] registered Dec 16 13:09:49.816630 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Dec 16 13:09:49.816644 kernel: acpiphp: Slot [0-18] registered Dec 16 13:09:49.816713 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Dec 16 13:09:49.816723 kernel: acpiphp: Slot [0-19] registered Dec 16 13:09:49.816789 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Dec 16 13:09:49.816799 kernel: acpiphp: Slot [0-20] registered Dec 16 13:09:49.816866 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Dec 16 13:09:49.816876 kernel: acpiphp: Slot [0-21] registered Dec 16 13:09:49.816945 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Dec 16 13:09:49.816956 kernel: acpiphp: Slot [0-22] registered Dec 16 13:09:49.817023 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Dec 16 13:09:49.817033 kernel: acpiphp: Slot [0-23] registered Dec 16 13:09:49.817100 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Dec 16 13:09:49.817110 kernel: acpiphp: Slot [0-24] registered Dec 16 13:09:49.817177 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Dec 16 13:09:49.817188 kernel: acpiphp: Slot [0-25] registered Dec 16 13:09:49.817257 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Dec 16 13:09:49.817266 kernel: acpiphp: Slot [0-26] registered Dec 16 13:09:49.817342 kernel: pci 0000:00:05.1: PCI bridge to [bus 
1b] Dec 16 13:09:49.817352 kernel: acpiphp: Slot [0-27] registered Dec 16 13:09:49.817419 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Dec 16 13:09:49.817430 kernel: acpiphp: Slot [0-28] registered Dec 16 13:09:49.817495 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Dec 16 13:09:49.817505 kernel: acpiphp: Slot [0-29] registered Dec 16 13:09:49.817575 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Dec 16 13:09:49.817585 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Dec 16 13:09:49.817593 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Dec 16 13:09:49.817601 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Dec 16 13:09:49.817609 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Dec 16 13:09:49.817617 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Dec 16 13:09:49.817624 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Dec 16 13:09:49.817632 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Dec 16 13:09:49.817643 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Dec 16 13:09:49.817654 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Dec 16 13:09:49.817665 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Dec 16 13:09:49.817677 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Dec 16 13:09:49.817688 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Dec 16 13:09:49.817700 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Dec 16 13:09:49.817711 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Dec 16 13:09:49.817724 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Dec 16 13:09:49.817736 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Dec 16 13:09:49.817747 kernel: iommu: Default domain type: Translated Dec 16 13:09:49.817763 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Dec 16 13:09:49.817775 kernel: efivars: Registered efivars operations Dec 16 13:09:49.817789 kernel: PCI: Using ACPI for IRQ routing Dec 16 13:09:49.817800 kernel: PCI: pci_cache_line_size set to 64 bytes Dec 16 13:09:49.817811 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Dec 16 13:09:49.817822 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Dec 16 13:09:49.817833 kernel: e820: reserve RAM buffer [mem 0x7dd26018-0x7fffffff] Dec 16 13:09:49.817840 kernel: e820: reserve RAM buffer [mem 0x7dd4e018-0x7fffffff] Dec 16 13:09:49.817848 kernel: e820: reserve RAM buffer [mem 0x7e73f000-0x7fffffff] Dec 16 13:09:49.817858 kernel: e820: reserve RAM buffer [mem 0x7ea71000-0x7fffffff] Dec 16 13:09:49.817865 kernel: e820: reserve RAM buffer [mem 0x7f6ed000-0x7fffffff] Dec 16 13:09:49.817873 kernel: e820: reserve RAM buffer [mem 0x7fe4f000-0x7fffffff] Dec 16 13:09:49.817881 kernel: e820: reserve RAM buffer [mem 0x7febc000-0x7fffffff] Dec 16 13:09:49.817950 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Dec 16 13:09:49.818019 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Dec 16 13:09:49.818087 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Dec 16 13:09:49.818097 kernel: vgaarb: loaded Dec 16 13:09:49.818107 kernel: clocksource: Switched to clocksource kvm-clock Dec 16 13:09:49.818115 kernel: VFS: Disk quotas dquot_6.6.0 Dec 16 13:09:49.818123 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 16 13:09:49.818131 kernel: pnp: PnP ACPI init Dec 16 13:09:49.818206 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] 
has been reserved Dec 16 13:09:49.818217 kernel: pnp: PnP ACPI: found 5 devices Dec 16 13:09:49.818225 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Dec 16 13:09:49.818233 kernel: NET: Registered PF_INET protocol family Dec 16 13:09:49.818243 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Dec 16 13:09:49.818251 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Dec 16 13:09:49.818259 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 16 13:09:49.818267 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Dec 16 13:09:49.818275 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Dec 16 13:09:49.818282 kernel: TCP: Hash tables configured (established 131072 bind 65536) Dec 16 13:09:49.818290 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 16 13:09:49.818305 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 16 13:09:49.818313 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 16 13:09:49.818322 kernel: NET: Registered PF_XDP protocol family Dec 16 13:09:49.818396 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Dec 16 13:09:49.818465 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Dec 16 13:09:49.818534 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Dec 16 13:09:49.818604 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Dec 16 13:09:49.818674 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Dec 16 13:09:49.818743 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Dec 16 13:09:49.818813 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Dec 16 13:09:49.818886 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Dec 16 13:09:49.818954 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Dec 16 13:09:49.819026 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Dec 16 13:09:49.819105 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Dec 16 13:09:49.819173 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Dec 16 13:09:49.819242 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Dec 16 13:09:49.819318 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Dec 16 13:09:49.819388 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Dec 16 13:09:49.819459 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Dec 16 13:09:49.819534 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Dec 16 13:09:49.819608 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Dec 16 13:09:49.819677 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Dec 16 13:09:49.819746 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Dec 16 13:09:49.819816 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Dec 16 13:09:49.819885 kernel: pci 
0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Dec 16 13:09:49.819953 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Dec 16 13:09:49.820024 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Dec 16 13:09:49.820093 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Dec 16 13:09:49.820161 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Dec 16 13:09:49.820229 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Dec 16 13:09:49.820312 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Dec 16 13:09:49.820381 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Dec 16 13:09:49.820448 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned Dec 16 13:09:49.820518 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned Dec 16 13:09:49.820586 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned Dec 16 13:09:49.820655 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned Dec 16 13:09:49.820723 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned Dec 16 13:09:49.820804 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned Dec 16 13:09:49.820874 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned Dec 16 13:09:49.820942 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned Dec 16 13:09:49.821009 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned Dec 16 13:09:49.821080 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned Dec 16 13:09:49.821148 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned Dec 16 13:09:49.821216 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned Dec 16 13:09:49.821283 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned Dec 16 13:09:49.821361 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.821429 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.821497 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.821565 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.821635 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.821703 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.821772 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.821840 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.821909 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.821977 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.822045 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.822112 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.822182 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.822250 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.822326 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 
13:09:49.822395 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.822463 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.822532 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.822601 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.822669 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.822740 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.822807 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.822875 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.822943 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.823012 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.823092 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.823162 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.823230 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.823307 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.823377 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.823446 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned Dec 16 13:09:49.823514 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned Dec 16 13:09:49.823584 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned Dec 16 13:09:49.823664 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned Dec 16 13:09:49.823755 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned Dec 16 13:09:49.823842 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned Dec 16 13:09:49.823934 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff]: assigned Dec 16 13:09:49.824021 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned Dec 16 13:09:49.824109 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned Dec 16 13:09:49.824196 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned Dec 16 13:09:49.824286 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned Dec 16 13:09:49.824389 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned Dec 16 13:09:49.824473 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned Dec 16 13:09:49.824541 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.824646 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.824716 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.824786 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.824854 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.824922 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.824989 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.825057 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.825126 kernel: pci 
0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.825199 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.825267 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.825343 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.825412 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.825480 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.825549 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.825617 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.825686 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.825757 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.825825 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.825893 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.825962 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.826029 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.826098 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.826166 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.826234 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.826313 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.826382 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.826451 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.826519 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:09:49.826588 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Dec 16 13:09:49.826664 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Dec 16 13:09:49.826735 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Dec 16 13:09:49.826805 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Dec 16 13:09:49.826886 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:09:49.826958 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Dec 16 13:09:49.827032 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Dec 16 13:09:49.827112 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Dec 16 13:09:49.827180 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:09:49.827253 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned Dec 16 13:09:49.827329 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Dec 16 13:09:49.827398 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Dec 16 13:09:49.827468 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Dec 16 13:09:49.827539 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Dec 16 13:09:49.827606 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Dec 16 13:09:49.827675 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Dec 16 
13:09:49.827743 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Dec 16 13:09:49.827811 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Dec 16 13:09:49.827878 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Dec 16 13:09:49.827945 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Dec 16 13:09:49.828012 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Dec 16 13:09:49.828080 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Dec 16 13:09:49.828149 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Dec 16 13:09:49.828218 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Dec 16 13:09:49.828286 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Dec 16 13:09:49.828394 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Dec 16 13:09:49.828464 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Dec 16 13:09:49.828532 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Dec 16 13:09:49.828599 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Dec 16 13:09:49.828668 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Dec 16 13:09:49.828743 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Dec 16 13:09:49.828835 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Dec 16 13:09:49.828913 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Dec 16 13:09:49.828986 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Dec 16 13:09:49.829060 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Dec 16 13:09:49.829128 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Dec 16 13:09:49.829196 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Dec 16 13:09:49.829263 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Dec 16 13:09:49.829340 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Dec 16 13:09:49.829410 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Dec 16 13:09:49.829483 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Dec 16 13:09:49.829556 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Dec 16 13:09:49.829625 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Dec 16 13:09:49.829693 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Dec 16 13:09:49.829762 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Dec 16 13:09:49.829831 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Dec 16 13:09:49.829898 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Dec 16 13:09:49.829967 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Dec 16 13:09:49.830037 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Dec 16 13:09:49.830108 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Dec 16 13:09:49.830177 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Dec 16 13:09:49.830246 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Dec 16 13:09:49.830323 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Dec 16 13:09:49.830393 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Dec 16 13:09:49.830462 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Dec 16 
13:09:49.830530 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Dec 16 13:09:49.830598 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Dec 16 13:09:49.830670 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Dec 16 13:09:49.830738 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Dec 16 13:09:49.830807 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Dec 16 13:09:49.830890 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Dec 16 13:09:49.830970 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Dec 16 13:09:49.831039 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Dec 16 13:09:49.831123 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Dec 16 13:09:49.831195 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Dec 16 13:09:49.831263 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Dec 16 13:09:49.831339 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Dec 16 13:09:49.831409 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Dec 16 13:09:49.831477 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Dec 16 13:09:49.831546 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Dec 16 13:09:49.831615 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Dec 16 13:09:49.831685 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Dec 16 13:09:49.831756 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Dec 16 13:09:49.831824 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Dec 16 13:09:49.831892 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Dec 16 13:09:49.831961 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Dec 16 13:09:49.832029 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff] Dec 16 13:09:49.832097 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Dec 16 13:09:49.832165 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Dec 16 13:09:49.832236 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Dec 16 13:09:49.832317 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Dec 16 13:09:49.832387 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Dec 16 13:09:49.832455 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Dec 16 13:09:49.832524 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Dec 16 13:09:49.832592 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Dec 16 13:09:49.832660 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Dec 16 13:09:49.832728 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Dec 16 13:09:49.832801 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Dec 16 13:09:49.832869 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Dec 16 13:09:49.832937 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Dec 16 13:09:49.833006 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Dec 16 13:09:49.833075 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Dec 16 13:09:49.833144 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Dec 16 13:09:49.833212 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Dec 16 13:09:49.833283 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Dec 16 
13:09:49.833361 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Dec 16 13:09:49.833432 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Dec 16 13:09:49.833500 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Dec 16 13:09:49.833568 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Dec 16 13:09:49.833638 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Dec 16 13:09:49.833706 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Dec 16 13:09:49.833775 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Dec 16 13:09:49.833847 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Dec 16 13:09:49.833916 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Dec 16 13:09:49.833984 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Dec 16 13:09:49.834052 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Dec 16 13:09:49.834119 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Dec 16 13:09:49.834185 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Dec 16 13:09:49.834247 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Dec 16 13:09:49.834317 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Dec 16 13:09:49.834378 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Dec 16 13:09:49.834439 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Dec 16 13:09:49.834498 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Dec 16 13:09:49.834570 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Dec 16 13:09:49.834635 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Dec 16 13:09:49.834698 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:09:49.834770 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Dec 16 13:09:49.834836 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Dec 16 13:09:49.834902 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:09:49.834971 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Dec 16 13:09:49.835035 kernel: pci_bus 0000:03: resource 2 [mem 0x380800000000-0x380fffffffff 64bit pref] Dec 16 13:09:49.835118 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Dec 16 13:09:49.835182 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Dec 16 13:09:49.835253 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Dec 16 13:09:49.835326 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Dec 16 13:09:49.835394 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Dec 16 13:09:49.835458 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Dec 16 13:09:49.835526 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Dec 16 13:09:49.835590 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Dec 16 13:09:49.835661 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Dec 16 13:09:49.835726 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Dec 16 13:09:49.835796 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Dec 16 13:09:49.835859 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Dec 16 13:09:49.835927 kernel: pci_bus 0000:0a: 
resource 1 [mem 0x83000000-0x831fffff] Dec 16 13:09:49.835991 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Dec 16 13:09:49.836062 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Dec 16 13:09:49.836128 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit pref] Dec 16 13:09:49.836194 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Dec 16 13:09:49.836259 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Dec 16 13:09:49.836353 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Dec 16 13:09:49.836422 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Dec 16 13:09:49.836491 kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Dec 16 13:09:49.836555 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Dec 16 13:09:49.836623 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Dec 16 13:09:49.836695 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Dec 16 13:09:49.836765 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Dec 16 13:09:49.836833 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Dec 16 13:09:49.836901 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Dec 16 13:09:49.836966 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Dec 16 13:09:49.837035 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Dec 16 13:09:49.837099 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Dec 16 13:09:49.837163 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Dec 16 13:09:49.837230 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Dec 16 13:09:49.837303 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Dec 16 13:09:49.837368 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Dec 16 13:09:49.837436 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Dec 16 13:09:49.837499 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Dec 16 13:09:49.837562 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Dec 16 13:09:49.837628 kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Dec 16 13:09:49.837695 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Dec 16 13:09:49.837758 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Dec 16 13:09:49.837827 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Dec 16 13:09:49.837891 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] Dec 16 13:09:49.837954 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Dec 16 13:09:49.838020 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Dec 16 13:09:49.838083 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Dec 16 13:09:49.838150 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Dec 16 13:09:49.838218 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Dec 16 13:09:49.838282 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Dec 16 13:09:49.838353 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Dec 16 13:09:49.838422 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Dec 16 13:09:49.838485 kernel: pci_bus 0000:19: resource 1 [mem 
0x81200000-0x813fffff] Dec 16 13:09:49.838548 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Dec 16 13:09:49.838617 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Dec 16 13:09:49.838681 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Dec 16 13:09:49.838744 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Dec 16 13:09:49.838812 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Dec 16 13:09:49.838875 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Dec 16 13:09:49.838938 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Dec 16 13:09:49.839005 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Dec 16 13:09:49.839081 kernel: pci_bus 0000:1c: resource 1 [mem 0x80c00000-0x80dfffff] Dec 16 13:09:49.839145 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Dec 16 13:09:49.839213 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Dec 16 13:09:49.839277 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Dec 16 13:09:49.839362 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit pref] Dec 16 13:09:49.839436 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Dec 16 13:09:49.839504 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Dec 16 13:09:49.839568 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Dec 16 13:09:49.839579 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Dec 16 13:09:49.839587 kernel: PCI: CLS 0 bytes, default 64 Dec 16 13:09:49.839595 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Dec 16 13:09:49.839603 kernel: software IO TLB: mapped [mem 0x0000000077e7e000-0x000000007be7e000] (64MB) Dec 16 13:09:49.839611 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Dec 16 13:09:49.839620 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Dec 16 13:09:49.839630 kernel: Initialise system trusted keyrings Dec 16 13:09:49.839638 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Dec 16 13:09:49.839646 kernel: Key type asymmetric registered Dec 16 13:09:49.839654 kernel: Asymmetric key parser 'x509' registered Dec 16 13:09:49.839661 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Dec 16 13:09:49.839669 kernel: io scheduler mq-deadline registered Dec 16 13:09:49.839677 kernel: io scheduler kyber registered Dec 16 13:09:49.839685 kernel: io scheduler bfq registered Dec 16 13:09:49.839756 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Dec 16 13:09:49.839830 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Dec 16 13:09:49.839900 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Dec 16 13:09:49.839969 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Dec 16 13:09:49.840039 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Dec 16 13:09:49.840108 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Dec 16 13:09:49.840177 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Dec 16 13:09:49.840248 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Dec 16 13:09:49.840330 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Dec 16 13:09:49.840399 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Dec 16 13:09:49.840468 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Dec 16 13:09:49.840538 kernel: pcieport 
0000:00:02.5: AER: enabled with IRQ 29 Dec 16 13:09:49.840608 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Dec 16 13:09:49.840679 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Dec 16 13:09:49.840748 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Dec 16 13:09:49.840816 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Dec 16 13:09:49.840826 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Dec 16 13:09:49.840893 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Dec 16 13:09:49.840962 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Dec 16 13:09:49.841034 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Dec 16 13:09:49.841102 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Dec 16 13:09:49.841170 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Dec 16 13:09:49.841238 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Dec 16 13:09:49.841314 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Dec 16 13:09:49.841385 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Dec 16 13:09:49.841453 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Dec 16 13:09:49.841522 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Dec 16 13:09:49.841590 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Dec 16 13:09:49.841658 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Dec 16 13:09:49.841727 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Dec 16 13:09:49.841797 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Dec 16 13:09:49.841865 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Dec 16 13:09:49.841932 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39 Dec 16 13:09:49.841942 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Dec 16 13:09:49.842008 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Dec 16 13:09:49.842075 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Dec 16 13:09:49.842145 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Dec 16 13:09:49.842214 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Dec 16 13:09:49.842282 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Dec 16 13:09:49.842359 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Dec 16 13:09:49.842428 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Dec 16 13:09:49.842497 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Dec 16 13:09:49.842566 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Dec 16 13:09:49.842634 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Dec 16 13:09:49.842702 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Dec 16 13:09:49.842771 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Dec 16 13:09:49.842838 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Dec 16 13:09:49.842909 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Dec 16 13:09:49.842977 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Dec 16 13:09:49.843045 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Dec 16 13:09:49.843066 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Dec 16 13:09:49.843134 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Dec 16 13:09:49.843202 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 48 Dec 16 13:09:49.843270 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Dec 16 13:09:49.843346 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Dec 16 13:09:49.843418 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Dec 16 13:09:49.843486 kernel: 
pcieport 0000:00:05.2: AER: enabled with IRQ 50 Dec 16 13:09:49.843554 kernel: pcieport 0000:00:05.3: PME: Signaling with IRQ 51 Dec 16 13:09:49.843622 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Dec 16 13:09:49.843690 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Dec 16 13:09:49.843757 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Dec 16 13:09:49.843767 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Dec 16 13:09:49.843775 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 16 13:09:49.843783 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Dec 16 13:09:49.843793 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Dec 16 13:09:49.843801 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Dec 16 13:09:49.843809 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Dec 16 13:09:49.843880 kernel: rtc_cmos 00:03: RTC can wake from S4 Dec 16 13:09:49.843892 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Dec 16 13:09:49.843953 kernel: rtc_cmos 00:03: registered as rtc0 Dec 16 13:09:49.844016 kernel: rtc_cmos 00:03: setting system clock to 2025-12-16T13:09:49 UTC (1765890589) Dec 16 13:09:49.844081 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Dec 16 13:09:49.844091 kernel: intel_pstate: CPU model not supported Dec 16 13:09:49.844099 kernel: efifb: probing for efifb Dec 16 13:09:49.844107 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Dec 16 13:09:49.844114 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Dec 16 13:09:49.844122 kernel: efifb: scrolling: redraw Dec 16 13:09:49.844130 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Dec 16 13:09:49.844138 kernel: Console: switching to colour frame buffer device 160x50 Dec 16 13:09:49.844145 kernel: fb0: EFI VGA frame buffer device Dec 16 13:09:49.844153 kernel: pstore: Using crash dump compression: deflate Dec 16 13:09:49.844163 kernel: pstore: Registered efi_pstore as persistent store backend Dec 16 13:09:49.844171 kernel: NET: Registered PF_INET6 protocol family Dec 16 13:09:49.844179 kernel: Segment Routing with IPv6 Dec 16 13:09:49.844186 kernel: In-situ OAM (IOAM) with IPv6 Dec 16 13:09:49.844194 kernel: NET: Registered PF_PACKET protocol family Dec 16 13:09:49.844202 kernel: Key type dns_resolver registered Dec 16 13:09:49.844209 kernel: IPI shorthand broadcast: enabled Dec 16 13:09:49.844217 kernel: sched_clock: Marking stable (4233002568, 165918610)->(4642494352, -243573174) Dec 16 13:09:49.844225 kernel: registered taskstats version 1 Dec 16 13:09:49.844235 kernel: Loading compiled-in X.509 certificates Dec 16 13:09:49.844243 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 0d0c78e6590cb40d27f1cef749ef9f2f3425f38d' Dec 16 13:09:49.844251 kernel: Demotion targets for Node 0: null Dec 16 13:09:49.844259 kernel: Key type .fscrypt registered Dec 16 13:09:49.844266 kernel: Key type fscrypt-provisioning registered Dec 16 13:09:49.844274 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 16 13:09:49.844281 kernel: ima: Allocated hash algorithm: sha1 Dec 16 13:09:49.844289 kernel: ima: No architecture policies found Dec 16 13:09:49.844313 kernel: clk: Disabling unused clocks Dec 16 13:09:49.844323 kernel: Warning: unable to open an initial console. 
Dec 16 13:09:49.844331 kernel: Freeing unused kernel image (initmem) memory: 46188K Dec 16 13:09:49.844339 kernel: Write protecting the kernel read-only data: 40960k Dec 16 13:09:49.844347 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Dec 16 13:09:49.844354 kernel: Run /init as init process Dec 16 13:09:49.844366 kernel: with arguments: Dec 16 13:09:49.844376 kernel: /init Dec 16 13:09:49.844383 kernel: with environment: Dec 16 13:09:49.844391 kernel: HOME=/ Dec 16 13:09:49.844405 kernel: TERM=linux Dec 16 13:09:49.844415 systemd[1]: Successfully made /usr/ read-only. Dec 16 13:09:49.844426 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 13:09:49.844435 systemd[1]: Detected virtualization kvm. Dec 16 13:09:49.844443 systemd[1]: Detected architecture x86-64. Dec 16 13:09:49.844451 systemd[1]: Running in initrd. Dec 16 13:09:49.844459 systemd[1]: No hostname configured, using default hostname. Dec 16 13:09:49.844470 systemd[1]: Hostname set to . Dec 16 13:09:49.844478 systemd[1]: Initializing machine ID from VM UUID. Dec 16 13:09:49.844497 systemd[1]: Queued start job for default target initrd.target. Dec 16 13:09:49.844507 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 13:09:49.844516 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 13:09:49.844525 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 16 13:09:49.844533 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 13:09:49.844541 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 16 13:09:49.844551 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 16 13:09:49.844562 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Dec 16 13:09:49.844571 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Dec 16 13:09:49.844579 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 13:09:49.844587 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 13:09:49.844595 systemd[1]: Reached target paths.target - Path Units. Dec 16 13:09:49.844604 systemd[1]: Reached target slices.target - Slice Units. Dec 16 13:09:49.844612 systemd[1]: Reached target swap.target - Swaps. Dec 16 13:09:49.844620 systemd[1]: Reached target timers.target - Timer Units. Dec 16 13:09:49.844629 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 13:09:49.844639 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 13:09:49.844647 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 16 13:09:49.844656 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Dec 16 13:09:49.844664 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Dec 16 13:09:49.844672 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 13:09:49.844680 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 13:09:49.844689 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 13:09:49.844697 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 16 13:09:49.844707 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 13:09:49.844715 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 16 13:09:49.844724 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Dec 16 13:09:49.844732 systemd[1]: Starting systemd-fsck-usr.service... Dec 16 13:09:49.844741 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 13:09:49.844749 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 13:09:49.844757 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:09:49.844765 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Dec 16 13:09:49.844776 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 13:09:49.844784 systemd[1]: Finished systemd-fsck-usr.service. Dec 16 13:09:49.844793 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 13:09:49.844823 systemd-journald[278]: Collecting audit messages is disabled. Dec 16 13:09:49.844847 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:09:49.844858 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 13:09:49.844867 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 16 13:09:49.844875 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 13:09:49.844886 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 13:09:49.844895 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 16 13:09:49.844904 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 13:09:49.844912 kernel: Bridge firewalling registered Dec 16 13:09:49.844921 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 13:09:49.844929 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Dec 16 13:09:49.844937 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 13:09:49.844947 systemd-journald[278]: Journal started Dec 16 13:09:49.844968 systemd-journald[278]: Runtime Journal (/run/log/journal/c71939ed77244084a91d974211ac9954) is 8M, max 319.5M, 311.5M free. Dec 16 13:09:49.800581 systemd-modules-load[281]: Inserted module 'overlay' Dec 16 13:09:49.850490 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 13:09:49.831040 systemd-modules-load[281]: Inserted module 'br_netfilter' Dec 16 13:09:49.851940 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 13:09:49.855449 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Dec 16 13:09:49.859365 systemd-tmpfiles[314]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Dec 16 13:09:49.860034 dracut-cmdline[306]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022 Dec 16 13:09:49.862541 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 13:09:49.864255 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 13:09:49.907467 systemd-resolved[342]: Positive Trust Anchors: Dec 16 13:09:49.907481 systemd-resolved[342]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 13:09:49.907512 systemd-resolved[342]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 13:09:49.909744 systemd-resolved[342]: Defaulting to hostname 'linux'. Dec 16 13:09:49.911454 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 13:09:49.912002 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 13:09:49.942352 kernel: SCSI subsystem initialized Dec 16 13:09:49.954340 kernel: Loading iSCSI transport class v2.0-870. Dec 16 13:09:49.965340 kernel: iscsi: registered transport (tcp) Dec 16 13:09:49.993395 kernel: iscsi: registered transport (qla4xxx) Dec 16 13:09:49.994045 kernel: QLogic iSCSI HBA Driver Dec 16 13:09:50.017026 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 13:09:50.047285 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 13:09:50.048883 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 13:09:50.094654 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Dec 16 13:09:50.096385 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Dec 16 13:09:50.161326 kernel: raid6: avx512x4 gen() 43296 MB/s Dec 16 13:09:50.179348 kernel: raid6: avx512x2 gen() 47274 MB/s Dec 16 13:09:50.197342 kernel: raid6: avx512x1 gen() 44261 MB/s Dec 16 13:09:50.214376 kernel: raid6: avx2x4 gen() 34807 MB/s Dec 16 13:09:50.231391 kernel: raid6: avx2x2 gen() 34464 MB/s Dec 16 13:09:50.248783 kernel: raid6: avx2x1 gen() 26812 MB/s Dec 16 13:09:50.248904 kernel: raid6: using algorithm avx512x2 gen() 47274 MB/s Dec 16 13:09:50.268435 kernel: raid6: .... 
xor() 26753 MB/s, rmw enabled Dec 16 13:09:50.268573 kernel: raid6: using avx512x2 recovery algorithm Dec 16 13:09:50.289331 kernel: xor: automatically using best checksumming function avx Dec 16 13:09:50.432336 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 16 13:09:50.439919 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 16 13:09:50.442125 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 13:09:50.477976 systemd-udevd[536]: Using default interface naming scheme 'v255'. Dec 16 13:09:50.482511 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 13:09:50.484341 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Dec 16 13:09:50.513438 dracut-pre-trigger[544]: rd.md=0: removing MD RAID activation Dec 16 13:09:50.539939 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 13:09:50.541529 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 13:09:50.629772 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 13:09:50.631675 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 16 13:09:50.654320 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues Dec 16 13:09:50.665834 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Dec 16 13:09:50.684840 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Dec 16 13:09:50.684891 kernel: GPT:17805311 != 104857599 Dec 16 13:09:50.684902 kernel: GPT:Alternate GPT header not at the end of the disk. Dec 16 13:09:50.684912 kernel: GPT:17805311 != 104857599 Dec 16 13:09:50.684921 kernel: GPT: Use GNU Parted to correct GPT errors. Dec 16 13:09:50.685713 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 13:09:50.693561 kernel: ACPI: bus type USB registered Dec 16 13:09:50.693622 kernel: usbcore: registered new interface driver usbfs Dec 16 13:09:50.696335 kernel: usbcore: registered new interface driver hub Dec 16 13:09:50.697321 kernel: usbcore: registered new device driver usb Dec 16 13:09:50.697359 kernel: cryptd: max_cpu_qlen set to 1000 Dec 16 13:09:50.716314 kernel: AES CTR mode by8 optimization enabled Dec 16 13:09:50.728653 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 13:09:50.730098 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Dec 16 13:09:50.730266 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:09:50.730836 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:09:50.733352 kernel: libata version 3.00 loaded. Dec 16 13:09:50.735221 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Dec 16 13:09:50.741112 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller Dec 16 13:09:50.741279 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1 Dec 16 13:09:50.741388 kernel: uhci_hcd 0000:02:01.0: detected 2 ports Dec 16 13:09:50.741478 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000 Dec 16 13:09:50.743669 kernel: hub 1-0:1.0: USB hub found Dec 16 13:09:50.743845 kernel: hub 1-0:1.0: 2 ports detected Dec 16 13:09:50.747498 kernel: ahci 0000:00:1f.2: version 3.0 Dec 16 13:09:50.747650 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Dec 16 13:09:50.750591 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Dec 16 13:09:50.750705 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Dec 16 13:09:50.752498 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Dec 16 13:09:50.754346 kernel: scsi host0: ahci Dec 16 13:09:50.754493 kernel: scsi host1: ahci Dec 16 13:09:50.755542 kernel: scsi host2: ahci Dec 16 13:09:50.756335 kernel: scsi host3: ahci Dec 16 13:09:50.757726 kernel: scsi host4: ahci Dec 16 13:09:50.757875 kernel: scsi host5: ahci Dec 16 13:09:50.758483 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 67 lpm-pol 1 Dec 16 13:09:50.761726 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 67 lpm-pol 1 Dec 16 13:09:50.761755 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 67 lpm-pol 1 Dec 16 13:09:50.763430 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 67 lpm-pol 1 Dec 16 13:09:50.765270 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 67 lpm-pol 1 Dec 16 13:09:50.765320 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 67 lpm-pol 1 Dec 16 13:09:50.770079 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:09:50.797641 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Dec 16 13:09:50.805344 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Dec 16 13:09:50.813078 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 16 13:09:50.819130 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Dec 16 13:09:50.819636 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Dec 16 13:09:50.821439 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 16 13:09:50.852783 disk-uuid[755]: Primary Header is updated. Dec 16 13:09:50.852783 disk-uuid[755]: Secondary Entries is updated. Dec 16 13:09:50.852783 disk-uuid[755]: Secondary Header is updated. 
Dec 16 13:09:50.862347 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 13:09:50.973342 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Dec 16 13:09:51.080331 kernel: ata6: SATA link down (SStatus 0 SControl 300) Dec 16 13:09:51.080440 kernel: ata4: SATA link down (SStatus 0 SControl 300) Dec 16 13:09:51.081349 kernel: ata1: SATA link down (SStatus 0 SControl 300) Dec 16 13:09:51.083333 kernel: ata5: SATA link down (SStatus 0 SControl 300) Dec 16 13:09:51.086356 kernel: ata2: SATA link down (SStatus 0 SControl 300) Dec 16 13:09:51.088335 kernel: ata3: SATA link down (SStatus 0 SControl 300) Dec 16 13:09:51.099876 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 16 13:09:51.101559 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 13:09:51.102041 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 13:09:51.103072 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 13:09:51.104727 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 16 13:09:51.127559 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 16 13:09:51.157327 kernel: hid: raw HID events driver (C) Jiri Kosina Dec 16 13:09:51.166463 kernel: usbcore: registered new interface driver usbhid Dec 16 13:09:51.166547 kernel: usbhid: USB HID core driver Dec 16 13:09:51.173653 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Dec 16 13:09:51.173722 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0 Dec 16 13:09:51.877407 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 13:09:51.877820 disk-uuid[756]: The operation has completed successfully. Dec 16 13:09:51.937286 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 16 13:09:51.937402 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 16 13:09:51.963996 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Dec 16 13:09:51.995700 sh[782]: Success Dec 16 13:09:52.024349 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 16 13:09:52.024419 kernel: device-mapper: uevent: version 1.0.3 Dec 16 13:09:52.032380 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 16 13:09:52.046344 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Dec 16 13:09:52.164993 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Dec 16 13:09:52.169128 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Dec 16 13:09:52.189456 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Dec 16 13:09:52.215342 kernel: BTRFS: device fsid a6ae7f96-a076-4d3c-81ed-46dd341492f8 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (795) Dec 16 13:09:52.218755 kernel: BTRFS info (device dm-0): first mount of filesystem a6ae7f96-a076-4d3c-81ed-46dd341492f8 Dec 16 13:09:52.218817 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Dec 16 13:09:52.432965 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 16 13:09:52.433072 kernel: BTRFS info (device dm-0): enabling free space tree Dec 16 13:09:52.460236 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. 
Dec 16 13:09:52.461709 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 16 13:09:52.462899 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 16 13:09:52.464114 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 16 13:09:52.465902 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Dec 16 13:09:52.513357 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (830) Dec 16 13:09:52.518324 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:09:52.518387 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 16 13:09:52.529854 kernel: BTRFS info (device vda6): turning on async discard Dec 16 13:09:52.529918 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 13:09:52.538368 kernel: BTRFS info (device vda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:09:52.538907 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 16 13:09:52.541807 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Dec 16 13:09:52.596976 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 13:09:52.599701 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 13:09:52.645696 systemd-networkd[966]: lo: Link UP Dec 16 13:09:52.645707 systemd-networkd[966]: lo: Gained carrier Dec 16 13:09:52.646790 systemd-networkd[966]: Enumeration completed Dec 16 13:09:52.646951 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 13:09:52.647113 systemd-networkd[966]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:09:52.647117 systemd-networkd[966]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 13:09:52.647814 systemd[1]: Reached target network.target - Network. Dec 16 13:09:52.648278 systemd-networkd[966]: eth0: Link UP Dec 16 13:09:52.648423 systemd-networkd[966]: eth0: Gained carrier Dec 16 13:09:52.648434 systemd-networkd[966]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:09:52.677385 systemd-networkd[966]: eth0: DHCPv4 address 10.0.25.207/25, gateway 10.0.25.129 acquired from 10.0.25.129 Dec 16 13:09:52.761084 ignition[907]: Ignition 2.22.0 Dec 16 13:09:52.761100 ignition[907]: Stage: fetch-offline Dec 16 13:09:52.761182 ignition[907]: no configs at "/usr/lib/ignition/base.d" Dec 16 13:09:52.761193 ignition[907]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:09:52.761287 ignition[907]: parsed url from cmdline: "" Dec 16 13:09:52.763687 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 13:09:52.761291 ignition[907]: no config URL provided Dec 16 13:09:52.761311 ignition[907]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 13:09:52.761319 ignition[907]: no config at "/usr/lib/ignition/user.ign" Dec 16 13:09:52.765601 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Dec 16 13:09:52.761325 ignition[907]: failed to fetch config: resource requires networking Dec 16 13:09:52.761506 ignition[907]: Ignition finished successfully Dec 16 13:09:52.811671 ignition[986]: Ignition 2.22.0 Dec 16 13:09:52.811693 ignition[986]: Stage: fetch Dec 16 13:09:52.811876 ignition[986]: no configs at "/usr/lib/ignition/base.d" Dec 16 13:09:52.811886 ignition[986]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:09:52.811975 ignition[986]: parsed url from cmdline: "" Dec 16 13:09:52.811979 ignition[986]: no config URL provided Dec 16 13:09:52.811984 ignition[986]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 13:09:52.811991 ignition[986]: no config at "/usr/lib/ignition/user.ign" Dec 16 13:09:52.812205 ignition[986]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Dec 16 13:09:52.812256 ignition[986]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Dec 16 13:09:52.812343 ignition[986]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Dec 16 13:09:53.149520 ignition[986]: GET result: OK Dec 16 13:09:53.149654 ignition[986]: parsing config with SHA512: 84b5bdc8e2ec04587eab39b79030dde8f4538ad35847451e90814d3b3499ed4e9fc6554e28c517d216d02726fda95becd6dc039329ca708762e621326b147d19 Dec 16 13:09:53.156436 unknown[986]: fetched base config from "system" Dec 16 13:09:53.156451 unknown[986]: fetched base config from "system" Dec 16 13:09:53.156846 ignition[986]: fetch: fetch complete Dec 16 13:09:53.156458 unknown[986]: fetched user config from "openstack" Dec 16 13:09:53.156860 ignition[986]: fetch: fetch passed Dec 16 13:09:53.156904 ignition[986]: Ignition finished successfully Dec 16 13:09:53.159272 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 16 13:09:53.161478 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 16 13:09:53.212410 ignition[997]: Ignition 2.22.0 Dec 16 13:09:53.212430 ignition[997]: Stage: kargs Dec 16 13:09:53.212663 ignition[997]: no configs at "/usr/lib/ignition/base.d" Dec 16 13:09:53.212677 ignition[997]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:09:53.216030 ignition[997]: kargs: kargs passed Dec 16 13:09:53.216174 ignition[997]: Ignition finished successfully Dec 16 13:09:53.219426 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Dec 16 13:09:53.222320 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 16 13:09:53.270091 ignition[1008]: Ignition 2.22.0 Dec 16 13:09:53.270107 ignition[1008]: Stage: disks Dec 16 13:09:53.270280 ignition[1008]: no configs at "/usr/lib/ignition/base.d" Dec 16 13:09:53.270292 ignition[1008]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:09:53.273380 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 16 13:09:53.271196 ignition[1008]: disks: disks passed Dec 16 13:09:53.274278 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 16 13:09:53.271246 ignition[1008]: Ignition finished successfully Dec 16 13:09:53.275117 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 16 13:09:53.276097 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 13:09:53.277112 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 13:09:53.278198 systemd[1]: Reached target basic.target - Basic System. 
Dec 16 13:09:53.280768 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Dec 16 13:09:53.348100 systemd-fsck[1022]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Dec 16 13:09:53.352082 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 16 13:09:53.353714 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 16 13:09:53.584358 kernel: EXT4-fs (vda9): mounted filesystem e48ca59c-1206-4abd-b121-5e9b35e49852 r/w with ordered data mode. Quota mode: none. Dec 16 13:09:53.585707 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 16 13:09:53.586904 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 16 13:09:53.590288 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 13:09:53.592095 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 16 13:09:53.592813 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Dec 16 13:09:53.593525 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Dec 16 13:09:53.593983 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 16 13:09:53.594013 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 13:09:53.610454 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 16 13:09:53.613775 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Dec 16 13:09:53.625369 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1030) Dec 16 13:09:53.629545 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:09:53.629634 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 16 13:09:53.640775 kernel: BTRFS info (device vda6): turning on async discard Dec 16 13:09:53.640857 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 13:09:53.643334 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 16 13:09:53.703342 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:09:53.726953 initrd-setup-root[1059]: cut: /sysroot/etc/passwd: No such file or directory Dec 16 13:09:53.738648 initrd-setup-root[1066]: cut: /sysroot/etc/group: No such file or directory Dec 16 13:09:53.745626 initrd-setup-root[1073]: cut: /sysroot/etc/shadow: No such file or directory Dec 16 13:09:53.752429 initrd-setup-root[1080]: cut: /sysroot/etc/gshadow: No such file or directory Dec 16 13:09:53.898955 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 16 13:09:53.901591 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 16 13:09:53.903326 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 16 13:09:53.927759 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Dec 16 13:09:53.931466 kernel: BTRFS info (device vda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:09:53.960516 systemd-networkd[966]: eth0: Gained IPv6LL Dec 16 13:09:53.963702 ignition[1147]: INFO : Ignition 2.22.0 Dec 16 13:09:53.963702 ignition[1147]: INFO : Stage: mount Dec 16 13:09:53.965194 ignition[1147]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 13:09:53.965194 ignition[1147]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:09:53.965194 ignition[1147]: INFO : mount: mount passed Dec 16 13:09:53.965194 ignition[1147]: INFO : Ignition finished successfully Dec 16 13:09:53.967461 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 16 13:09:53.973712 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 16 13:09:54.763380 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:09:56.777398 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:10:00.789378 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:10:00.801842 coreos-metadata[1032]: Dec 16 13:10:00.801 WARN failed to locate config-drive, using the metadata service API instead Dec 16 13:10:00.824446 coreos-metadata[1032]: Dec 16 13:10:00.824 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 16 13:10:00.987743 coreos-metadata[1032]: Dec 16 13:10:00.987 INFO Fetch successful Dec 16 13:10:00.988454 coreos-metadata[1032]: Dec 16 13:10:00.987 INFO wrote hostname ci-4459-2-2-9-79ca1ea2c9 to /sysroot/etc/hostname Dec 16 13:10:00.990856 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Dec 16 13:10:00.991140 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Dec 16 13:10:00.994996 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 16 13:10:01.030975 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 13:10:01.056330 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1170) Dec 16 13:10:01.061597 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:10:01.061644 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 16 13:10:01.069340 kernel: BTRFS info (device vda6): turning on async discard Dec 16 13:10:01.069394 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 13:10:01.071855 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Dec 16 13:10:01.112859 ignition[1188]: INFO : Ignition 2.22.0 Dec 16 13:10:01.112859 ignition[1188]: INFO : Stage: files Dec 16 13:10:01.114082 ignition[1188]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 13:10:01.114082 ignition[1188]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:10:01.114082 ignition[1188]: DEBUG : files: compiled without relabeling support, skipping Dec 16 13:10:01.115699 ignition[1188]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 16 13:10:01.115699 ignition[1188]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 16 13:10:01.121432 ignition[1188]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 16 13:10:01.122041 ignition[1188]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 16 13:10:01.122551 ignition[1188]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 16 13:10:01.122501 unknown[1188]: wrote ssh authorized keys file for user: core Dec 16 13:10:01.126157 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Dec 16 13:10:01.127051 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Dec 16 13:10:01.198573 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 16 13:10:01.321937 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Dec 16 13:10:01.321937 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 16 13:10:01.323379 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 16 13:10:01.323379 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 16 13:10:01.323379 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 16 13:10:01.323379 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 13:10:01.323379 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 13:10:01.323379 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 13:10:01.323379 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 13:10:01.331957 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 13:10:01.332563 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 13:10:01.332563 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Dec 16 13:10:01.335432 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Dec 16 13:10:01.335432 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Dec 16 13:10:01.339217 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-x86-64.raw: attempt #1 Dec 16 13:10:01.942804 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 16 13:10:02.521548 ignition[1188]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw" Dec 16 13:10:02.521548 ignition[1188]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 16 13:10:02.527714 ignition[1188]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 13:10:02.534553 ignition[1188]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 13:10:02.534553 ignition[1188]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 16 13:10:02.534553 ignition[1188]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Dec 16 13:10:02.538204 ignition[1188]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Dec 16 13:10:02.538204 ignition[1188]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 16 13:10:02.538204 ignition[1188]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 16 13:10:02.538204 ignition[1188]: INFO : files: files passed Dec 16 13:10:02.538204 ignition[1188]: INFO : Ignition finished successfully Dec 16 13:10:02.537683 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 16 13:10:02.539976 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 16 13:10:02.542669 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 16 13:10:02.568707 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 16 13:10:02.568820 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 16 13:10:02.573744 initrd-setup-root-after-ignition[1225]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 13:10:02.573744 initrd-setup-root-after-ignition[1225]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 16 13:10:02.574859 initrd-setup-root-after-ignition[1229]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 13:10:02.575485 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 13:10:02.576774 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 16 13:10:02.578453 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 16 13:10:02.606070 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 16 13:10:02.606216 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
Dec 16 13:10:02.608096 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 16 13:10:02.608974 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 16 13:10:02.610360 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 16 13:10:02.611393 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 16 13:10:02.642456 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 13:10:02.644685 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 16 13:10:02.657468 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 16 13:10:02.658207 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 13:10:02.659415 systemd[1]: Stopped target timers.target - Timer Units. Dec 16 13:10:02.660418 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 16 13:10:02.660532 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 13:10:02.661934 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 16 13:10:02.662980 systemd[1]: Stopped target basic.target - Basic System. Dec 16 13:10:02.663886 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 16 13:10:02.664887 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 13:10:02.665792 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 16 13:10:02.666699 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 16 13:10:02.667620 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 16 13:10:02.668535 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 13:10:02.669386 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 16 13:10:02.670195 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 16 13:10:02.671012 systemd[1]: Stopped target swap.target - Swaps. Dec 16 13:10:02.671831 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 16 13:10:02.671943 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 16 13:10:02.673273 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 16 13:10:02.674251 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 13:10:02.675126 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 16 13:10:02.675267 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 13:10:02.676060 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 16 13:10:02.676170 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 16 13:10:02.677525 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 16 13:10:02.677628 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 13:10:02.678483 systemd[1]: ignition-files.service: Deactivated successfully. Dec 16 13:10:02.678568 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 16 13:10:02.680198 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 16 13:10:02.680700 systemd[1]: kmod-static-nodes.service: Deactivated successfully. 
Dec 16 13:10:02.680798 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 13:10:02.682338 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 16 13:10:02.682994 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Dec 16 13:10:02.683131 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 13:10:02.683950 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 16 13:10:02.684036 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 13:10:02.688012 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 16 13:10:02.700537 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 16 13:10:02.719261 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 16 13:10:02.719920 ignition[1249]: INFO : Ignition 2.22.0 Dec 16 13:10:02.719920 ignition[1249]: INFO : Stage: umount Dec 16 13:10:02.720989 ignition[1249]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 13:10:02.720989 ignition[1249]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:10:02.720989 ignition[1249]: INFO : umount: umount passed Dec 16 13:10:02.722057 ignition[1249]: INFO : Ignition finished successfully Dec 16 13:10:02.723907 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 16 13:10:02.724014 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 16 13:10:02.724928 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 16 13:10:02.724972 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 16 13:10:02.725587 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 16 13:10:02.725626 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 16 13:10:02.726353 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 16 13:10:02.726391 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 16 13:10:02.727121 systemd[1]: Stopped target network.target - Network. Dec 16 13:10:02.727864 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 16 13:10:02.727905 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 13:10:02.728681 systemd[1]: Stopped target paths.target - Path Units. Dec 16 13:10:02.729404 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 16 13:10:02.734377 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 13:10:02.734849 systemd[1]: Stopped target slices.target - Slice Units. Dec 16 13:10:02.735650 systemd[1]: Stopped target sockets.target - Socket Units. Dec 16 13:10:02.736471 systemd[1]: iscsid.socket: Deactivated successfully. Dec 16 13:10:02.736514 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 13:10:02.737224 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 16 13:10:02.737257 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 13:10:02.737972 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 16 13:10:02.738026 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 16 13:10:02.738775 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 16 13:10:02.738820 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 16 13:10:02.739729 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
Dec 16 13:10:02.740395 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 16 13:10:02.747242 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 16 13:10:02.747391 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 16 13:10:02.750203 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Dec 16 13:10:02.750519 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 16 13:10:02.750557 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 13:10:02.752014 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Dec 16 13:10:02.753317 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 16 13:10:02.753446 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 16 13:10:02.755222 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Dec 16 13:10:02.755453 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 16 13:10:02.756101 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 16 13:10:02.756142 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 16 13:10:02.757559 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 16 13:10:02.758330 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 16 13:10:02.758391 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 13:10:02.759252 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 16 13:10:02.759288 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 16 13:10:02.760121 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 16 13:10:02.760156 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 16 13:10:02.760889 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 13:10:02.762518 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 16 13:10:02.772733 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 16 13:10:02.772857 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 16 13:10:02.774327 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 16 13:10:02.774377 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 16 13:10:02.775289 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 16 13:10:02.775399 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 16 13:10:02.776944 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 16 13:10:02.777092 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 13:10:02.777969 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 16 13:10:02.778003 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 16 13:10:02.778529 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 16 13:10:02.778555 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 13:10:02.779244 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 16 13:10:02.779280 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 16 13:10:02.780376 systemd[1]: dracut-cmdline.service: Deactivated successfully. 
Dec 16 13:10:02.780411 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 16 13:10:02.781561 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 16 13:10:02.781605 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 13:10:02.783400 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Dec 16 13:10:02.783940 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 16 13:10:02.783993 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 13:10:02.785477 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 16 13:10:02.785520 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 13:10:02.786398 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 13:10:02.786439 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:10:02.811156 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 16 13:10:02.811271 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 16 13:10:02.812427 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 16 13:10:02.813700 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 16 13:10:02.823538 systemd[1]: Switching root. Dec 16 13:10:02.870318 systemd-journald[278]: Received SIGTERM from PID 1 (systemd). Dec 16 13:10:02.870391 systemd-journald[278]: Journal stopped Dec 16 13:10:04.056974 kernel: SELinux: policy capability network_peer_controls=1 Dec 16 13:10:04.057054 kernel: SELinux: policy capability open_perms=1 Dec 16 13:10:04.057066 kernel: SELinux: policy capability extended_socket_class=1 Dec 16 13:10:04.057082 kernel: SELinux: policy capability always_check_network=0 Dec 16 13:10:04.057098 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 16 13:10:04.057111 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 16 13:10:04.057121 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 16 13:10:04.057130 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 16 13:10:04.057148 kernel: SELinux: policy capability userspace_initial_context=0 Dec 16 13:10:04.057157 kernel: audit: type=1403 audit(1765890603.036:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Dec 16 13:10:04.057174 systemd[1]: Successfully loaded SELinux policy in 72.217ms. Dec 16 13:10:04.057200 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.749ms. Dec 16 13:10:04.057213 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 13:10:04.057226 systemd[1]: Detected virtualization kvm. Dec 16 13:10:04.057236 systemd[1]: Detected architecture x86-64. Dec 16 13:10:04.057246 systemd[1]: Detected first boot. Dec 16 13:10:04.057257 systemd[1]: Hostname set to <ci-4459-2-2-9-79ca1ea2c9>. Dec 16 13:10:04.057267 systemd[1]: Initializing machine ID from VM UUID. Dec 16 13:10:04.057280 zram_generator::config[1297]: No configuration found.
Dec 16 13:10:04.057292 kernel: Guest personality initialized and is inactive Dec 16 13:10:04.057314 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Dec 16 13:10:04.057324 kernel: Initialized host personality Dec 16 13:10:04.057334 kernel: NET: Registered PF_VSOCK protocol family Dec 16 13:10:04.057344 systemd[1]: Populated /etc with preset unit settings. Dec 16 13:10:04.057355 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Dec 16 13:10:04.057366 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 16 13:10:04.057376 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 16 13:10:04.057389 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 16 13:10:04.057400 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 16 13:10:04.057413 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 16 13:10:04.057423 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 16 13:10:04.057434 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 16 13:10:04.057444 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 16 13:10:04.057455 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Dec 16 13:10:04.057465 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 16 13:10:04.057475 systemd[1]: Created slice user.slice - User and Session Slice. Dec 16 13:10:04.057486 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 13:10:04.057496 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 13:10:04.057508 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 16 13:10:04.057518 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 16 13:10:04.057529 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 16 13:10:04.057540 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 13:10:04.057550 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Dec 16 13:10:04.057560 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 13:10:04.057573 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 13:10:04.057583 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 16 13:10:04.057594 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 16 13:10:04.057604 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 16 13:10:04.057614 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 16 13:10:04.057624 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 13:10:04.057635 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 13:10:04.057646 systemd[1]: Reached target slices.target - Slice Units. Dec 16 13:10:04.057656 systemd[1]: Reached target swap.target - Swaps. Dec 16 13:10:04.057668 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. 
Dec 16 13:10:04.057678 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 16 13:10:04.057688 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 16 13:10:04.057698 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 13:10:04.057709 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 13:10:04.057719 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 13:10:04.057729 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 16 13:10:04.057739 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 16 13:10:04.057750 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 16 13:10:04.057762 systemd[1]: Mounting media.mount - External Media Directory... Dec 16 13:10:04.057772 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:10:04.057782 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 16 13:10:04.057793 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 16 13:10:04.057803 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 16 13:10:04.057813 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 16 13:10:04.057823 systemd[1]: Reached target machines.target - Containers. Dec 16 13:10:04.057834 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 16 13:10:04.057844 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 13:10:04.057858 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 13:10:04.057868 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 16 13:10:04.057878 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 13:10:04.057889 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 13:10:04.057899 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 13:10:04.057909 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 16 13:10:04.057919 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 13:10:04.057935 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 16 13:10:04.057947 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 16 13:10:04.057957 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 16 13:10:04.057967 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 16 13:10:04.057978 systemd[1]: Stopped systemd-fsck-usr.service. Dec 16 13:10:04.057988 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 13:10:04.058000 systemd[1]: Starting systemd-journald.service - Journal Service... 
Dec 16 13:10:04.058010 kernel: loop: module loaded Dec 16 13:10:04.058019 kernel: fuse: init (API version 7.41) Dec 16 13:10:04.058029 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 13:10:04.058040 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 13:10:04.058050 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Dec 16 13:10:04.058061 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 16 13:10:04.058071 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 13:10:04.058082 systemd[1]: verity-setup.service: Deactivated successfully. Dec 16 13:10:04.058094 systemd[1]: Stopped verity-setup.service. Dec 16 13:10:04.058105 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:10:04.058117 kernel: ACPI: bus type drm_connector registered Dec 16 13:10:04.058127 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 16 13:10:04.058157 systemd-journald[1378]: Collecting audit messages is disabled. Dec 16 13:10:04.058189 systemd-journald[1378]: Journal started Dec 16 13:10:04.058210 systemd-journald[1378]: Runtime Journal (/run/log/journal/c71939ed77244084a91d974211ac9954) is 8M, max 319.5M, 311.5M free. Dec 16 13:10:03.847844 systemd[1]: Queued start job for default target multi-user.target. Dec 16 13:10:04.058430 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 13:10:03.871447 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Dec 16 13:10:03.871829 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 16 13:10:04.060450 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 16 13:10:04.060963 systemd[1]: Mounted media.mount - External Media Directory. Dec 16 13:10:04.061475 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 16 13:10:04.061960 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 16 13:10:04.062450 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 16 13:10:04.063122 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 16 13:10:04.063805 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 13:10:04.064443 systemd[1]: modprobe@configfs.service: Deactivated successfully. Dec 16 13:10:04.064601 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 16 13:10:04.065199 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 13:10:04.065356 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 13:10:04.065944 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 13:10:04.066076 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 13:10:04.066683 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 13:10:04.066814 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 13:10:04.067416 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 16 13:10:04.067551 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 16 13:10:04.068124 systemd[1]: modprobe@loop.service: Deactivated successfully. 
Dec 16 13:10:04.068251 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 13:10:04.068884 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 13:10:04.069501 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 13:10:04.070091 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 16 13:10:04.070713 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 16 13:10:04.079953 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 13:10:04.081714 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 16 13:10:04.083124 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Dec 16 13:10:04.083599 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 16 13:10:04.083630 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 13:10:04.084822 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 16 13:10:04.102122 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 16 13:10:04.102788 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 13:10:04.103906 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 16 13:10:04.105263 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 16 13:10:04.105787 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 13:10:04.106630 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 16 13:10:04.107143 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 13:10:04.107918 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 13:10:04.110034 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 16 13:10:04.111284 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 16 13:10:04.113213 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 16 13:10:04.113741 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 16 13:10:04.118182 systemd-journald[1378]: Time spent on flushing to /var/log/journal/c71939ed77244084a91d974211ac9954 is 23.220ms for 1704 entries. Dec 16 13:10:04.118182 systemd-journald[1378]: System Journal (/var/log/journal/c71939ed77244084a91d974211ac9954) is 8M, max 584.8M, 576.8M free. Dec 16 13:10:04.163174 systemd-journald[1378]: Received client request to flush runtime journal. Dec 16 13:10:04.163233 kernel: loop0: detected capacity change from 0 to 1640 Dec 16 13:10:04.163259 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 16 13:10:04.125089 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 16 13:10:04.125755 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 16 13:10:04.127316 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... 
Dec 16 13:10:04.132979 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 16 13:10:04.157732 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 13:10:04.164799 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 16 13:10:04.172337 kernel: loop1: detected capacity change from 0 to 128560 Dec 16 13:10:04.172853 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 16 13:10:04.174551 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 13:10:04.188515 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Dec 16 13:10:04.204311 systemd-tmpfiles[1438]: ACLs are not supported, ignoring. Dec 16 13:10:04.204327 systemd-tmpfiles[1438]: ACLs are not supported, ignoring. Dec 16 13:10:04.208567 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 13:10:04.248349 kernel: loop2: detected capacity change from 0 to 110984 Dec 16 13:10:04.326351 kernel: loop3: detected capacity change from 0 to 219144 Dec 16 13:10:04.399366 kernel: loop4: detected capacity change from 0 to 1640 Dec 16 13:10:04.412464 kernel: loop5: detected capacity change from 0 to 128560 Dec 16 13:10:04.434361 kernel: loop6: detected capacity change from 0 to 110984 Dec 16 13:10:04.456341 kernel: loop7: detected capacity change from 0 to 219144 Dec 16 13:10:04.490840 (sd-merge)[1450]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'. Dec 16 13:10:04.491684 (sd-merge)[1450]: Merged extensions into '/usr'. Dec 16 13:10:04.496814 systemd[1]: Reload requested from client PID 1422 ('systemd-sysext') (unit systemd-sysext.service)... Dec 16 13:10:04.496832 systemd[1]: Reloading... Dec 16 13:10:04.545340 zram_generator::config[1476]: No configuration found. Dec 16 13:10:04.705937 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 16 13:10:04.706107 systemd[1]: Reloading finished in 208 ms. Dec 16 13:10:04.740092 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 16 13:10:04.740969 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 16 13:10:04.754607 systemd[1]: Starting ensure-sysext.service... Dec 16 13:10:04.756545 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 13:10:04.758604 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 13:10:04.770445 systemd[1]: Reload requested from client PID 1519 ('systemctl') (unit ensure-sysext.service)... Dec 16 13:10:04.770461 systemd[1]: Reloading... Dec 16 13:10:04.788541 systemd-tmpfiles[1520]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Dec 16 13:10:04.789461 systemd-tmpfiles[1520]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 16 13:10:04.790001 systemd-tmpfiles[1520]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 16 13:10:04.790482 systemd-tmpfiles[1520]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Dec 16 13:10:04.791957 systemd-tmpfiles[1520]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Dec 16 13:10:04.792482 systemd-tmpfiles[1520]: ACLs are not supported, ignoring. 
Dec 16 13:10:04.792589 systemd-tmpfiles[1520]: ACLs are not supported, ignoring. Dec 16 13:10:04.802746 systemd-tmpfiles[1520]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 13:10:04.802769 systemd-tmpfiles[1520]: Skipping /boot Dec 16 13:10:04.810333 zram_generator::config[1549]: No configuration found. Dec 16 13:10:04.812168 systemd-udevd[1521]: Using default interface naming scheme 'v255'. Dec 16 13:10:04.816210 systemd-tmpfiles[1520]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 13:10:04.816231 systemd-tmpfiles[1520]: Skipping /boot Dec 16 13:10:04.900098 ldconfig[1417]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 16 13:10:04.957337 kernel: mousedev: PS/2 mouse device common for all mice Dec 16 13:10:04.974318 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Dec 16 13:10:04.981416 kernel: ACPI: button: Power Button [PWRF] Dec 16 13:10:05.056016 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Dec 16 13:10:05.056262 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Dec 16 13:10:05.055167 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Dec 16 13:10:05.055403 systemd[1]: Reloading finished in 284 ms. Dec 16 13:10:05.071313 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Dec 16 13:10:05.072772 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 13:10:05.073818 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 16 13:10:05.077311 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Dec 16 13:10:05.079397 kernel: Console: switching to colour dummy device 80x25 Dec 16 13:10:05.080043 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 13:10:05.081629 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Dec 16 13:10:05.083249 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 16 13:10:05.083293 kernel: [drm] features: -context_init Dec 16 13:10:05.085309 kernel: [drm] number of scanouts: 1 Dec 16 13:10:05.085344 kernel: [drm] number of cap sets: 0 Dec 16 13:10:05.086313 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Dec 16 13:10:05.090655 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Dec 16 13:10:05.091646 kernel: Console: switching to colour frame buffer device 160x50 Dec 16 13:10:05.096335 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 16 13:10:05.125522 systemd[1]: Finished ensure-sysext.service. Dec 16 13:10:05.130950 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 16 13:10:05.135041 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:10:05.136442 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 13:10:05.139703 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 16 13:10:05.140023 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 13:10:05.141184 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 13:10:05.159101 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... 
Dec 16 13:10:05.161249 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 13:10:05.162631 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 13:10:05.164082 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Dec 16 13:10:05.164893 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 13:10:05.165605 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 16 13:10:05.166361 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 13:10:05.167165 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 16 13:10:05.169116 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 13:10:05.171889 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 13:10:05.173067 systemd[1]: Reached target time-set.target - System Time Set. Dec 16 13:10:05.173974 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 16 13:10:05.176026 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:10:05.176467 kernel: pps_core: LinuxPPS API ver. 1 registered Dec 16 13:10:05.176495 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Dec 16 13:10:05.178086 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:10:05.179007 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 13:10:05.179181 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 13:10:05.180079 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 13:10:05.180239 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 13:10:05.180571 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 13:10:05.180737 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 13:10:05.180975 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 13:10:05.181105 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 13:10:05.184315 kernel: PTP clock support registered Dec 16 13:10:05.187810 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Dec 16 13:10:05.188494 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Dec 16 13:10:05.190292 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 16 13:10:05.192952 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 13:10:05.193067 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 13:10:05.194978 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 16 13:10:05.197342 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. 
Dec 16 13:10:05.208679 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 16 13:10:05.208923 augenrules[1702]: No rules Dec 16 13:10:05.212116 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 13:10:05.212292 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 13:10:05.214347 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 16 13:10:05.237510 systemd[1]: Started systemd-userdbd.service - User Database Manager. Dec 16 13:10:05.238348 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 16 13:10:05.286678 systemd-networkd[1674]: lo: Link UP Dec 16 13:10:05.287036 systemd-networkd[1674]: lo: Gained carrier Dec 16 13:10:05.288249 systemd-networkd[1674]: Enumeration completed Dec 16 13:10:05.288389 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 13:10:05.289359 systemd-networkd[1674]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:10:05.289369 systemd-networkd[1674]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 13:10:05.290544 systemd-networkd[1674]: eth0: Link UP Dec 16 13:10:05.290660 systemd-resolved[1675]: Positive Trust Anchors: Dec 16 13:10:05.290671 systemd-resolved[1675]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 13:10:05.290703 systemd-resolved[1675]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 13:10:05.290847 systemd-networkd[1674]: eth0: Gained carrier Dec 16 13:10:05.290893 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 16 13:10:05.290895 systemd-networkd[1674]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:10:05.294410 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 16 13:10:05.296134 systemd-resolved[1675]: Using system hostname 'ci-4459-2-2-9-79ca1ea2c9'. Dec 16 13:10:05.296917 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:10:05.298390 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 13:10:05.302181 systemd[1]: Reached target network.target - Network. Dec 16 13:10:05.306264 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 13:10:05.322588 systemd-networkd[1674]: eth0: DHCPv4 address 10.0.25.207/25, gateway 10.0.25.129 acquired from 10.0.25.129 Dec 16 13:10:05.327168 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 16 13:10:05.328639 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
Dec 16 13:10:05.330220 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Dec 16 13:10:05.330371 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 13:10:05.333283 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 16 13:10:05.333803 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Dec 16 13:10:05.334199 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Dec 16 13:10:05.334752 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 16 13:10:05.335251 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Dec 16 13:10:05.335629 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Dec 16 13:10:05.335989 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 16 13:10:05.336016 systemd[1]: Reached target paths.target - Path Units. Dec 16 13:10:05.343616 systemd[1]: Reached target timers.target - Timer Units. Dec 16 13:10:05.346356 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 16 13:10:05.348336 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 16 13:10:05.351015 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 16 13:10:05.351958 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 16 13:10:05.352530 systemd[1]: Reached target ssh-access.target - SSH Access Available. Dec 16 13:10:05.364586 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 16 13:10:05.367609 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 16 13:10:05.369945 systemd[1]: Listening on docker.socket - Docker Socket for the API. Dec 16 13:10:05.375130 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 13:10:05.376088 systemd[1]: Reached target basic.target - Basic System. Dec 16 13:10:05.376585 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Dec 16 13:10:05.376617 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 16 13:10:05.383339 systemd[1]: Starting chronyd.service - NTP client/server... Dec 16 13:10:05.385889 systemd[1]: Starting containerd.service - containerd container runtime... Dec 16 13:10:05.388624 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Dec 16 13:10:05.392236 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Dec 16 13:10:05.394593 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 16 13:10:05.397790 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:10:05.398951 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 16 13:10:05.400676 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Dec 16 13:10:05.401933 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). 
Dec 16 13:10:05.402907 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Dec 16 13:10:05.406375 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 16 13:10:05.407773 jq[1737]: false Dec 16 13:10:05.408225 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Dec 16 13:10:05.411550 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Dec 16 13:10:05.413644 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 16 13:10:05.414920 google_oslogin_nss_cache[1739]: oslogin_cache_refresh[1739]: Refreshing passwd entry cache Dec 16 13:10:05.414937 oslogin_cache_refresh[1739]: Refreshing passwd entry cache Dec 16 13:10:05.415526 extend-filesystems[1738]: Found /dev/vda6 Dec 16 13:10:05.417828 systemd[1]: Starting systemd-logind.service - User Login Management... Dec 16 13:10:05.420358 extend-filesystems[1738]: Found /dev/vda9 Dec 16 13:10:05.424387 extend-filesystems[1738]: Checking size of /dev/vda9 Dec 16 13:10:05.424030 oslogin_cache_refresh[1739]: Failure getting users, quitting Dec 16 13:10:05.424866 google_oslogin_nss_cache[1739]: oslogin_cache_refresh[1739]: Failure getting users, quitting Dec 16 13:10:05.424866 google_oslogin_nss_cache[1739]: oslogin_cache_refresh[1739]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 16 13:10:05.424866 google_oslogin_nss_cache[1739]: oslogin_cache_refresh[1739]: Refreshing group entry cache Dec 16 13:10:05.420939 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 16 13:10:05.424048 oslogin_cache_refresh[1739]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 16 13:10:05.421479 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 16 13:10:05.424095 oslogin_cache_refresh[1739]: Refreshing group entry cache Dec 16 13:10:05.422016 systemd[1]: Starting update-engine.service - Update Engine... Dec 16 13:10:05.425629 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 16 13:10:05.429236 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 16 13:10:05.431488 google_oslogin_nss_cache[1739]: oslogin_cache_refresh[1739]: Failure getting groups, quitting Dec 16 13:10:05.431488 google_oslogin_nss_cache[1739]: oslogin_cache_refresh[1739]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 16 13:10:05.430733 oslogin_cache_refresh[1739]: Failure getting groups, quitting Dec 16 13:10:05.430744 oslogin_cache_refresh[1739]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 16 13:10:05.432332 extend-filesystems[1738]: Resized partition /dev/vda9 Dec 16 13:10:05.433790 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 16 13:10:05.434071 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 16 13:10:05.434595 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Dec 16 13:10:05.434827 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. 
Dec 16 13:10:05.435024 chronyd[1730]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Dec 16 13:10:05.435855 chronyd[1730]: Loaded seccomp filter (level 2) Dec 16 13:10:05.436061 systemd[1]: Started chronyd.service - NTP client/server. Dec 16 13:10:05.437079 extend-filesystems[1766]: resize2fs 1.47.3 (8-Jul-2025) Dec 16 13:10:05.440286 jq[1760]: true Dec 16 13:10:05.436848 systemd[1]: motdgen.service: Deactivated successfully. Dec 16 13:10:05.437118 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Dec 16 13:10:05.440718 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 16 13:10:05.440969 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Dec 16 13:10:05.445122 update_engine[1754]: I20251216 13:10:05.445034 1754 main.cc:92] Flatcar Update Engine starting Dec 16 13:10:05.448141 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks Dec 16 13:10:05.455075 (ntainerd)[1770]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Dec 16 13:10:05.458046 jq[1768]: true Dec 16 13:10:05.469403 tar[1767]: linux-amd64/LICENSE Dec 16 13:10:05.469855 tar[1767]: linux-amd64/helm Dec 16 13:10:05.471979 systemd-logind[1750]: New seat seat0. Dec 16 13:10:05.474928 systemd-logind[1750]: Watching system buttons on /dev/input/event3 (Power Button) Dec 16 13:10:05.474952 systemd-logind[1750]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Dec 16 13:10:05.475264 systemd[1]: Started systemd-logind.service - User Login Management. Dec 16 13:10:05.488602 dbus-daemon[1733]: [system] SELinux support is enabled Dec 16 13:10:05.488819 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 16 13:10:05.491255 update_engine[1754]: I20251216 13:10:05.491207 1754 update_check_scheduler.cc:74] Next update check in 10m26s Dec 16 13:10:05.491774 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 16 13:10:05.491806 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 16 13:10:05.492523 dbus-daemon[1733]: [system] Successfully activated service 'org.freedesktop.systemd1' Dec 16 13:10:05.493011 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 16 13:10:05.493027 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 16 13:10:05.502254 systemd[1]: Started update-engine.service - Update Engine. Dec 16 13:10:05.504678 systemd[1]: Started locksmithd.service - Cluster reboot manager. Dec 16 13:10:05.563856 locksmithd[1801]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 16 13:10:05.637345 sshd_keygen[1765]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 16 13:10:05.668726 containerd[1770]: time="2025-12-16T13:10:05Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 16 13:10:05.670157 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. 
Dec 16 13:10:05.675159 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 16 13:10:05.682416 containerd[1770]: time="2025-12-16T13:10:05.682367157Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Dec 16 13:10:05.686048 bash[1800]: Updated "/home/core/.ssh/authorized_keys" Dec 16 13:10:05.686971 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 16 13:10:05.690004 systemd[1]: Starting sshkeys.service... Dec 16 13:10:05.691210 containerd[1770]: time="2025-12-16T13:10:05.691164650Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="13.915µs" Dec 16 13:10:05.691256 containerd[1770]: time="2025-12-16T13:10:05.691212495Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 16 13:10:05.691256 containerd[1770]: time="2025-12-16T13:10:05.691236464Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 16 13:10:05.691445 containerd[1770]: time="2025-12-16T13:10:05.691421417Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 16 13:10:05.691466 containerd[1770]: time="2025-12-16T13:10:05.691445476Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 16 13:10:05.691485 containerd[1770]: time="2025-12-16T13:10:05.691473979Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 13:10:05.691554 containerd[1770]: time="2025-12-16T13:10:05.691539504Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 13:10:05.691574 containerd[1770]: time="2025-12-16T13:10:05.691552406Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 13:10:05.691812 containerd[1770]: time="2025-12-16T13:10:05.691792703Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 13:10:05.691837 containerd[1770]: time="2025-12-16T13:10:05.691812954Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 13:10:05.691837 containerd[1770]: time="2025-12-16T13:10:05.691824488Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 13:10:05.691837 containerd[1770]: time="2025-12-16T13:10:05.691833731Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 16 13:10:05.691943 containerd[1770]: time="2025-12-16T13:10:05.691899342Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 16 13:10:05.692120 containerd[1770]: time="2025-12-16T13:10:05.692094921Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 13:10:05.692143 containerd[1770]: time="2025-12-16T13:10:05.692126949Z" level=info msg="skip loading plugin" error="lstat 
/var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 13:10:05.692143 containerd[1770]: time="2025-12-16T13:10:05.692137895Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 16 13:10:05.692211 containerd[1770]: time="2025-12-16T13:10:05.692165964Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 16 13:10:05.693815 containerd[1770]: time="2025-12-16T13:10:05.693671720Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 16 13:10:05.693815 containerd[1770]: time="2025-12-16T13:10:05.693793035Z" level=info msg="metadata content store policy set" policy=shared Dec 16 13:10:05.695371 systemd[1]: issuegen.service: Deactivated successfully. Dec 16 13:10:05.708715 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 16 13:10:05.713561 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 16 13:10:05.730072 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Dec 16 13:10:05.732089 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Dec 16 13:10:05.753525 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:10:05.739365 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 16 13:10:05.744039 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 16 13:10:05.748004 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Dec 16 13:10:05.750425 systemd[1]: Reached target getty.target - Login Prompts. 
Dec 16 13:10:05.755372 containerd[1770]: time="2025-12-16T13:10:05.755278306Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 16 13:10:05.755440 containerd[1770]: time="2025-12-16T13:10:05.755407041Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 16 13:10:05.755470 containerd[1770]: time="2025-12-16T13:10:05.755433963Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 16 13:10:05.755470 containerd[1770]: time="2025-12-16T13:10:05.755453604Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 16 13:10:05.755506 containerd[1770]: time="2025-12-16T13:10:05.755475123Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 16 13:10:05.755506 containerd[1770]: time="2025-12-16T13:10:05.755493972Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 16 13:10:05.755548 containerd[1770]: time="2025-12-16T13:10:05.755512308Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 16 13:10:05.755567 containerd[1770]: time="2025-12-16T13:10:05.755543573Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 16 13:10:05.755585 containerd[1770]: time="2025-12-16T13:10:05.755563988Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 16 13:10:05.755609 containerd[1770]: time="2025-12-16T13:10:05.755581517Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 16 13:10:05.755609 containerd[1770]: time="2025-12-16T13:10:05.755596818Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 16 13:10:05.755647 containerd[1770]: time="2025-12-16T13:10:05.755624023Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 16 13:10:05.755874 containerd[1770]: time="2025-12-16T13:10:05.755841901Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 16 13:10:05.755901 containerd[1770]: time="2025-12-16T13:10:05.755881705Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 16 13:10:05.755919 containerd[1770]: time="2025-12-16T13:10:05.755904918Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 16 13:10:05.755940 containerd[1770]: time="2025-12-16T13:10:05.755922842Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 16 13:10:05.755962 containerd[1770]: time="2025-12-16T13:10:05.755940628Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 16 13:10:05.755981 containerd[1770]: time="2025-12-16T13:10:05.755960047Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 16 13:10:05.756007 containerd[1770]: time="2025-12-16T13:10:05.755978911Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 16 13:10:05.756007 containerd[1770]: time="2025-12-16T13:10:05.755995383Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 
16 13:10:05.756048 containerd[1770]: time="2025-12-16T13:10:05.756013174Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 16 13:10:05.756068 containerd[1770]: time="2025-12-16T13:10:05.756049224Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 16 13:10:05.756089 containerd[1770]: time="2025-12-16T13:10:05.756069683Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 16 13:10:05.756168 containerd[1770]: time="2025-12-16T13:10:05.756140864Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 16 13:10:05.756190 containerd[1770]: time="2025-12-16T13:10:05.756175341Z" level=info msg="Start snapshots syncer" Dec 16 13:10:05.756231 containerd[1770]: time="2025-12-16T13:10:05.756209095Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 16 13:10:05.756733 containerd[1770]: time="2025-12-16T13:10:05.756660825Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 16 13:10:05.756858 containerd[1770]: time="2025-12-16T13:10:05.756759009Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 16 13:10:05.756858 containerd[1770]: time="2025-12-16T13:10:05.756833748Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 16 13:10:05.757008 containerd[1770]: time="2025-12-16T13:10:05.756976923Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 16 13:10:05.757031 containerd[1770]: time="2025-12-16T13:10:05.757014829Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 16 13:10:05.757050 containerd[1770]: time="2025-12-16T13:10:05.757031775Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 16 13:10:05.757073 containerd[1770]: time="2025-12-16T13:10:05.757047877Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 16 13:10:05.757073 containerd[1770]: time="2025-12-16T13:10:05.757066153Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 16 13:10:05.757111 containerd[1770]: time="2025-12-16T13:10:05.757085775Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 16 13:10:05.757130 containerd[1770]: time="2025-12-16T13:10:05.757115133Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 16 13:10:05.757172 containerd[1770]: time="2025-12-16T13:10:05.757153242Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 16 13:10:05.757194 containerd[1770]: time="2025-12-16T13:10:05.757178175Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 16 13:10:05.757216 containerd[1770]: time="2025-12-16T13:10:05.757196278Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 16 13:10:05.757261 containerd[1770]: time="2025-12-16T13:10:05.757238036Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 13:10:05.757281 containerd[1770]: time="2025-12-16T13:10:05.757262199Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 13:10:05.757281 containerd[1770]: time="2025-12-16T13:10:05.757276858Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 13:10:05.757336 containerd[1770]: time="2025-12-16T13:10:05.757292279Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 13:10:05.757336 containerd[1770]: time="2025-12-16T13:10:05.757324386Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 16 13:10:05.757371 containerd[1770]: time="2025-12-16T13:10:05.757339921Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 16 13:10:05.757394 containerd[1770]: time="2025-12-16T13:10:05.757373288Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 16 13:10:05.757413 containerd[1770]: time="2025-12-16T13:10:05.757400629Z" level=info msg="runtime interface created" Dec 16 13:10:05.757432 containerd[1770]: time="2025-12-16T13:10:05.757410164Z" level=info msg="created NRI interface" Dec 16 13:10:05.757432 containerd[1770]: time="2025-12-16T13:10:05.757425238Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 16 13:10:05.757466 containerd[1770]: time="2025-12-16T13:10:05.757444111Z" level=info msg="Connect containerd service" Dec 16 13:10:05.757484 containerd[1770]: time="2025-12-16T13:10:05.757475188Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 
16 13:10:05.758625 containerd[1770]: time="2025-12-16T13:10:05.758570130Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 13:10:05.870373 containerd[1770]: time="2025-12-16T13:10:05.870263807Z" level=info msg="Start subscribing containerd event" Dec 16 13:10:05.870521 containerd[1770]: time="2025-12-16T13:10:05.870371518Z" level=info msg="Start recovering state" Dec 16 13:10:05.870521 containerd[1770]: time="2025-12-16T13:10:05.870459102Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 16 13:10:05.870591 containerd[1770]: time="2025-12-16T13:10:05.870524589Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 16 13:10:05.870591 containerd[1770]: time="2025-12-16T13:10:05.870529613Z" level=info msg="Start event monitor" Dec 16 13:10:05.870591 containerd[1770]: time="2025-12-16T13:10:05.870559457Z" level=info msg="Start cni network conf syncer for default" Dec 16 13:10:05.870591 containerd[1770]: time="2025-12-16T13:10:05.870582819Z" level=info msg="Start streaming server" Dec 16 13:10:05.870694 containerd[1770]: time="2025-12-16T13:10:05.870597516Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 16 13:10:05.870694 containerd[1770]: time="2025-12-16T13:10:05.870610334Z" level=info msg="runtime interface starting up..." Dec 16 13:10:05.870694 containerd[1770]: time="2025-12-16T13:10:05.870621089Z" level=info msg="starting plugins..." Dec 16 13:10:05.870694 containerd[1770]: time="2025-12-16T13:10:05.870643585Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 16 13:10:05.870907 containerd[1770]: time="2025-12-16T13:10:05.870867197Z" level=info msg="containerd successfully booted in 0.202510s" Dec 16 13:10:05.871031 systemd[1]: Started containerd.service - containerd container runtime. Dec 16 13:10:05.969086 tar[1767]: linux-amd64/README.md Dec 16 13:10:05.994103 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 16 13:10:06.054357 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Dec 16 13:10:06.092375 extend-filesystems[1766]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Dec 16 13:10:06.092375 extend-filesystems[1766]: old_desc_blocks = 1, new_desc_blocks = 6 Dec 16 13:10:06.092375 extend-filesystems[1766]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Dec 16 13:10:06.096531 extend-filesystems[1738]: Resized filesystem in /dev/vda9 Dec 16 13:10:06.093639 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 16 13:10:06.093992 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 16 13:10:06.412350 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:10:06.754355 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:10:07.208665 systemd-networkd[1674]: eth0: Gained IPv6LL Dec 16 13:10:07.212940 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 16 13:10:07.214161 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 13:10:07.216340 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:10:07.221705 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 13:10:07.281966 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
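As an illustration (not itself journal output): the extend-filesystems and EXT4-fs entries above record an online resize of /dev/vda9 from 1617920 to 12499963 blocks, and the log notes these are 4k blocks, so converting the logged block counts into sizes is plain arithmetic:

    # Block counts copied from the EXT4-fs / resize2fs lines above; 4096-byte blocks per the log.
    BLOCK = 4096
    old_blocks, new_blocks = 1_617_920, 12_499_963

    def gib(blocks: int) -> float:
        return blocks * BLOCK / 2**30

    print(f"before resize: {gib(old_blocks):.2f} GiB")  # ~6.2 GiB
    print(f"after resize:  {gib(new_blocks):.2f} GiB")  # ~47.7 GiB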
Dec 16 13:10:08.428457 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:10:08.660655 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:10:08.694337 (kubelet)[1877]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:10:08.768392 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:10:09.465011 kubelet[1877]: E1216 13:10:09.464895 1877 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:10:09.466912 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:10:09.467196 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:10:09.467851 systemd[1]: kubelet.service: Consumed 1.212s CPU time, 262.5M memory peak. Dec 16 13:10:12.447356 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:10:12.458570 coreos-metadata[1732]: Dec 16 13:10:12.458 WARN failed to locate config-drive, using the metadata service API instead Dec 16 13:10:12.481748 coreos-metadata[1732]: Dec 16 13:10:12.481 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Dec 16 13:10:12.784425 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:10:12.789313 coreos-metadata[1732]: Dec 16 13:10:12.789 INFO Fetch successful Dec 16 13:10:12.789313 coreos-metadata[1732]: Dec 16 13:10:12.789 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 16 13:10:12.800492 coreos-metadata[1829]: Dec 16 13:10:12.800 WARN failed to locate config-drive, using the metadata service API instead Dec 16 13:10:12.836818 coreos-metadata[1829]: Dec 16 13:10:12.836 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Dec 16 13:10:13.045175 coreos-metadata[1732]: Dec 16 13:10:13.044 INFO Fetch successful Dec 16 13:10:13.045360 coreos-metadata[1732]: Dec 16 13:10:13.045 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Dec 16 13:10:13.130685 coreos-metadata[1829]: Dec 16 13:10:13.130 INFO Fetch successful Dec 16 13:10:13.130685 coreos-metadata[1829]: Dec 16 13:10:13.130 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Dec 16 13:10:13.271183 coreos-metadata[1732]: Dec 16 13:10:13.271 INFO Fetch successful Dec 16 13:10:13.271183 coreos-metadata[1732]: Dec 16 13:10:13.271 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Dec 16 13:10:14.905807 coreos-metadata[1829]: Dec 16 13:10:14.905 INFO Fetch successful Dec 16 13:10:14.910564 unknown[1829]: wrote ssh authorized keys file for user: core Dec 16 13:10:14.946516 coreos-metadata[1732]: Dec 16 13:10:14.946 INFO Fetch successful Dec 16 13:10:14.946516 coreos-metadata[1732]: Dec 16 13:10:14.946 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Dec 16 13:10:14.984632 update-ssh-keys[1899]: Updated "/home/core/.ssh/authorized_keys" Dec 16 13:10:14.985875 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 16 13:10:14.989247 systemd[1]: Finished sshkeys.service. 
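As an illustration (not itself journal output): the coreos-metadata entries above fall back from the missing config-2 drive to the link-local metadata service and fetch the endpoints shown in the log. A minimal sketch of the same requests follows; it only works from inside the instance, and nothing about the response contents is assumed beyond what the log already shows:

    import json
    from urllib.request import urlopen

    BASE = "http://169.254.169.254"  # link-local metadata address from the log

    def fetch(path: str) -> str:
        with urlopen(BASE + path, timeout=5) as resp:
            return resp.read().decode()

    # Endpoints taken verbatim from the coreos-metadata log lines above.
    meta = json.loads(fetch("/openstack/2012-08-10/meta_data.json"))
    hostname = fetch("/latest/meta-data/hostname")
    print(sorted(meta), hostname)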
Dec 16 13:10:15.076242 coreos-metadata[1732]: Dec 16 13:10:15.076 INFO Fetch successful Dec 16 13:10:15.076242 coreos-metadata[1732]: Dec 16 13:10:15.076 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Dec 16 13:10:15.219499 coreos-metadata[1732]: Dec 16 13:10:15.219 INFO Fetch successful Dec 16 13:10:15.268277 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 16 13:10:15.268961 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 16 13:10:15.269181 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 16 13:10:15.269425 systemd[1]: Startup finished in 4.288s (kernel) + 13.387s (initrd) + 12.304s (userspace) = 29.980s. Dec 16 13:10:19.717694 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 16 13:10:19.719287 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:10:19.942542 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:10:19.946043 (kubelet)[1917]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:10:19.987103 kubelet[1917]: E1216 13:10:19.986878 1917 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:10:19.993729 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:10:19.993906 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:10:19.994307 systemd[1]: kubelet.service: Consumed 209ms CPU time, 113.4M memory peak. Dec 16 13:10:20.330089 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 16 13:10:20.331234 systemd[1]: Started sshd@0-10.0.25.207:22-147.75.109.163:54392.service - OpenSSH per-connection server daemon (147.75.109.163:54392). Dec 16 13:10:21.349233 sshd[1931]: Accepted publickey for core from 147.75.109.163 port 54392 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:10:21.353134 sshd-session[1931]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:10:21.369055 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 16 13:10:21.370581 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 13:10:21.374766 systemd-logind[1750]: New session 1 of user core. Dec 16 13:10:21.407679 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 13:10:21.409758 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 13:10:21.435402 (systemd)[1936]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 16 13:10:21.438103 systemd-logind[1750]: New session c1 of user core. Dec 16 13:10:21.577856 systemd[1936]: Queued start job for default target default.target. Dec 16 13:10:21.599365 systemd[1936]: Created slice app.slice - User Application Slice. Dec 16 13:10:21.599396 systemd[1936]: Reached target paths.target - Paths. Dec 16 13:10:21.599438 systemd[1936]: Reached target timers.target - Timers. Dec 16 13:10:21.600807 systemd[1936]: Starting dbus.socket - D-Bus User Message Bus Socket... 
Dec 16 13:10:21.618405 systemd[1936]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 16 13:10:21.618511 systemd[1936]: Reached target sockets.target - Sockets. Dec 16 13:10:21.618550 systemd[1936]: Reached target basic.target - Basic System. Dec 16 13:10:21.618583 systemd[1936]: Reached target default.target - Main User Target. Dec 16 13:10:21.618608 systemd[1936]: Startup finished in 170ms. Dec 16 13:10:21.619173 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 13:10:21.621109 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 16 13:10:22.316582 systemd[1]: Started sshd@1-10.0.25.207:22-147.75.109.163:52534.service - OpenSSH per-connection server daemon (147.75.109.163:52534). Dec 16 13:10:23.314681 sshd[1947]: Accepted publickey for core from 147.75.109.163 port 52534 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:10:23.316483 sshd-session[1947]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:10:23.323238 systemd-logind[1750]: New session 2 of user core. Dec 16 13:10:23.341880 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 16 13:10:23.999458 sshd[1950]: Connection closed by 147.75.109.163 port 52534 Dec 16 13:10:24.000158 sshd-session[1947]: pam_unix(sshd:session): session closed for user core Dec 16 13:10:24.006493 systemd[1]: sshd@1-10.0.25.207:22-147.75.109.163:52534.service: Deactivated successfully. Dec 16 13:10:24.010343 systemd[1]: session-2.scope: Deactivated successfully. Dec 16 13:10:24.014750 systemd-logind[1750]: Session 2 logged out. Waiting for processes to exit. Dec 16 13:10:24.016952 systemd-logind[1750]: Removed session 2. Dec 16 13:10:24.188154 systemd[1]: Started sshd@2-10.0.25.207:22-147.75.109.163:52540.service - OpenSSH per-connection server daemon (147.75.109.163:52540). Dec 16 13:10:25.246709 sshd[1956]: Accepted publickey for core from 147.75.109.163 port 52540 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:10:25.248716 sshd-session[1956]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:10:25.255949 systemd-logind[1750]: New session 3 of user core. Dec 16 13:10:25.267605 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 16 13:10:25.934322 sshd[1959]: Connection closed by 147.75.109.163 port 52540 Dec 16 13:10:25.935050 sshd-session[1956]: pam_unix(sshd:session): session closed for user core Dec 16 13:10:25.940052 systemd[1]: sshd@2-10.0.25.207:22-147.75.109.163:52540.service: Deactivated successfully. Dec 16 13:10:25.943191 systemd[1]: session-3.scope: Deactivated successfully. Dec 16 13:10:25.946575 systemd-logind[1750]: Session 3 logged out. Waiting for processes to exit. Dec 16 13:10:25.948435 systemd-logind[1750]: Removed session 3. Dec 16 13:10:26.118634 systemd[1]: Started sshd@3-10.0.25.207:22-147.75.109.163:52542.service - OpenSSH per-connection server daemon (147.75.109.163:52542). Dec 16 13:10:27.177258 sshd[1965]: Accepted publickey for core from 147.75.109.163 port 52542 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:10:27.179591 sshd-session[1965]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:10:27.187833 systemd-logind[1750]: New session 4 of user core. Dec 16 13:10:27.205682 systemd[1]: Started session-4.scope - Session 4 of User core. 
Dec 16 13:10:27.880644 sshd[1968]: Connection closed by 147.75.109.163 port 52542 Dec 16 13:10:27.881403 sshd-session[1965]: pam_unix(sshd:session): session closed for user core Dec 16 13:10:27.887166 systemd[1]: sshd@3-10.0.25.207:22-147.75.109.163:52542.service: Deactivated successfully. Dec 16 13:10:27.891043 systemd[1]: session-4.scope: Deactivated successfully. Dec 16 13:10:27.895240 systemd-logind[1750]: Session 4 logged out. Waiting for processes to exit. Dec 16 13:10:27.897250 systemd-logind[1750]: Removed session 4. Dec 16 13:10:28.063737 systemd[1]: Started sshd@4-10.0.25.207:22-147.75.109.163:52556.service - OpenSSH per-connection server daemon (147.75.109.163:52556). Dec 16 13:10:29.088526 sshd[1974]: Accepted publickey for core from 147.75.109.163 port 52556 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:10:29.090536 sshd-session[1974]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:10:29.101485 systemd-logind[1750]: New session 5 of user core. Dec 16 13:10:29.123779 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 16 13:10:29.229903 chronyd[1730]: Selected source PHC0 Dec 16 13:10:29.663014 sudo[1978]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 16 13:10:29.663317 sudo[1978]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:10:29.681409 sudo[1978]: pam_unix(sudo:session): session closed for user root Dec 16 13:10:29.852289 sshd[1977]: Connection closed by 147.75.109.163 port 52556 Dec 16 13:10:29.853337 sshd-session[1974]: pam_unix(sshd:session): session closed for user core Dec 16 13:10:29.861339 systemd[1]: sshd@4-10.0.25.207:22-147.75.109.163:52556.service: Deactivated successfully. Dec 16 13:10:29.865561 systemd[1]: session-5.scope: Deactivated successfully. Dec 16 13:10:29.868454 systemd-logind[1750]: Session 5 logged out. Waiting for processes to exit. Dec 16 13:10:29.871163 systemd-logind[1750]: Removed session 5. Dec 16 13:10:30.040608 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 16 13:10:30.043085 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:10:30.044733 systemd[1]: Started sshd@5-10.0.25.207:22-147.75.109.163:52562.service - OpenSSH per-connection server daemon (147.75.109.163:52562). Dec 16 13:10:30.245330 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:10:30.250797 (kubelet)[1995]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:10:30.310169 kubelet[1995]: E1216 13:10:30.310018 1995 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:10:30.313411 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:10:30.313566 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:10:30.313949 systemd[1]: kubelet.service: Consumed 214ms CPU time, 113.3M memory peak. 
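As an illustration (not itself journal output): the repeated kubelet exit above (open /var/lib/kubelet/config.yaml: no such file or directory) is the expected state on a node that has not yet been joined; kubeadm normally writes that file during bootstrap. For reference only, a sketch of the general shape of a minimal KubeletConfiguration, where the cgroupDriver value matches what the later journal entries report and everything else a real config would carry is deliberately omitted:

    from pathlib import Path
    from textwrap import dedent

    # Shape sketch only; on a kubeadm-managed node this file is generated during join, not hand-written.
    config = dedent("""\
        apiVersion: kubelet.config.k8s.io/v1beta1
        kind: KubeletConfiguration
        cgroupDriver: systemd
    """)

    Path("/tmp/kubelet-config-example.yaml").write_text(config)  # written to /tmp on purpose, not /var/lib/kubelet
    print(config)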
Dec 16 13:10:31.132605 sshd[1985]: Accepted publickey for core from 147.75.109.163 port 52562 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:10:31.134349 sshd-session[1985]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:10:31.140678 systemd-logind[1750]: New session 6 of user core. Dec 16 13:10:31.151596 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 16 13:10:31.695130 sudo[2009]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 16 13:10:31.695451 sudo[2009]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:10:31.701718 sudo[2009]: pam_unix(sudo:session): session closed for user root Dec 16 13:10:31.707202 sudo[2008]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 16 13:10:31.707469 sudo[2008]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:10:31.718635 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 13:10:31.767619 augenrules[2031]: No rules Dec 16 13:10:31.768894 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 13:10:31.769478 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 13:10:31.770738 sudo[2008]: pam_unix(sudo:session): session closed for user root Dec 16 13:10:31.941931 sshd[2007]: Connection closed by 147.75.109.163 port 52562 Dec 16 13:10:31.942632 sshd-session[1985]: pam_unix(sshd:session): session closed for user core Dec 16 13:10:31.948448 systemd[1]: sshd@5-10.0.25.207:22-147.75.109.163:52562.service: Deactivated successfully. Dec 16 13:10:31.952774 systemd[1]: session-6.scope: Deactivated successfully. Dec 16 13:10:31.955948 systemd-logind[1750]: Session 6 logged out. Waiting for processes to exit. Dec 16 13:10:31.958466 systemd-logind[1750]: Removed session 6. Dec 16 13:10:32.139794 systemd[1]: Started sshd@6-10.0.25.207:22-147.75.109.163:51194.service - OpenSSH per-connection server daemon (147.75.109.163:51194). Dec 16 13:10:33.229009 sshd[2040]: Accepted publickey for core from 147.75.109.163 port 51194 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:10:33.230322 sshd-session[2040]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:10:33.234638 systemd-logind[1750]: New session 7 of user core. Dec 16 13:10:33.245608 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 16 13:10:33.781732 sudo[2044]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 16 13:10:33.781974 sudo[2044]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:10:34.397427 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Dec 16 13:10:34.415963 (dockerd)[2069]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 16 13:10:35.012611 dockerd[2069]: time="2025-12-16T13:10:35.012497322Z" level=info msg="Starting up" Dec 16 13:10:35.013409 dockerd[2069]: time="2025-12-16T13:10:35.013355067Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 16 13:10:35.053179 dockerd[2069]: time="2025-12-16T13:10:35.053031882Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 16 13:10:35.100625 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2913238162-merged.mount: Deactivated successfully. Dec 16 13:10:35.160291 dockerd[2069]: time="2025-12-16T13:10:35.160186626Z" level=info msg="Loading containers: start." Dec 16 13:10:35.193446 kernel: Initializing XFRM netlink socket Dec 16 13:10:35.651412 systemd-networkd[1674]: docker0: Link UP Dec 16 13:10:35.660988 dockerd[2069]: time="2025-12-16T13:10:35.660892689Z" level=info msg="Loading containers: done." Dec 16 13:10:35.682794 dockerd[2069]: time="2025-12-16T13:10:35.682707586Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 16 13:10:35.682986 dockerd[2069]: time="2025-12-16T13:10:35.682831817Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 16 13:10:35.682986 dockerd[2069]: time="2025-12-16T13:10:35.682944447Z" level=info msg="Initializing buildkit" Dec 16 13:10:35.735755 dockerd[2069]: time="2025-12-16T13:10:35.735679526Z" level=info msg="Completed buildkit initialization" Dec 16 13:10:35.742638 dockerd[2069]: time="2025-12-16T13:10:35.742581382Z" level=info msg="Daemon has completed initialization" Dec 16 13:10:35.742757 dockerd[2069]: time="2025-12-16T13:10:35.742697918Z" level=info msg="API listen on /run/docker.sock" Dec 16 13:10:35.743472 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 16 13:10:37.137468 containerd[1770]: time="2025-12-16T13:10:37.137363172Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Dec 16 13:10:37.971176 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2096009334.mount: Deactivated successfully. 
Dec 16 13:10:39.063100 containerd[1770]: time="2025-12-16T13:10:39.063050380Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:39.064198 containerd[1770]: time="2025-12-16T13:10:39.064165935Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=27068171" Dec 16 13:10:39.066015 containerd[1770]: time="2025-12-16T13:10:39.065970814Z" level=info msg="ImageCreate event name:\"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:39.069416 containerd[1770]: time="2025-12-16T13:10:39.069372732Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:39.070349 containerd[1770]: time="2025-12-16T13:10:39.070327824Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"27064672\" in 1.932918764s" Dec 16 13:10:39.070563 containerd[1770]: time="2025-12-16T13:10:39.070418667Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\"" Dec 16 13:10:39.071045 containerd[1770]: time="2025-12-16T13:10:39.071015142Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Dec 16 13:10:40.338883 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 16 13:10:40.340176 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Dec 16 13:10:40.363865 containerd[1770]: time="2025-12-16T13:10:40.363795360Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:40.365451 containerd[1770]: time="2025-12-16T13:10:40.365417099Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=21162460" Dec 16 13:10:40.367254 containerd[1770]: time="2025-12-16T13:10:40.367216651Z" level=info msg="ImageCreate event name:\"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:40.371078 containerd[1770]: time="2025-12-16T13:10:40.371006729Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:40.371883 containerd[1770]: time="2025-12-16T13:10:40.371732411Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"22819474\" in 1.300691062s" Dec 16 13:10:40.371883 containerd[1770]: time="2025-12-16T13:10:40.371761728Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\"" Dec 16 13:10:40.372542 containerd[1770]: time="2025-12-16T13:10:40.372487305Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Dec 16 13:10:40.473191 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:10:40.476734 (kubelet)[2366]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:10:40.516112 kubelet[2366]: E1216 13:10:40.516062 2366 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:10:40.518148 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:10:40.518279 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:10:40.518663 systemd[1]: kubelet.service: Consumed 149ms CPU time, 113.4M memory peak. 
Dec 16 13:10:41.430838 containerd[1770]: time="2025-12-16T13:10:41.430756161Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:41.431977 containerd[1770]: time="2025-12-16T13:10:41.431936868Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=15725947" Dec 16 13:10:41.433589 containerd[1770]: time="2025-12-16T13:10:41.433535144Z" level=info msg="ImageCreate event name:\"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:41.438317 containerd[1770]: time="2025-12-16T13:10:41.437044315Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:41.438317 containerd[1770]: time="2025-12-16T13:10:41.438047374Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"17382979\" in 1.065521045s" Dec 16 13:10:41.438317 containerd[1770]: time="2025-12-16T13:10:41.438095008Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\"" Dec 16 13:10:41.438825 containerd[1770]: time="2025-12-16T13:10:41.438787727Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Dec 16 13:10:42.381373 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount28428525.mount: Deactivated successfully. 
Dec 16 13:10:42.638067 containerd[1770]: time="2025-12-16T13:10:42.636621569Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:42.639364 containerd[1770]: time="2025-12-16T13:10:42.639290957Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=25965319" Dec 16 13:10:42.642024 containerd[1770]: time="2025-12-16T13:10:42.641993286Z" level=info msg="ImageCreate event name:\"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:42.645035 containerd[1770]: time="2025-12-16T13:10:42.645005402Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:42.645388 containerd[1770]: time="2025-12-16T13:10:42.645361548Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"25964312\" in 1.206530788s" Dec 16 13:10:42.645419 containerd[1770]: time="2025-12-16T13:10:42.645388579Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\"" Dec 16 13:10:42.646493 containerd[1770]: time="2025-12-16T13:10:42.646459752Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Dec 16 13:10:43.358166 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3939859102.mount: Deactivated successfully. 
Dec 16 13:10:44.194504 containerd[1770]: time="2025-12-16T13:10:44.194455772Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:44.196011 containerd[1770]: time="2025-12-16T13:10:44.195976637Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=22388099" Dec 16 13:10:44.197590 containerd[1770]: time="2025-12-16T13:10:44.197562980Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:44.201073 containerd[1770]: time="2025-12-16T13:10:44.201044765Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:44.201830 containerd[1770]: time="2025-12-16T13:10:44.201800790Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 1.555312916s" Dec 16 13:10:44.201869 containerd[1770]: time="2025-12-16T13:10:44.201833251Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\"" Dec 16 13:10:44.210445 containerd[1770]: time="2025-12-16T13:10:44.210408047Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Dec 16 13:10:44.763778 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3774995440.mount: Deactivated successfully. 
Dec 16 13:10:44.772335 containerd[1770]: time="2025-12-16T13:10:44.772273026Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:44.774040 containerd[1770]: time="2025-12-16T13:10:44.773992151Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321238" Dec 16 13:10:44.776256 containerd[1770]: time="2025-12-16T13:10:44.776224853Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:44.779043 containerd[1770]: time="2025-12-16T13:10:44.779020286Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:44.779628 containerd[1770]: time="2025-12-16T13:10:44.779596940Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 569.153973ms" Dec 16 13:10:44.779660 containerd[1770]: time="2025-12-16T13:10:44.779635866Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Dec 16 13:10:44.780129 containerd[1770]: time="2025-12-16T13:10:44.780101324Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Dec 16 13:10:45.477975 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2905733388.mount: Deactivated successfully. Dec 16 13:10:47.525766 containerd[1770]: time="2025-12-16T13:10:47.525699305Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:47.531252 containerd[1770]: time="2025-12-16T13:10:47.531186196Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=74166870" Dec 16 13:10:47.533930 containerd[1770]: time="2025-12-16T13:10:47.533886967Z" level=info msg="ImageCreate event name:\"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:47.537609 containerd[1770]: time="2025-12-16T13:10:47.537572482Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:10:47.538841 containerd[1770]: time="2025-12-16T13:10:47.538787695Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"74311308\" in 2.758642033s" Dec 16 13:10:47.538841 containerd[1770]: time="2025-12-16T13:10:47.538836458Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\"" Dec 16 13:10:50.305315 update_engine[1754]: I20251216 13:10:50.305219 1754 update_attempter.cc:509] Updating boot flags... 
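As an illustration (not itself journal output): the containerd pull entries above report both the bytes read and the wall-clock duration for each image, so a rough per-image throughput falls straight out of the logged numbers:

    # (bytes read, seconds) pairs copied from the containerd pull lines above.
    pulls = {
        "kube-apiserver:v1.34.3": (27_068_171, 1.932918764),
        "etcd:3.6.4-0":           (74_166_870, 2.758642033),
    }
    for name, (nbytes, secs) in pulls.items():
        print(f"{name}: {nbytes / secs / 1e6:.1f} MB/s")  # roughly 14.0 and 26.9 MB/s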
Dec 16 13:10:50.545913 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Dec 16 13:10:50.547453 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:10:50.575164 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 16 13:10:50.575374 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 16 13:10:50.575916 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:10:50.580622 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:10:50.611216 systemd[1]: Reload requested from client PID 2556 ('systemctl') (unit session-7.scope)... Dec 16 13:10:50.611232 systemd[1]: Reloading... Dec 16 13:10:50.686335 zram_generator::config[2599]: No configuration found. Dec 16 13:10:50.891604 systemd[1]: Reloading finished in 279 ms. Dec 16 13:10:50.951109 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 16 13:10:50.951174 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 16 13:10:50.951409 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:10:50.951475 systemd[1]: kubelet.service: Consumed 114ms CPU time, 98.3M memory peak. Dec 16 13:10:50.953400 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:10:51.151487 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:10:51.157324 (kubelet)[2654]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 13:10:51.208530 kubelet[2654]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 13:10:51.208530 kubelet[2654]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 13:10:51.208814 kubelet[2654]: I1216 13:10:51.208581 2654 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 13:10:51.443655 kubelet[2654]: I1216 13:10:51.443537 2654 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 16 13:10:51.443655 kubelet[2654]: I1216 13:10:51.443569 2654 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 13:10:51.443655 kubelet[2654]: I1216 13:10:51.443593 2654 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 16 13:10:51.443655 kubelet[2654]: I1216 13:10:51.443601 2654 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Dec 16 13:10:51.443939 kubelet[2654]: I1216 13:10:51.443916 2654 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 13:10:51.454313 kubelet[2654]: E1216 13:10:51.454245 2654 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.25.207:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.25.207:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Dec 16 13:10:51.458573 kubelet[2654]: I1216 13:10:51.458548 2654 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 13:10:51.465322 kubelet[2654]: I1216 13:10:51.464839 2654 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 13:10:51.470724 kubelet[2654]: I1216 13:10:51.470708 2654 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Dec 16 13:10:51.472911 kubelet[2654]: I1216 13:10:51.472878 2654 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 13:10:51.473133 kubelet[2654]: I1216 13:10:51.472982 2654 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-9-79ca1ea2c9","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 13:10:51.473252 kubelet[2654]: I1216 13:10:51.473243 2654 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 13:10:51.473293 kubelet[2654]: I1216 13:10:51.473287 2654 container_manager_linux.go:306] "Creating device plugin manager" Dec 16 13:10:51.473415 kubelet[2654]: I1216 13:10:51.473408 2654 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 16 13:10:51.477707 kubelet[2654]: I1216 13:10:51.477683 2654 state_mem.go:36] "Initialized new in-memory state store" Dec 16 13:10:51.477887 kubelet[2654]: I1216 13:10:51.477877 2654 kubelet.go:475] "Attempting to sync node with API server" 
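The container manager NodeConfig logged just above carries the kubelet's hard-eviction thresholds inside a single JSON blob. A minimal Go sketch, hypothetical and based only on the values visible in that line (it is not kubelet source code), restates them as a flat signal-to-threshold map:

// Illustrative only: restates the HardEvictionThresholds from the NodeConfig
// log entry above as signal -> threshold. Values are copied from that entry;
// nothing here is read from a live kubelet.
package main

import "fmt"

func main() {
	evictionHard := map[string]string{
		"memory.available":   "100Mi", // Quantity 100Mi
		"nodefs.available":   "10%",   // Percentage 0.1
		"nodefs.inodesFree":  "5%",    // Percentage 0.05
		"imagefs.available":  "15%",   // Percentage 0.15
		"imagefs.inodesFree": "5%",    // Percentage 0.05
	}
	for signal, threshold := range evictionHard {
		fmt.Printf("hard-evict when %s drops below %s\n", signal, threshold)
	}
}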
Dec 16 13:10:51.477915 kubelet[2654]: I1216 13:10:51.477891 2654 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 13:10:51.477915 kubelet[2654]: I1216 13:10:51.477908 2654 kubelet.go:387] "Adding apiserver pod source" Dec 16 13:10:51.477958 kubelet[2654]: I1216 13:10:51.477929 2654 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 13:10:51.478557 kubelet[2654]: E1216 13:10:51.478511 2654 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.25.207:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.25.207:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 16 13:10:51.478977 kubelet[2654]: E1216 13:10:51.478854 2654 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.25.207:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-9-79ca1ea2c9&limit=500&resourceVersion=0\": dial tcp 10.0.25.207:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 16 13:10:51.484591 kubelet[2654]: I1216 13:10:51.484562 2654 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 16 13:10:51.485093 kubelet[2654]: I1216 13:10:51.485072 2654 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 13:10:51.485148 kubelet[2654]: I1216 13:10:51.485101 2654 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 16 13:10:51.485189 kubelet[2654]: W1216 13:10:51.485144 2654 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Dec 16 13:10:51.490425 kubelet[2654]: I1216 13:10:51.490405 2654 server.go:1262] "Started kubelet" Dec 16 13:10:51.490698 kubelet[2654]: I1216 13:10:51.490614 2654 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 13:10:51.490759 kubelet[2654]: I1216 13:10:51.490726 2654 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 16 13:10:51.491064 kubelet[2654]: I1216 13:10:51.491037 2654 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 13:10:51.491064 kubelet[2654]: I1216 13:10:51.491040 2654 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 13:10:51.491179 kubelet[2654]: I1216 13:10:51.491123 2654 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 13:10:51.491484 kubelet[2654]: E1216 13:10:51.491437 2654 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-9-79ca1ea2c9\" not found" Dec 16 13:10:51.491551 kubelet[2654]: I1216 13:10:51.491527 2654 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 16 13:10:51.491551 kubelet[2654]: I1216 13:10:51.491530 2654 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 13:10:51.491917 kubelet[2654]: I1216 13:10:51.491830 2654 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 16 13:10:51.491992 kubelet[2654]: I1216 13:10:51.491920 2654 reconciler.go:29] "Reconciler: start to sync state" Dec 16 13:10:51.492208 kubelet[2654]: E1216 13:10:51.492157 2654 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.25.207:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.25.207:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 13:10:51.492970 kubelet[2654]: E1216 13:10:51.492496 2654 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.25.207:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-9-79ca1ea2c9?timeout=10s\": dial tcp 10.0.25.207:6443: connect: connection refused" interval="200ms" Dec 16 13:10:51.493637 kubelet[2654]: I1216 13:10:51.493596 2654 server.go:310] "Adding debug handlers to kubelet server" Dec 16 13:10:51.493929 kubelet[2654]: I1216 13:10:51.493904 2654 factory.go:223] Registration of the systemd container factory successfully Dec 16 13:10:51.494204 kubelet[2654]: I1216 13:10:51.494005 2654 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 13:10:51.494284 kubelet[2654]: E1216 13:10:51.494223 2654 kubelet.go:1615] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 13:10:51.495234 kubelet[2654]: I1216 13:10:51.495206 2654 factory.go:223] Registration of the containerd container factory successfully Dec 16 13:10:51.501367 kubelet[2654]: E1216 13:10:51.498338 2654 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.25.207:6443/api/v1/namespaces/default/events\": dial tcp 10.0.25.207:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-9-79ca1ea2c9.1881b43102444a61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-9-79ca1ea2c9,UID:ci-4459-2-2-9-79ca1ea2c9,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-9-79ca1ea2c9,},FirstTimestamp:2025-12-16 13:10:51.490372193 +0000 UTC m=+0.328871528,LastTimestamp:2025-12-16 13:10:51.490372193 +0000 UTC m=+0.328871528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-9-79ca1ea2c9,}" Dec 16 13:10:51.505025 kubelet[2654]: I1216 13:10:51.504995 2654 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 13:10:51.505025 kubelet[2654]: I1216 13:10:51.505012 2654 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 13:10:51.505025 kubelet[2654]: I1216 13:10:51.505028 2654 state_mem.go:36] "Initialized new in-memory state store" Dec 16 13:10:51.508455 kubelet[2654]: I1216 13:10:51.508409 2654 policy_none.go:49] "None policy: Start" Dec 16 13:10:51.508455 kubelet[2654]: I1216 13:10:51.508429 2654 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 16 13:10:51.508455 kubelet[2654]: I1216 13:10:51.508439 2654 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 16 13:10:51.511049 kubelet[2654]: I1216 13:10:51.511027 2654 policy_none.go:47] "Start" Dec 16 13:10:51.515150 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 16 13:10:51.520590 kubelet[2654]: I1216 13:10:51.520545 2654 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 16 13:10:51.522798 kubelet[2654]: I1216 13:10:51.522532 2654 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Dec 16 13:10:51.522798 kubelet[2654]: I1216 13:10:51.522557 2654 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 16 13:10:51.522798 kubelet[2654]: I1216 13:10:51.522581 2654 kubelet.go:2427] "Starting kubelet main sync loop" Dec 16 13:10:51.522798 kubelet[2654]: E1216 13:10:51.522626 2654 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 13:10:51.523437 kubelet[2654]: E1216 13:10:51.523395 2654 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.25.207:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.25.207:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 16 13:10:51.532711 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Dec 16 13:10:51.537009 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 16 13:10:51.553162 kubelet[2654]: E1216 13:10:51.553128 2654 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 13:10:51.553363 kubelet[2654]: I1216 13:10:51.553322 2654 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 13:10:51.553363 kubelet[2654]: I1216 13:10:51.553334 2654 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 13:10:51.554003 kubelet[2654]: I1216 13:10:51.553628 2654 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 13:10:51.554281 kubelet[2654]: E1216 13:10:51.554243 2654 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 16 13:10:51.554411 kubelet[2654]: E1216 13:10:51.554381 2654 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-9-79ca1ea2c9\" not found" Dec 16 13:10:51.638682 systemd[1]: Created slice kubepods-burstable-poddb1da35d80e3303675c287cd2133e522.slice - libcontainer container kubepods-burstable-poddb1da35d80e3303675c287cd2133e522.slice. Dec 16 13:10:51.655966 kubelet[2654]: I1216 13:10:51.655898 2654 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.656465 kubelet[2654]: E1216 13:10:51.656416 2654 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.25.207:6443/api/v1/nodes\": dial tcp 10.0.25.207:6443: connect: connection refused" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.675165 kubelet[2654]: E1216 13:10:51.675117 2654 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-9-79ca1ea2c9\" not found" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.677496 systemd[1]: Created slice kubepods-burstable-pod6473c59e917320be6a9c07d436848961.slice - libcontainer container kubepods-burstable-pod6473c59e917320be6a9c07d436848961.slice. Dec 16 13:10:51.679017 kubelet[2654]: E1216 13:10:51.678876 2654 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-9-79ca1ea2c9\" not found" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.680773 systemd[1]: Created slice kubepods-burstable-pod86d6a278ef154e19a9dbe0226cc4d120.slice - libcontainer container kubepods-burstable-pod86d6a278ef154e19a9dbe0226cc4d120.slice. 
Dec 16 13:10:51.683750 kubelet[2654]: E1216 13:10:51.683699 2654 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-9-79ca1ea2c9\" not found" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.692900 kubelet[2654]: I1216 13:10:51.692840 2654 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/db1da35d80e3303675c287cd2133e522-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"db1da35d80e3303675c287cd2133e522\") " pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.692900 kubelet[2654]: I1216 13:10:51.692889 2654 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/db1da35d80e3303675c287cd2133e522-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"db1da35d80e3303675c287cd2133e522\") " pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.693195 kubelet[2654]: I1216 13:10:51.692921 2654 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6473c59e917320be6a9c07d436848961-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"6473c59e917320be6a9c07d436848961\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.693195 kubelet[2654]: I1216 13:10:51.692947 2654 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/86d6a278ef154e19a9dbe0226cc4d120-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"86d6a278ef154e19a9dbe0226cc4d120\") " pod="kube-system/kube-scheduler-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.693195 kubelet[2654]: I1216 13:10:51.692974 2654 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/db1da35d80e3303675c287cd2133e522-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"db1da35d80e3303675c287cd2133e522\") " pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.693195 kubelet[2654]: I1216 13:10:51.693064 2654 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6473c59e917320be6a9c07d436848961-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"6473c59e917320be6a9c07d436848961\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.693195 kubelet[2654]: I1216 13:10:51.693158 2654 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6473c59e917320be6a9c07d436848961-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"6473c59e917320be6a9c07d436848961\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.693570 kubelet[2654]: I1216 13:10:51.693194 2654 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6473c59e917320be6a9c07d436848961-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9\" (UID: 
\"6473c59e917320be6a9c07d436848961\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.693570 kubelet[2654]: I1216 13:10:51.693232 2654 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6473c59e917320be6a9c07d436848961-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"6473c59e917320be6a9c07d436848961\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.694128 kubelet[2654]: E1216 13:10:51.693952 2654 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.25.207:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-9-79ca1ea2c9?timeout=10s\": dial tcp 10.0.25.207:6443: connect: connection refused" interval="400ms" Dec 16 13:10:51.859996 kubelet[2654]: I1216 13:10:51.859853 2654 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.860698 kubelet[2654]: E1216 13:10:51.860583 2654 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.25.207:6443/api/v1/nodes\": dial tcp 10.0.25.207:6443: connect: connection refused" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:51.982654 containerd[1770]: time="2025-12-16T13:10:51.982464349Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-9-79ca1ea2c9,Uid:db1da35d80e3303675c287cd2133e522,Namespace:kube-system,Attempt:0,}" Dec 16 13:10:51.987268 containerd[1770]: time="2025-12-16T13:10:51.987136426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9,Uid:6473c59e917320be6a9c07d436848961,Namespace:kube-system,Attempt:0,}" Dec 16 13:10:51.990786 containerd[1770]: time="2025-12-16T13:10:51.990696312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-9-79ca1ea2c9,Uid:86d6a278ef154e19a9dbe0226cc4d120,Namespace:kube-system,Attempt:0,}" Dec 16 13:10:52.095023 kubelet[2654]: E1216 13:10:52.094940 2654 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.25.207:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-9-79ca1ea2c9?timeout=10s\": dial tcp 10.0.25.207:6443: connect: connection refused" interval="800ms" Dec 16 13:10:52.263099 kubelet[2654]: I1216 13:10:52.262993 2654 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:52.264367 kubelet[2654]: E1216 13:10:52.263578 2654 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.25.207:6443/api/v1/nodes\": dial tcp 10.0.25.207:6443: connect: connection refused" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:52.387734 kubelet[2654]: E1216 13:10:52.387663 2654 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.25.207:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.25.207:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 13:10:52.400901 kubelet[2654]: E1216 13:10:52.400826 2654 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.25.207:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.25.207:6443: connect: connection 
refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 16 13:10:52.567650 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2805606756.mount: Deactivated successfully. Dec 16 13:10:52.579127 containerd[1770]: time="2025-12-16T13:10:52.578777320Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 13:10:52.587339 containerd[1770]: time="2025-12-16T13:10:52.587271975Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321158" Dec 16 13:10:52.590342 containerd[1770]: time="2025-12-16T13:10:52.590313442Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 13:10:52.592866 containerd[1770]: time="2025-12-16T13:10:52.592772341Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 13:10:52.596960 containerd[1770]: time="2025-12-16T13:10:52.596852829Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 13:10:52.599191 containerd[1770]: time="2025-12-16T13:10:52.599155423Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 13:10:52.601078 containerd[1770]: time="2025-12-16T13:10:52.601021411Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 13:10:52.603901 containerd[1770]: time="2025-12-16T13:10:52.603776505Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 13:10:52.607725 containerd[1770]: time="2025-12-16T13:10:52.607382607Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 614.06753ms" Dec 16 13:10:52.608074 containerd[1770]: time="2025-12-16T13:10:52.608009456Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 620.775329ms" Dec 16 13:10:52.609930 containerd[1770]: time="2025-12-16T13:10:52.609875851Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 614.512072ms" Dec 16 13:10:52.662831 containerd[1770]: time="2025-12-16T13:10:52.662780089Z" level=info 
msg="connecting to shim 2967b3c8ead93e0e7671428a1b44d541342ab45110e1863406c4cea1aeaeaf8c" address="unix:///run/containerd/s/66dede7584f8904cf090f9fcb9e1209a337c2104710289581de5cad4ac55d3bc" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:10:52.669924 containerd[1770]: time="2025-12-16T13:10:52.669867451Z" level=info msg="connecting to shim 1e78eedeb6eb60d03631c19dbfe0fb7157739aef37f050f123b381f3c23f1be1" address="unix:///run/containerd/s/75a2863400045880dc21771ac9eb3c30cbd9ac7221f10a32942ad300875a63c1" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:10:52.676612 containerd[1770]: time="2025-12-16T13:10:52.676554884Z" level=info msg="connecting to shim 5f16cf2a9fcd8cedf0c024e2785f9e1b9823becc62a6a63604c662e32bf3b507" address="unix:///run/containerd/s/24684877e733198d03bd57b7bda439f9b5469fea17d3d101e46e0b33d394a64c" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:10:52.699492 systemd[1]: Started cri-containerd-1e78eedeb6eb60d03631c19dbfe0fb7157739aef37f050f123b381f3c23f1be1.scope - libcontainer container 1e78eedeb6eb60d03631c19dbfe0fb7157739aef37f050f123b381f3c23f1be1. Dec 16 13:10:52.700610 systemd[1]: Started cri-containerd-2967b3c8ead93e0e7671428a1b44d541342ab45110e1863406c4cea1aeaeaf8c.scope - libcontainer container 2967b3c8ead93e0e7671428a1b44d541342ab45110e1863406c4cea1aeaeaf8c. Dec 16 13:10:52.703069 systemd[1]: Started cri-containerd-5f16cf2a9fcd8cedf0c024e2785f9e1b9823becc62a6a63604c662e32bf3b507.scope - libcontainer container 5f16cf2a9fcd8cedf0c024e2785f9e1b9823becc62a6a63604c662e32bf3b507. Dec 16 13:10:52.736108 kubelet[2654]: E1216 13:10:52.736079 2654 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.25.207:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.25.207:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 16 13:10:52.748577 containerd[1770]: time="2025-12-16T13:10:52.748531210Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-9-79ca1ea2c9,Uid:db1da35d80e3303675c287cd2133e522,Namespace:kube-system,Attempt:0,} returns sandbox id \"2967b3c8ead93e0e7671428a1b44d541342ab45110e1863406c4cea1aeaeaf8c\"" Dec 16 13:10:52.751771 containerd[1770]: time="2025-12-16T13:10:52.751738841Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-9-79ca1ea2c9,Uid:86d6a278ef154e19a9dbe0226cc4d120,Namespace:kube-system,Attempt:0,} returns sandbox id \"1e78eedeb6eb60d03631c19dbfe0fb7157739aef37f050f123b381f3c23f1be1\"" Dec 16 13:10:52.757344 containerd[1770]: time="2025-12-16T13:10:52.757311083Z" level=info msg="CreateContainer within sandbox \"2967b3c8ead93e0e7671428a1b44d541342ab45110e1863406c4cea1aeaeaf8c\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 16 13:10:52.761769 containerd[1770]: time="2025-12-16T13:10:52.761731314Z" level=info msg="CreateContainer within sandbox \"1e78eedeb6eb60d03631c19dbfe0fb7157739aef37f050f123b381f3c23f1be1\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 16 13:10:52.762108 containerd[1770]: time="2025-12-16T13:10:52.762078195Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9,Uid:6473c59e917320be6a9c07d436848961,Namespace:kube-system,Attempt:0,} returns sandbox id \"5f16cf2a9fcd8cedf0c024e2785f9e1b9823becc62a6a63604c662e32bf3b507\"" Dec 16 13:10:52.767899 containerd[1770]: time="2025-12-16T13:10:52.767866922Z" 
level=info msg="CreateContainer within sandbox \"5f16cf2a9fcd8cedf0c024e2785f9e1b9823becc62a6a63604c662e32bf3b507\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 16 13:10:52.775619 containerd[1770]: time="2025-12-16T13:10:52.775574136Z" level=info msg="Container 39ca5bbac2b39e8800adb66471e19772fae078f2e2d705939e8400f9e693db6c: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:10:52.779629 containerd[1770]: time="2025-12-16T13:10:52.779593523Z" level=info msg="Container 3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:10:52.790207 containerd[1770]: time="2025-12-16T13:10:52.790166698Z" level=info msg="CreateContainer within sandbox \"2967b3c8ead93e0e7671428a1b44d541342ab45110e1863406c4cea1aeaeaf8c\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"39ca5bbac2b39e8800adb66471e19772fae078f2e2d705939e8400f9e693db6c\"" Dec 16 13:10:52.790830 containerd[1770]: time="2025-12-16T13:10:52.790802507Z" level=info msg="StartContainer for \"39ca5bbac2b39e8800adb66471e19772fae078f2e2d705939e8400f9e693db6c\"" Dec 16 13:10:52.792670 containerd[1770]: time="2025-12-16T13:10:52.792643491Z" level=info msg="connecting to shim 39ca5bbac2b39e8800adb66471e19772fae078f2e2d705939e8400f9e693db6c" address="unix:///run/containerd/s/66dede7584f8904cf090f9fcb9e1209a337c2104710289581de5cad4ac55d3bc" protocol=ttrpc version=3 Dec 16 13:10:52.794863 containerd[1770]: time="2025-12-16T13:10:52.794833065Z" level=info msg="Container eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:10:52.799108 containerd[1770]: time="2025-12-16T13:10:52.799079202Z" level=info msg="CreateContainer within sandbox \"1e78eedeb6eb60d03631c19dbfe0fb7157739aef37f050f123b381f3c23f1be1\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c\"" Dec 16 13:10:52.799650 containerd[1770]: time="2025-12-16T13:10:52.799629804Z" level=info msg="StartContainer for \"3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c\"" Dec 16 13:10:52.802389 containerd[1770]: time="2025-12-16T13:10:52.802365270Z" level=info msg="connecting to shim 3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c" address="unix:///run/containerd/s/75a2863400045880dc21771ac9eb3c30cbd9ac7221f10a32942ad300875a63c1" protocol=ttrpc version=3 Dec 16 13:10:52.805235 containerd[1770]: time="2025-12-16T13:10:52.805210688Z" level=info msg="CreateContainer within sandbox \"5f16cf2a9fcd8cedf0c024e2785f9e1b9823becc62a6a63604c662e32bf3b507\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1\"" Dec 16 13:10:52.806173 containerd[1770]: time="2025-12-16T13:10:52.806002489Z" level=info msg="StartContainer for \"eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1\"" Dec 16 13:10:52.807144 containerd[1770]: time="2025-12-16T13:10:52.807122504Z" level=info msg="connecting to shim eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1" address="unix:///run/containerd/s/24684877e733198d03bd57b7bda439f9b5469fea17d3d101e46e0b33d394a64c" protocol=ttrpc version=3 Dec 16 13:10:52.815532 systemd[1]: Started cri-containerd-39ca5bbac2b39e8800adb66471e19772fae078f2e2d705939e8400f9e693db6c.scope - libcontainer container 39ca5bbac2b39e8800adb66471e19772fae078f2e2d705939e8400f9e693db6c. 
Dec 16 13:10:52.817753 systemd[1]: Started cri-containerd-3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c.scope - libcontainer container 3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c. Dec 16 13:10:52.820438 systemd[1]: Started cri-containerd-eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1.scope - libcontainer container eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1. Dec 16 13:10:52.873762 containerd[1770]: time="2025-12-16T13:10:52.873724578Z" level=info msg="StartContainer for \"eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1\" returns successfully" Dec 16 13:10:52.874166 containerd[1770]: time="2025-12-16T13:10:52.873954852Z" level=info msg="StartContainer for \"39ca5bbac2b39e8800adb66471e19772fae078f2e2d705939e8400f9e693db6c\" returns successfully" Dec 16 13:10:52.874346 containerd[1770]: time="2025-12-16T13:10:52.874322692Z" level=info msg="StartContainer for \"3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c\" returns successfully" Dec 16 13:10:52.896383 kubelet[2654]: E1216 13:10:52.896337 2654 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.25.207:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-9-79ca1ea2c9?timeout=10s\": dial tcp 10.0.25.207:6443: connect: connection refused" interval="1.6s" Dec 16 13:10:53.065995 kubelet[2654]: I1216 13:10:53.065956 2654 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:53.537573 kubelet[2654]: E1216 13:10:53.537263 2654 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-9-79ca1ea2c9\" not found" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:53.539716 kubelet[2654]: E1216 13:10:53.539680 2654 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-9-79ca1ea2c9\" not found" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:53.543328 kubelet[2654]: E1216 13:10:53.543279 2654 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-9-79ca1ea2c9\" not found" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:54.293718 kubelet[2654]: I1216 13:10:54.293397 2654 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:54.392680 kubelet[2654]: I1216 13:10:54.392278 2654 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:54.398483 kubelet[2654]: E1216 13:10:54.398407 2654 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-9-79ca1ea2c9\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:54.398483 kubelet[2654]: I1216 13:10:54.398449 2654 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:54.401554 kubelet[2654]: E1216 13:10:54.401452 2654 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:54.401554 kubelet[2654]: I1216 13:10:54.401514 2654 kubelet.go:3219] "Creating a mirror pod for static pod" 
pod="kube-system/kube-scheduler-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:54.403317 kubelet[2654]: E1216 13:10:54.403254 2654 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-9-79ca1ea2c9\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:54.479092 kubelet[2654]: I1216 13:10:54.479003 2654 apiserver.go:52] "Watching apiserver" Dec 16 13:10:54.492735 kubelet[2654]: I1216 13:10:54.492645 2654 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 13:10:54.543511 kubelet[2654]: I1216 13:10:54.543412 2654 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:54.544339 kubelet[2654]: I1216 13:10:54.543991 2654 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:54.547417 kubelet[2654]: E1216 13:10:54.547287 2654 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-9-79ca1ea2c9\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:54.547733 kubelet[2654]: E1216 13:10:54.547651 2654 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-9-79ca1ea2c9\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:56.148530 systemd[1]: Reload requested from client PID 2956 ('systemctl') (unit session-7.scope)... Dec 16 13:10:56.148550 systemd[1]: Reloading... Dec 16 13:10:56.207412 zram_generator::config[2999]: No configuration found. Dec 16 13:10:56.397664 systemd[1]: Reloading finished in 248 ms. Dec 16 13:10:56.416726 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:10:56.439351 systemd[1]: kubelet.service: Deactivated successfully. Dec 16 13:10:56.439591 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:10:56.439649 systemd[1]: kubelet.service: Consumed 830ms CPU time, 126.2M memory peak. Dec 16 13:10:56.441178 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:10:56.655723 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:10:56.682042 (kubelet)[3051]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 13:10:56.740653 kubelet[3051]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 13:10:56.740653 kubelet[3051]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 16 13:10:56.741077 kubelet[3051]: I1216 13:10:56.740757 3051 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 13:10:56.747495 kubelet[3051]: I1216 13:10:56.747442 3051 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 16 13:10:56.747495 kubelet[3051]: I1216 13:10:56.747471 3051 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 13:10:56.747495 kubelet[3051]: I1216 13:10:56.747500 3051 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 16 13:10:56.747706 kubelet[3051]: I1216 13:10:56.747508 3051 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 13:10:56.747820 kubelet[3051]: I1216 13:10:56.747789 3051 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 13:10:56.749364 kubelet[3051]: I1216 13:10:56.749268 3051 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Dec 16 13:10:56.751892 kubelet[3051]: I1216 13:10:56.751816 3051 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 13:10:56.755358 kubelet[3051]: I1216 13:10:56.755339 3051 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 13:10:56.762661 kubelet[3051]: I1216 13:10:56.762610 3051 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Dec 16 13:10:56.763102 kubelet[3051]: I1216 13:10:56.763036 3051 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 13:10:56.763480 kubelet[3051]: I1216 13:10:56.763084 3051 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-9-79ca1ea2c9","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 13:10:56.763480 kubelet[3051]: I1216 13:10:56.763474 3051 topology_manager.go:138] "Creating topology 
manager with none policy" Dec 16 13:10:56.763773 kubelet[3051]: I1216 13:10:56.763495 3051 container_manager_linux.go:306] "Creating device plugin manager" Dec 16 13:10:56.763773 kubelet[3051]: I1216 13:10:56.763538 3051 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 16 13:10:56.765314 kubelet[3051]: I1216 13:10:56.765263 3051 state_mem.go:36] "Initialized new in-memory state store" Dec 16 13:10:56.765670 kubelet[3051]: I1216 13:10:56.765628 3051 kubelet.go:475] "Attempting to sync node with API server" Dec 16 13:10:56.765670 kubelet[3051]: I1216 13:10:56.765655 3051 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 13:10:56.765827 kubelet[3051]: I1216 13:10:56.765691 3051 kubelet.go:387] "Adding apiserver pod source" Dec 16 13:10:56.765827 kubelet[3051]: I1216 13:10:56.765729 3051 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 13:10:56.769270 kubelet[3051]: I1216 13:10:56.769238 3051 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 16 13:10:56.769686 kubelet[3051]: I1216 13:10:56.769666 3051 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 13:10:56.769779 kubelet[3051]: I1216 13:10:56.769694 3051 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 16 13:10:56.773747 kubelet[3051]: I1216 13:10:56.773724 3051 server.go:1262] "Started kubelet" Dec 16 13:10:56.773903 kubelet[3051]: I1216 13:10:56.773836 3051 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 13:10:56.774103 kubelet[3051]: I1216 13:10:56.774023 3051 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 13:10:56.774103 kubelet[3051]: I1216 13:10:56.774073 3051 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 16 13:10:56.774327 kubelet[3051]: I1216 13:10:56.774290 3051 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 13:10:56.775163 kubelet[3051]: I1216 13:10:56.775134 3051 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 13:10:56.775512 kubelet[3051]: I1216 13:10:56.775447 3051 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 13:10:56.775915 kubelet[3051]: I1216 13:10:56.775884 3051 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 16 13:10:56.776540 kubelet[3051]: I1216 13:10:56.775999 3051 reconciler.go:29] "Reconciler: start to sync state" Dec 16 13:10:56.779319 kubelet[3051]: I1216 13:10:56.778070 3051 server.go:310] "Adding debug handlers to kubelet server" Dec 16 13:10:56.779452 kubelet[3051]: I1216 13:10:56.779420 3051 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 16 13:10:56.779505 kubelet[3051]: E1216 13:10:56.779459 3051 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-9-79ca1ea2c9\" not found" Dec 16 13:10:56.785978 kubelet[3051]: I1216 13:10:56.785952 3051 factory.go:223] Registration of the systemd container factory successfully Dec 16 13:10:56.786132 kubelet[3051]: I1216 13:10:56.786022 3051 factory.go:221] Registration of the 
crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 13:10:56.786756 kubelet[3051]: I1216 13:10:56.786707 3051 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Dec 16 13:10:56.787061 kubelet[3051]: E1216 13:10:56.787031 3051 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 13:10:56.787227 kubelet[3051]: I1216 13:10:56.787205 3051 factory.go:223] Registration of the containerd container factory successfully Dec 16 13:10:56.794880 kubelet[3051]: I1216 13:10:56.794844 3051 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 16 13:10:56.794880 kubelet[3051]: I1216 13:10:56.794868 3051 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 16 13:10:56.795019 kubelet[3051]: I1216 13:10:56.794904 3051 kubelet.go:2427] "Starting kubelet main sync loop" Dec 16 13:10:56.795019 kubelet[3051]: E1216 13:10:56.794992 3051 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 13:10:56.816445 kubelet[3051]: I1216 13:10:56.816406 3051 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 13:10:56.816445 kubelet[3051]: I1216 13:10:56.816426 3051 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 13:10:56.816445 kubelet[3051]: I1216 13:10:56.816449 3051 state_mem.go:36] "Initialized new in-memory state store" Dec 16 13:10:56.816660 kubelet[3051]: I1216 13:10:56.816591 3051 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 16 13:10:56.816660 kubelet[3051]: I1216 13:10:56.816600 3051 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 16 13:10:56.816660 kubelet[3051]: I1216 13:10:56.816620 3051 policy_none.go:49] "None policy: Start" Dec 16 13:10:56.816660 kubelet[3051]: I1216 13:10:56.816630 3051 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 16 13:10:56.816660 kubelet[3051]: I1216 13:10:56.816640 3051 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 16 13:10:56.816754 kubelet[3051]: I1216 13:10:56.816727 3051 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Dec 16 13:10:56.816754 kubelet[3051]: I1216 13:10:56.816735 3051 policy_none.go:47] "Start" Dec 16 13:10:56.820450 kubelet[3051]: E1216 13:10:56.820427 3051 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 13:10:56.820602 kubelet[3051]: I1216 13:10:56.820583 3051 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 13:10:56.820630 kubelet[3051]: I1216 13:10:56.820598 3051 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 13:10:56.820772 kubelet[3051]: I1216 13:10:56.820754 3051 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 13:10:56.821680 kubelet[3051]: E1216 13:10:56.821659 3051 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Dec 16 13:10:56.896231 kubelet[3051]: I1216 13:10:56.896164 3051 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:56.896427 kubelet[3051]: I1216 13:10:56.896173 3051 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:56.896427 kubelet[3051]: I1216 13:10:56.896373 3051 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:56.923696 kubelet[3051]: I1216 13:10:56.923648 3051 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:56.933051 kubelet[3051]: I1216 13:10:56.932964 3051 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:56.933051 kubelet[3051]: I1216 13:10:56.933027 3051 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.077640 kubelet[3051]: I1216 13:10:57.077520 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/db1da35d80e3303675c287cd2133e522-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"db1da35d80e3303675c287cd2133e522\") " pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.077640 kubelet[3051]: I1216 13:10:57.077609 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6473c59e917320be6a9c07d436848961-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"6473c59e917320be6a9c07d436848961\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.077981 kubelet[3051]: I1216 13:10:57.077788 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6473c59e917320be6a9c07d436848961-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"6473c59e917320be6a9c07d436848961\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.077981 kubelet[3051]: I1216 13:10:57.077921 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/db1da35d80e3303675c287cd2133e522-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"db1da35d80e3303675c287cd2133e522\") " pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.078129 kubelet[3051]: I1216 13:10:57.077982 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6473c59e917320be6a9c07d436848961-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"6473c59e917320be6a9c07d436848961\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.078129 kubelet[3051]: I1216 13:10:57.078024 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6473c59e917320be6a9c07d436848961-flexvolume-dir\") pod 
\"kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"6473c59e917320be6a9c07d436848961\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.078129 kubelet[3051]: I1216 13:10:57.078084 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6473c59e917320be6a9c07d436848961-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"6473c59e917320be6a9c07d436848961\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.078394 kubelet[3051]: I1216 13:10:57.078130 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/86d6a278ef154e19a9dbe0226cc4d120-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"86d6a278ef154e19a9dbe0226cc4d120\") " pod="kube-system/kube-scheduler-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.078394 kubelet[3051]: I1216 13:10:57.078171 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/db1da35d80e3303675c287cd2133e522-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-9-79ca1ea2c9\" (UID: \"db1da35d80e3303675c287cd2133e522\") " pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.766280 kubelet[3051]: I1216 13:10:57.766200 3051 apiserver.go:52] "Watching apiserver" Dec 16 13:10:57.780313 kubelet[3051]: I1216 13:10:57.780251 3051 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 13:10:57.804573 kubelet[3051]: I1216 13:10:57.804543 3051 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.804968 kubelet[3051]: I1216 13:10:57.804951 3051 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.805594 kubelet[3051]: I1216 13:10:57.805571 3051 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.813370 kubelet[3051]: E1216 13:10:57.813307 3051 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-9-79ca1ea2c9\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.813943 kubelet[3051]: E1216 13:10:57.813902 3051 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.813943 kubelet[3051]: E1216 13:10:57.813918 3051 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-9-79ca1ea2c9\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:10:57.821900 kubelet[3051]: I1216 13:10:57.821831 3051 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-9-79ca1ea2c9" podStartSLOduration=1.821811013 podStartE2EDuration="1.821811013s" podCreationTimestamp="2025-12-16 13:10:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:10:57.821215982 +0000 UTC m=+1.133838551" watchObservedRunningTime="2025-12-16 13:10:57.821811013 +0000 UTC m=+1.134433617" Dec 16 
13:10:57.829518 kubelet[3051]: I1216 13:10:57.829447 3051 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-9-79ca1ea2c9" podStartSLOduration=1.829434625 podStartE2EDuration="1.829434625s" podCreationTimestamp="2025-12-16 13:10:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:10:57.829429825 +0000 UTC m=+1.142052392" watchObservedRunningTime="2025-12-16 13:10:57.829434625 +0000 UTC m=+1.142057197" Dec 16 13:10:57.837952 kubelet[3051]: I1216 13:10:57.837901 3051 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-9-79ca1ea2c9" podStartSLOduration=1.837885924 podStartE2EDuration="1.837885924s" podCreationTimestamp="2025-12-16 13:10:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:10:57.837609694 +0000 UTC m=+1.150232266" watchObservedRunningTime="2025-12-16 13:10:57.837885924 +0000 UTC m=+1.150508497" Dec 16 13:11:02.628395 kubelet[3051]: I1216 13:11:02.628271 3051 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 16 13:11:02.629084 containerd[1770]: time="2025-12-16T13:11:02.628999097Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 16 13:11:02.629479 kubelet[3051]: I1216 13:11:02.629328 3051 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 16 13:11:03.685879 systemd[1]: Created slice kubepods-besteffort-pod04ee919a_8c94_4d9d_9912_9c8318ee5d20.slice - libcontainer container kubepods-besteffort-pod04ee919a_8c94_4d9d_9912_9c8318ee5d20.slice. 
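The kubelet entries above all carry a klog-style header: a severity letter (I/W/E), an MMDD date, a wall-clock time, the process ID, and the source file:line, followed by the structured message. A minimal stdlib-only Go sketch of splitting such a header into its fields; it is written for illustration (the regular expression is an assumption about the format seen here, not kubelet's own parser, and real klog output permits more variation):

// klogparse.go
package main

import (
	"fmt"
	"regexp"
)

// Matches headers such as "I1216 13:11:02.628271 3051 kuberuntime_manager.go:1828] ...".
var klogHeader = regexp.MustCompile(`^([IWEF])(\d{2})(\d{2}) (\d{2}:\d{2}:\d{2}\.\d+) +(\d+) ([\w./-]+:\d+)\] (.*)$`)

type entry struct {
	Severity   string // I, W, E or F
	Month, Day string
	Time       string
	PID        string
	Source     string // file.go:line
	Message    string
}

func parse(line string) (entry, bool) {
	m := klogHeader.FindStringSubmatch(line)
	if m == nil {
		return entry{}, false
	}
	return entry{Severity: m[1], Month: m[2], Day: m[3], Time: m[4], PID: m[5], Source: m[6], Message: m[7]}, true
}

func main() {
	// Header copied from the log above.
	line := `I1216 13:11:02.628271 3051 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"`
	if e, ok := parse(line); ok {
		fmt.Printf("%s %s/%s %s pid=%s %s -> %s\n", e.Severity, e.Month, e.Day, e.Time, e.PID, e.Source, e.Message)
	}
}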
Dec 16 13:11:03.721977 kubelet[3051]: I1216 13:11:03.721905 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04ee919a-8c94-4d9d-9912-9c8318ee5d20-lib-modules\") pod \"kube-proxy-gdqhm\" (UID: \"04ee919a-8c94-4d9d-9912-9c8318ee5d20\") " pod="kube-system/kube-proxy-gdqhm" Dec 16 13:11:03.722289 kubelet[3051]: I1216 13:11:03.721984 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2wcj\" (UniqueName: \"kubernetes.io/projected/04ee919a-8c94-4d9d-9912-9c8318ee5d20-kube-api-access-d2wcj\") pod \"kube-proxy-gdqhm\" (UID: \"04ee919a-8c94-4d9d-9912-9c8318ee5d20\") " pod="kube-system/kube-proxy-gdqhm" Dec 16 13:11:03.722289 kubelet[3051]: I1216 13:11:03.722018 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/04ee919a-8c94-4d9d-9912-9c8318ee5d20-kube-proxy\") pod \"kube-proxy-gdqhm\" (UID: \"04ee919a-8c94-4d9d-9912-9c8318ee5d20\") " pod="kube-system/kube-proxy-gdqhm" Dec 16 13:11:03.722289 kubelet[3051]: I1216 13:11:03.722043 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/04ee919a-8c94-4d9d-9912-9c8318ee5d20-xtables-lock\") pod \"kube-proxy-gdqhm\" (UID: \"04ee919a-8c94-4d9d-9912-9c8318ee5d20\") " pod="kube-system/kube-proxy-gdqhm" Dec 16 13:11:03.804413 systemd[1]: Created slice kubepods-besteffort-pod94829de0_01a7_487b_b724_32eef91529c4.slice - libcontainer container kubepods-besteffort-pod94829de0_01a7_487b_b724_32eef91529c4.slice. Dec 16 13:11:03.823338 kubelet[3051]: I1216 13:11:03.823284 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/94829de0-01a7-487b-b724-32eef91529c4-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-w7l2h\" (UID: \"94829de0-01a7-487b-b724-32eef91529c4\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-w7l2h" Dec 16 13:11:03.823338 kubelet[3051]: I1216 13:11:03.823329 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv8gz\" (UniqueName: \"kubernetes.io/projected/94829de0-01a7-487b-b724-32eef91529c4-kube-api-access-lv8gz\") pod \"tigera-operator-65cdcdfd6d-w7l2h\" (UID: \"94829de0-01a7-487b-b724-32eef91529c4\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-w7l2h" Dec 16 13:11:04.010535 containerd[1770]: time="2025-12-16T13:11:04.010350121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gdqhm,Uid:04ee919a-8c94-4d9d-9912-9c8318ee5d20,Namespace:kube-system,Attempt:0,}" Dec 16 13:11:04.045320 containerd[1770]: time="2025-12-16T13:11:04.045199873Z" level=info msg="connecting to shim 1754eb70b759939b4e210b06482499d4adf02380f5660363f64f99830c09ee6a" address="unix:///run/containerd/s/b7485171789240d0a535b66f9fc9b91385c81cd60e75d69c38f10c3e02ec2aef" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:11:04.092537 systemd[1]: Started cri-containerd-1754eb70b759939b4e210b06482499d4adf02380f5660363f64f99830c09ee6a.scope - libcontainer container 1754eb70b759939b4e210b06482499d4adf02380f5660363f64f99830c09ee6a. 
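The reconciler entries above identify each volume by a UniqueName of the form "<plugin>/<pod-UID>-<volume-name>", e.g. "kubernetes.io/host-path/04ee919a-8c94-4d9d-9912-9c8318ee5d20-lib-modules". A small Go sketch of pulling that string apart, assuming the 36-character UUID form used by the kube-proxy and tigera-operator pods here (the static control-plane pods earlier use a 32-character config hash instead, which this sketch does not handle); it is illustrative only, not kubelet's own logic:

// volname.go
package main

import (
	"fmt"
	"strings"
)

const uuidLen = 36 // e.g. 04ee919a-8c94-4d9d-9912-9c8318ee5d20

func splitUniqueName(unique string) (plugin, podUID, volume string, ok bool) {
	// Plugin is everything before the last path separator,
	// e.g. "kubernetes.io/host-path" or "kubernetes.io/projected".
	i := strings.LastIndex(unique, "/")
	if i < 0 || len(unique)-i-1 < uuidLen+2 {
		return "", "", "", false
	}
	plugin, rest := unique[:i], unique[i+1:]
	podUID, volume = rest[:uuidLen], rest[uuidLen+1:] // skip the '-' separator
	return plugin, podUID, volume, true
}

func main() {
	for _, u := range []string{
		"kubernetes.io/host-path/04ee919a-8c94-4d9d-9912-9c8318ee5d20-lib-modules",
		"kubernetes.io/projected/94829de0-01a7-487b-b724-32eef91529c4-kube-api-access-lv8gz",
	} {
		if plugin, uid, vol, ok := splitUniqueName(u); ok {
			fmt.Printf("plugin=%s podUID=%s volume=%s\n", plugin, uid, vol)
		}
	}
}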
Dec 16 13:11:04.112765 containerd[1770]: time="2025-12-16T13:11:04.112659956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-w7l2h,Uid:94829de0-01a7-487b-b724-32eef91529c4,Namespace:tigera-operator,Attempt:0,}" Dec 16 13:11:04.135896 containerd[1770]: time="2025-12-16T13:11:04.135808109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gdqhm,Uid:04ee919a-8c94-4d9d-9912-9c8318ee5d20,Namespace:kube-system,Attempt:0,} returns sandbox id \"1754eb70b759939b4e210b06482499d4adf02380f5660363f64f99830c09ee6a\"" Dec 16 13:11:04.142193 containerd[1770]: time="2025-12-16T13:11:04.142151795Z" level=info msg="connecting to shim 0abbeac5d8ea9ba7fad0432d4cea30fae873e48ba590697c773d466084a3cdf2" address="unix:///run/containerd/s/a7e13e2b9dbdf51efd3ce75133f9e28080cacb647b43e15ac8ddda6b3773a002" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:11:04.143085 containerd[1770]: time="2025-12-16T13:11:04.143022134Z" level=info msg="CreateContainer within sandbox \"1754eb70b759939b4e210b06482499d4adf02380f5660363f64f99830c09ee6a\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 16 13:11:04.164832 containerd[1770]: time="2025-12-16T13:11:04.164774838Z" level=info msg="Container d600640b7a4ca8ac740a4e438ef66e304e42c921e1cd6ba4630519ae9196a51b: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:11:04.175582 systemd[1]: Started cri-containerd-0abbeac5d8ea9ba7fad0432d4cea30fae873e48ba590697c773d466084a3cdf2.scope - libcontainer container 0abbeac5d8ea9ba7fad0432d4cea30fae873e48ba590697c773d466084a3cdf2. Dec 16 13:11:04.176958 containerd[1770]: time="2025-12-16T13:11:04.176863625Z" level=info msg="CreateContainer within sandbox \"1754eb70b759939b4e210b06482499d4adf02380f5660363f64f99830c09ee6a\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d600640b7a4ca8ac740a4e438ef66e304e42c921e1cd6ba4630519ae9196a51b\"" Dec 16 13:11:04.177676 containerd[1770]: time="2025-12-16T13:11:04.177637817Z" level=info msg="StartContainer for \"d600640b7a4ca8ac740a4e438ef66e304e42c921e1cd6ba4630519ae9196a51b\"" Dec 16 13:11:04.179896 containerd[1770]: time="2025-12-16T13:11:04.179856590Z" level=info msg="connecting to shim d600640b7a4ca8ac740a4e438ef66e304e42c921e1cd6ba4630519ae9196a51b" address="unix:///run/containerd/s/b7485171789240d0a535b66f9fc9b91385c81cd60e75d69c38f10c3e02ec2aef" protocol=ttrpc version=3 Dec 16 13:11:04.213565 systemd[1]: Started cri-containerd-d600640b7a4ca8ac740a4e438ef66e304e42c921e1cd6ba4630519ae9196a51b.scope - libcontainer container d600640b7a4ca8ac740a4e438ef66e304e42c921e1cd6ba4630519ae9196a51b. 
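The containerd entries around this point show the order of operations for bringing up kube-proxy-gdqhm: RunPodSandbox returns a sandbox ID, CreateContainer is issued within that sandbox, then StartContainer. The Go sketch below mirrors only that call order; the Runtime interface and fakeRuntime are simplified stand-ins defined locally for illustration, not the real CRI RuntimeService API, and the image name is a placeholder:

// criflow.go
package main

import (
	"errors"
	"fmt"
)

// Runtime is an assumed, trimmed-down view of a CRI-style runtime.
type Runtime interface {
	RunPodSandbox(name, namespace string) (sandboxID string, err error)
	CreateContainer(sandboxID, containerName, image string) (containerID string, err error)
	StartContainer(containerID string) error
}

// startPod follows the sequence visible in the log: sandbox first, then the
// container inside it, then start it.
func startPod(rt Runtime, pod, ns, ctr, image string) (string, error) {
	sandboxID, err := rt.RunPodSandbox(pod, ns)
	if err != nil {
		return "", fmt.Errorf("RunPodSandbox %s/%s: %w", ns, pod, err)
	}
	containerID, err := rt.CreateContainer(sandboxID, ctr, image)
	if err != nil {
		return "", fmt.Errorf("CreateContainer in sandbox %s: %w", sandboxID, err)
	}
	if err := rt.StartContainer(containerID); err != nil {
		return "", fmt.Errorf("StartContainer %s: %w", containerID, err)
	}
	return containerID, nil
}

// fakeRuntime lets the sketch run without a real containerd socket.
type fakeRuntime struct{ n int }

func (f *fakeRuntime) RunPodSandbox(name, ns string) (string, error) {
	f.n++
	return fmt.Sprintf("sandbox-%d", f.n), nil
}

func (f *fakeRuntime) CreateContainer(sb, name, image string) (string, error) {
	if sb == "" {
		return "", errors.New("empty sandbox id")
	}
	f.n++
	return fmt.Sprintf("container-%d", f.n), nil
}

func (f *fakeRuntime) StartContainer(id string) error { return nil }

func main() {
	id, err := startPod(&fakeRuntime{}, "kube-proxy-gdqhm", "kube-system", "kube-proxy", "placeholder/kube-proxy-image")
	fmt.Println(id, err)
}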
Dec 16 13:11:04.242807 containerd[1770]: time="2025-12-16T13:11:04.242759075Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-w7l2h,Uid:94829de0-01a7-487b-b724-32eef91529c4,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"0abbeac5d8ea9ba7fad0432d4cea30fae873e48ba590697c773d466084a3cdf2\"" Dec 16 13:11:04.244619 containerd[1770]: time="2025-12-16T13:11:04.244585809Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 16 13:11:04.294521 containerd[1770]: time="2025-12-16T13:11:04.294392372Z" level=info msg="StartContainer for \"d600640b7a4ca8ac740a4e438ef66e304e42c921e1cd6ba4630519ae9196a51b\" returns successfully" Dec 16 13:11:04.843517 kubelet[3051]: I1216 13:11:04.843424 3051 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-gdqhm" podStartSLOduration=1.84339951 podStartE2EDuration="1.84339951s" podCreationTimestamp="2025-12-16 13:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:11:04.842184144 +0000 UTC m=+8.154806714" watchObservedRunningTime="2025-12-16 13:11:04.84339951 +0000 UTC m=+8.156022080" Dec 16 13:11:06.110640 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount652994574.mount: Deactivated successfully. Dec 16 13:11:06.491709 containerd[1770]: time="2025-12-16T13:11:06.491609954Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:11:06.492754 containerd[1770]: time="2025-12-16T13:11:06.492730753Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691" Dec 16 13:11:06.494352 containerd[1770]: time="2025-12-16T13:11:06.494327802Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:11:06.496854 containerd[1770]: time="2025-12-16T13:11:06.496830941Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:11:06.497346 containerd[1770]: time="2025-12-16T13:11:06.497284379Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.252669245s" Dec 16 13:11:06.497388 containerd[1770]: time="2025-12-16T13:11:06.497346880Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Dec 16 13:11:06.501533 containerd[1770]: time="2025-12-16T13:11:06.501506135Z" level=info msg="CreateContainer within sandbox \"0abbeac5d8ea9ba7fad0432d4cea30fae873e48ba590697c773d466084a3cdf2\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 16 13:11:06.509253 containerd[1770]: time="2025-12-16T13:11:06.509025718Z" level=info msg="Container db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:11:06.516132 containerd[1770]: time="2025-12-16T13:11:06.516099024Z" level=info msg="CreateContainer 
within sandbox \"0abbeac5d8ea9ba7fad0432d4cea30fae873e48ba590697c773d466084a3cdf2\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b\"" Dec 16 13:11:06.516576 containerd[1770]: time="2025-12-16T13:11:06.516537403Z" level=info msg="StartContainer for \"db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b\"" Dec 16 13:11:06.517136 containerd[1770]: time="2025-12-16T13:11:06.517113180Z" level=info msg="connecting to shim db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b" address="unix:///run/containerd/s/a7e13e2b9dbdf51efd3ce75133f9e28080cacb647b43e15ac8ddda6b3773a002" protocol=ttrpc version=3 Dec 16 13:11:06.548576 systemd[1]: Started cri-containerd-db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b.scope - libcontainer container db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b. Dec 16 13:11:06.582446 containerd[1770]: time="2025-12-16T13:11:06.582376594Z" level=info msg="StartContainer for \"db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b\" returns successfully" Dec 16 13:11:06.847909 kubelet[3051]: I1216 13:11:06.847835 3051 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-w7l2h" podStartSLOduration=1.593933324 podStartE2EDuration="3.84781309s" podCreationTimestamp="2025-12-16 13:11:03 +0000 UTC" firstStartedPulling="2025-12-16 13:11:04.24420426 +0000 UTC m=+7.556826829" lastFinishedPulling="2025-12-16 13:11:06.498084053 +0000 UTC m=+9.810706595" observedRunningTime="2025-12-16 13:11:06.847680062 +0000 UTC m=+10.160302645" watchObservedRunningTime="2025-12-16 13:11:06.84781309 +0000 UTC m=+10.160435702" Dec 16 13:11:11.577936 sudo[2044]: pam_unix(sudo:session): session closed for user root Dec 16 13:11:11.732725 sshd[2043]: Connection closed by 147.75.109.163 port 51194 Dec 16 13:11:11.733060 sshd-session[2040]: pam_unix(sshd:session): session closed for user core Dec 16 13:11:11.736963 systemd[1]: sshd@6-10.0.25.207:22-147.75.109.163:51194.service: Deactivated successfully. Dec 16 13:11:11.738753 systemd[1]: session-7.scope: Deactivated successfully. Dec 16 13:11:11.738937 systemd[1]: session-7.scope: Consumed 5.612s CPU time, 235.4M memory peak. Dec 16 13:11:11.739913 systemd-logind[1750]: Session 7 logged out. Waiting for processes to exit. Dec 16 13:11:11.741942 systemd-logind[1750]: Removed session 7. Dec 16 13:11:15.752745 systemd[1]: Created slice kubepods-besteffort-pod9c10681a_7b80_4960_9c55_04a16b4ca78b.slice - libcontainer container kubepods-besteffort-pod9c10681a_7b80_4960_9c55_04a16b4ca78b.slice. 
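The pod_startup_latency_tracker entry for tigera-operator-65cdcdfd6d-w7l2h above reports both a podStartE2EDuration of 3.84781309s and a smaller podStartSLOduration of about 1.594s. A stdlib-only Go sketch checking the arithmetic, assuming (as those numbers suggest) that the SLO duration is the end-to-end duration minus the image-pull window (lastFinishedPulling - firstStartedPulling); the timestamps are copied from the log entry, and the "m=+..." monotonic suffix that Go's time.Time.String() appends is stripped before parsing:

// podstart.go
package main

import (
	"fmt"
	"strings"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST" // Go's time.Time.String() layout

func mustParse(s string) time.Time {
	if i := strings.Index(s, " m="); i >= 0 { // drop monotonic-clock suffix
		s = s[:i]
	}
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	firstStartedPulling := mustParse("2025-12-16 13:11:04.24420426 +0000 UTC m=+7.556826829")
	lastFinishedPulling := mustParse("2025-12-16 13:11:06.498084053 +0000 UTC m=+9.810706595")
	e2e, _ := time.ParseDuration("3.84781309s") // podStartE2EDuration from the log

	pullWindow := lastFinishedPulling.Sub(firstStartedPulling)
	sloApprox := e2e - pullWindow

	fmt.Println("image-pull window:", pullWindow) // ~2.253879793s
	fmt.Println("SLO duration     :", sloApprox)  // ~1.5939s, close to the logged podStartSLOduration
}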
Dec 16 13:11:15.805752 kubelet[3051]: I1216 13:11:15.805669 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/9c10681a-7b80-4960-9c55-04a16b4ca78b-typha-certs\") pod \"calico-typha-5996c95d7d-tmzv4\" (UID: \"9c10681a-7b80-4960-9c55-04a16b4ca78b\") " pod="calico-system/calico-typha-5996c95d7d-tmzv4" Dec 16 13:11:15.805752 kubelet[3051]: I1216 13:11:15.805715 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfjkc\" (UniqueName: \"kubernetes.io/projected/9c10681a-7b80-4960-9c55-04a16b4ca78b-kube-api-access-cfjkc\") pod \"calico-typha-5996c95d7d-tmzv4\" (UID: \"9c10681a-7b80-4960-9c55-04a16b4ca78b\") " pod="calico-system/calico-typha-5996c95d7d-tmzv4" Dec 16 13:11:15.805752 kubelet[3051]: I1216 13:11:15.805740 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c10681a-7b80-4960-9c55-04a16b4ca78b-tigera-ca-bundle\") pod \"calico-typha-5996c95d7d-tmzv4\" (UID: \"9c10681a-7b80-4960-9c55-04a16b4ca78b\") " pod="calico-system/calico-typha-5996c95d7d-tmzv4" Dec 16 13:11:15.945003 systemd[1]: Created slice kubepods-besteffort-podf01db285_4c53_4fdc_a343_5930169cd3f0.slice - libcontainer container kubepods-besteffort-podf01db285_4c53_4fdc_a343_5930169cd3f0.slice. Dec 16 13:11:16.007315 kubelet[3051]: I1216 13:11:16.007193 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f01db285-4c53-4fdc-a343-5930169cd3f0-tigera-ca-bundle\") pod \"calico-node-x8b4q\" (UID: \"f01db285-4c53-4fdc-a343-5930169cd3f0\") " pod="calico-system/calico-node-x8b4q" Dec 16 13:11:16.007315 kubelet[3051]: I1216 13:11:16.007235 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/f01db285-4c53-4fdc-a343-5930169cd3f0-cni-log-dir\") pod \"calico-node-x8b4q\" (UID: \"f01db285-4c53-4fdc-a343-5930169cd3f0\") " pod="calico-system/calico-node-x8b4q" Dec 16 13:11:16.007315 kubelet[3051]: I1216 13:11:16.007252 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/f01db285-4c53-4fdc-a343-5930169cd3f0-cni-bin-dir\") pod \"calico-node-x8b4q\" (UID: \"f01db285-4c53-4fdc-a343-5930169cd3f0\") " pod="calico-system/calico-node-x8b4q" Dec 16 13:11:16.007315 kubelet[3051]: I1216 13:11:16.007265 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f01db285-4c53-4fdc-a343-5930169cd3f0-var-lib-calico\") pod \"calico-node-x8b4q\" (UID: \"f01db285-4c53-4fdc-a343-5930169cd3f0\") " pod="calico-system/calico-node-x8b4q" Dec 16 13:11:16.007315 kubelet[3051]: I1216 13:11:16.007281 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zf8v\" (UniqueName: \"kubernetes.io/projected/f01db285-4c53-4fdc-a343-5930169cd3f0-kube-api-access-2zf8v\") pod \"calico-node-x8b4q\" (UID: \"f01db285-4c53-4fdc-a343-5930169cd3f0\") " pod="calico-system/calico-node-x8b4q" Dec 16 13:11:16.007523 kubelet[3051]: I1216 13:11:16.007307 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/f01db285-4c53-4fdc-a343-5930169cd3f0-flexvol-driver-host\") pod \"calico-node-x8b4q\" (UID: \"f01db285-4c53-4fdc-a343-5930169cd3f0\") " pod="calico-system/calico-node-x8b4q" Dec 16 13:11:16.007523 kubelet[3051]: I1216 13:11:16.007323 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/f01db285-4c53-4fdc-a343-5930169cd3f0-node-certs\") pod \"calico-node-x8b4q\" (UID: \"f01db285-4c53-4fdc-a343-5930169cd3f0\") " pod="calico-system/calico-node-x8b4q" Dec 16 13:11:16.007523 kubelet[3051]: I1216 13:11:16.007337 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/f01db285-4c53-4fdc-a343-5930169cd3f0-policysync\") pod \"calico-node-x8b4q\" (UID: \"f01db285-4c53-4fdc-a343-5930169cd3f0\") " pod="calico-system/calico-node-x8b4q" Dec 16 13:11:16.007523 kubelet[3051]: I1216 13:11:16.007350 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/f01db285-4c53-4fdc-a343-5930169cd3f0-var-run-calico\") pod \"calico-node-x8b4q\" (UID: \"f01db285-4c53-4fdc-a343-5930169cd3f0\") " pod="calico-system/calico-node-x8b4q" Dec 16 13:11:16.007523 kubelet[3051]: I1216 13:11:16.007368 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/f01db285-4c53-4fdc-a343-5930169cd3f0-cni-net-dir\") pod \"calico-node-x8b4q\" (UID: \"f01db285-4c53-4fdc-a343-5930169cd3f0\") " pod="calico-system/calico-node-x8b4q" Dec 16 13:11:16.007627 kubelet[3051]: I1216 13:11:16.007381 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f01db285-4c53-4fdc-a343-5930169cd3f0-lib-modules\") pod \"calico-node-x8b4q\" (UID: \"f01db285-4c53-4fdc-a343-5930169cd3f0\") " pod="calico-system/calico-node-x8b4q" Dec 16 13:11:16.007627 kubelet[3051]: I1216 13:11:16.007394 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f01db285-4c53-4fdc-a343-5930169cd3f0-xtables-lock\") pod \"calico-node-x8b4q\" (UID: \"f01db285-4c53-4fdc-a343-5930169cd3f0\") " pod="calico-system/calico-node-x8b4q" Dec 16 13:11:16.061387 containerd[1770]: time="2025-12-16T13:11:16.061321334Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5996c95d7d-tmzv4,Uid:9c10681a-7b80-4960-9c55-04a16b4ca78b,Namespace:calico-system,Attempt:0,}" Dec 16 13:11:16.094732 containerd[1770]: time="2025-12-16T13:11:16.094675815Z" level=info msg="connecting to shim 52a2d0c00998b2db58653c2be14507f2efb6d82cde7dd59c9a41064c49103103" address="unix:///run/containerd/s/70d80bbd625e99ddcaed66035bcc85d020884b5bd19d6553b0551065d64fa266" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:11:16.110090 kubelet[3051]: E1216 13:11:16.110061 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.110090 kubelet[3051]: W1216 13:11:16.110084 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 
13:11:16.110235 kubelet[3051]: E1216 13:11:16.110105 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.116322 kubelet[3051]: E1216 13:11:16.114360 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.116322 kubelet[3051]: W1216 13:11:16.114382 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.116322 kubelet[3051]: E1216 13:11:16.114400 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.120566 kubelet[3051]: E1216 13:11:16.120533 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.120566 kubelet[3051]: W1216 13:11:16.120550 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.120566 kubelet[3051]: E1216 13:11:16.120575 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.132547 systemd[1]: Started cri-containerd-52a2d0c00998b2db58653c2be14507f2efb6d82cde7dd59c9a41064c49103103.scope - libcontainer container 52a2d0c00998b2db58653c2be14507f2efb6d82cde7dd59c9a41064c49103103. Dec 16 13:11:16.137883 kubelet[3051]: E1216 13:11:16.137851 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:11:16.185485 containerd[1770]: time="2025-12-16T13:11:16.185440112Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5996c95d7d-tmzv4,Uid:9c10681a-7b80-4960-9c55-04a16b4ca78b,Namespace:calico-system,Attempt:0,} returns sandbox id \"52a2d0c00998b2db58653c2be14507f2efb6d82cde7dd59c9a41064c49103103\"" Dec 16 13:11:16.186845 containerd[1770]: time="2025-12-16T13:11:16.186814503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 16 13:11:16.193665 kubelet[3051]: E1216 13:11:16.193633 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.193665 kubelet[3051]: W1216 13:11:16.193657 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.193665 kubelet[3051]: E1216 13:11:16.193676 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:11:16.193909 kubelet[3051]: E1216 13:11:16.193834 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.193909 kubelet[3051]: W1216 13:11:16.193840 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.193909 kubelet[3051]: E1216 13:11:16.193847 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.194005 kubelet[3051]: E1216 13:11:16.193996 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.194005 kubelet[3051]: W1216 13:11:16.194003 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.194077 kubelet[3051]: E1216 13:11:16.194010 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.194202 kubelet[3051]: E1216 13:11:16.194193 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.194202 kubelet[3051]: W1216 13:11:16.194201 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.194263 kubelet[3051]: E1216 13:11:16.194208 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.194387 kubelet[3051]: E1216 13:11:16.194358 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.194387 kubelet[3051]: W1216 13:11:16.194365 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.194387 kubelet[3051]: E1216 13:11:16.194383 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.195103 kubelet[3051]: E1216 13:11:16.194538 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.195103 kubelet[3051]: W1216 13:11:16.194545 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.195103 kubelet[3051]: E1216 13:11:16.194552 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:11:16.195103 kubelet[3051]: E1216 13:11:16.194675 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.195103 kubelet[3051]: W1216 13:11:16.194681 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.195103 kubelet[3051]: E1216 13:11:16.194687 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.195103 kubelet[3051]: E1216 13:11:16.194805 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.195103 kubelet[3051]: W1216 13:11:16.194810 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.195103 kubelet[3051]: E1216 13:11:16.194816 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.195103 kubelet[3051]: E1216 13:11:16.194936 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.195388 kubelet[3051]: W1216 13:11:16.194941 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.195388 kubelet[3051]: E1216 13:11:16.194946 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.195388 kubelet[3051]: E1216 13:11:16.195061 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.195388 kubelet[3051]: W1216 13:11:16.195066 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.195388 kubelet[3051]: E1216 13:11:16.195071 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.195388 kubelet[3051]: E1216 13:11:16.195184 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.195388 kubelet[3051]: W1216 13:11:16.195188 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.195388 kubelet[3051]: E1216 13:11:16.195194 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:11:16.195388 kubelet[3051]: E1216 13:11:16.195320 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.195388 kubelet[3051]: W1216 13:11:16.195325 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.195639 kubelet[3051]: E1216 13:11:16.195330 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.195639 kubelet[3051]: E1216 13:11:16.195444 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.195639 kubelet[3051]: W1216 13:11:16.195450 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.195639 kubelet[3051]: E1216 13:11:16.195456 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.195639 kubelet[3051]: E1216 13:11:16.195587 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.195639 kubelet[3051]: W1216 13:11:16.195592 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.195639 kubelet[3051]: E1216 13:11:16.195598 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.195797 kubelet[3051]: E1216 13:11:16.195710 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.195797 kubelet[3051]: W1216 13:11:16.195715 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.195797 kubelet[3051]: E1216 13:11:16.195720 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.195871 kubelet[3051]: E1216 13:11:16.195835 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.195871 kubelet[3051]: W1216 13:11:16.195841 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.195871 kubelet[3051]: E1216 13:11:16.195847 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:11:16.196030 kubelet[3051]: E1216 13:11:16.195981 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.196030 kubelet[3051]: W1216 13:11:16.195987 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.196030 kubelet[3051]: E1216 13:11:16.195992 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.196109 kubelet[3051]: E1216 13:11:16.196100 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.196109 kubelet[3051]: W1216 13:11:16.196109 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.196156 kubelet[3051]: E1216 13:11:16.196114 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.196226 kubelet[3051]: E1216 13:11:16.196218 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.196226 kubelet[3051]: W1216 13:11:16.196225 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.196282 kubelet[3051]: E1216 13:11:16.196230 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.196353 kubelet[3051]: E1216 13:11:16.196345 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.196353 kubelet[3051]: W1216 13:11:16.196352 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.196402 kubelet[3051]: E1216 13:11:16.196357 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.209816 kubelet[3051]: E1216 13:11:16.209740 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.209816 kubelet[3051]: W1216 13:11:16.209759 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.209816 kubelet[3051]: E1216 13:11:16.209776 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:11:16.209816 kubelet[3051]: I1216 13:11:16.209798 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/49b26efe-3f65-4cf1-9ebd-006f8e8a22f9-socket-dir\") pod \"csi-node-driver-mvzk2\" (UID: \"49b26efe-3f65-4cf1-9ebd-006f8e8a22f9\") " pod="calico-system/csi-node-driver-mvzk2" Dec 16 13:11:16.210013 kubelet[3051]: E1216 13:11:16.209953 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.210013 kubelet[3051]: W1216 13:11:16.209960 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.210013 kubelet[3051]: E1216 13:11:16.209967 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.210013 kubelet[3051]: I1216 13:11:16.209980 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49b26efe-3f65-4cf1-9ebd-006f8e8a22f9-kubelet-dir\") pod \"csi-node-driver-mvzk2\" (UID: \"49b26efe-3f65-4cf1-9ebd-006f8e8a22f9\") " pod="calico-system/csi-node-driver-mvzk2" Dec 16 13:11:16.210161 kubelet[3051]: E1216 13:11:16.210137 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.210161 kubelet[3051]: W1216 13:11:16.210145 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.210161 kubelet[3051]: E1216 13:11:16.210151 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.210234 kubelet[3051]: I1216 13:11:16.210162 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/49b26efe-3f65-4cf1-9ebd-006f8e8a22f9-registration-dir\") pod \"csi-node-driver-mvzk2\" (UID: \"49b26efe-3f65-4cf1-9ebd-006f8e8a22f9\") " pod="calico-system/csi-node-driver-mvzk2" Dec 16 13:11:16.210359 kubelet[3051]: E1216 13:11:16.210347 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.210359 kubelet[3051]: W1216 13:11:16.210357 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.210415 kubelet[3051]: E1216 13:11:16.210377 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:11:16.210415 kubelet[3051]: I1216 13:11:16.210406 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/49b26efe-3f65-4cf1-9ebd-006f8e8a22f9-varrun\") pod \"csi-node-driver-mvzk2\" (UID: \"49b26efe-3f65-4cf1-9ebd-006f8e8a22f9\") " pod="calico-system/csi-node-driver-mvzk2" Dec 16 13:11:16.210699 kubelet[3051]: E1216 13:11:16.210665 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.211033 kubelet[3051]: W1216 13:11:16.210705 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.211033 kubelet[3051]: E1216 13:11:16.210714 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.211033 kubelet[3051]: I1216 13:11:16.210728 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxhpv\" (UniqueName: \"kubernetes.io/projected/49b26efe-3f65-4cf1-9ebd-006f8e8a22f9-kube-api-access-cxhpv\") pod \"csi-node-driver-mvzk2\" (UID: \"49b26efe-3f65-4cf1-9ebd-006f8e8a22f9\") " pod="calico-system/csi-node-driver-mvzk2" Dec 16 13:11:16.211033 kubelet[3051]: E1216 13:11:16.210892 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.211033 kubelet[3051]: W1216 13:11:16.210898 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.211033 kubelet[3051]: E1216 13:11:16.210905 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.211746 kubelet[3051]: E1216 13:11:16.211045 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.211746 kubelet[3051]: W1216 13:11:16.211049 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.211746 kubelet[3051]: E1216 13:11:16.211055 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.211746 kubelet[3051]: E1216 13:11:16.211207 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.211746 kubelet[3051]: W1216 13:11:16.211213 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.211746 kubelet[3051]: E1216 13:11:16.211220 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:11:16.211746 kubelet[3051]: E1216 13:11:16.211358 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.211746 kubelet[3051]: W1216 13:11:16.211363 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.211746 kubelet[3051]: E1216 13:11:16.211369 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.211746 kubelet[3051]: E1216 13:11:16.211492 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.211992 kubelet[3051]: W1216 13:11:16.211497 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.211992 kubelet[3051]: E1216 13:11:16.211502 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.211992 kubelet[3051]: E1216 13:11:16.211618 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.211992 kubelet[3051]: W1216 13:11:16.211625 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.211992 kubelet[3051]: E1216 13:11:16.211631 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.211992 kubelet[3051]: E1216 13:11:16.211754 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.211992 kubelet[3051]: W1216 13:11:16.211759 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.211992 kubelet[3051]: E1216 13:11:16.211764 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.211992 kubelet[3051]: E1216 13:11:16.211902 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.211992 kubelet[3051]: W1216 13:11:16.211907 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.212191 kubelet[3051]: E1216 13:11:16.211913 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:11:16.212191 kubelet[3051]: E1216 13:11:16.212077 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.212191 kubelet[3051]: W1216 13:11:16.212083 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.212191 kubelet[3051]: E1216 13:11:16.212089 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.212264 kubelet[3051]: E1216 13:11:16.212260 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.212283 kubelet[3051]: W1216 13:11:16.212265 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.212283 kubelet[3051]: E1216 13:11:16.212272 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.252151 containerd[1770]: time="2025-12-16T13:11:16.252082311Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-x8b4q,Uid:f01db285-4c53-4fdc-a343-5930169cd3f0,Namespace:calico-system,Attempt:0,}" Dec 16 13:11:16.277958 containerd[1770]: time="2025-12-16T13:11:16.277854960Z" level=info msg="connecting to shim c292fe828bcbb00615c372a1fe5594d8a9193821251933e77e29d90715d44e84" address="unix:///run/containerd/s/129ab271380e0a1cb8df0a0a6c1868897f084e5d865bcbf56f669ac4229be9dd" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:11:16.305550 systemd[1]: Started cri-containerd-c292fe828bcbb00615c372a1fe5594d8a9193821251933e77e29d90715d44e84.scope - libcontainer container c292fe828bcbb00615c372a1fe5594d8a9193821251933e77e29d90715d44e84. Dec 16 13:11:16.311767 kubelet[3051]: E1216 13:11:16.311742 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.311767 kubelet[3051]: W1216 13:11:16.311760 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.311901 kubelet[3051]: E1216 13:11:16.311780 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:11:16.311978 kubelet[3051]: E1216 13:11:16.311969 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:11:16.312006 kubelet[3051]: W1216 13:11:16.311978 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:11:16.312006 kubelet[3051]: E1216 13:11:16.311985 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Dec 16 13:11:16.312156 kubelet[3051]: E1216 13:11:16.312148 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 13:11:16.312156 kubelet[3051]: W1216 13:11:16.312156 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 13:11:16.312202 kubelet[3051]: E1216 13:11:16.312162 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[... the same driver-call.go:262 / driver-call.go:149 / plugins.go:697 message sequence repeats with identical content through Dec 16 13:11:16.324571 ...]
Dec 16 13:11:16.328712 containerd[1770]: time="2025-12-16T13:11:16.328683975Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-x8b4q,Uid:f01db285-4c53-4fdc-a343-5930169cd3f0,Namespace:calico-system,Attempt:0,} returns sandbox id \"c292fe828bcbb00615c372a1fe5594d8a9193821251933e77e29d90715d44e84\""
Dec 16 13:11:17.767148 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4055451965.mount: Deactivated successfully.
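The repeated driver-call.go / plugins.go messages above are the kubelet probing the FlexVolume plugin directory nodeagent~uds: it execs the driver binary with the init argument and decodes a JSON status from its output, so a missing binary produces empty output and the "unexpected end of JSON input" unmarshal error. Below is a minimal, self-contained Go sketch of that exec-and-unmarshal pattern; the driverStatus struct and probeDriver function are illustrative, not the kubelet's actual types.

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus mirrors the shape of the JSON a FlexVolume driver is expected
// to print for "init" (illustrative; trimmed to what this sketch needs).
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

// probeDriver execs the driver with "init" and decodes its stdout — the same
// call-and-unmarshal pattern the log lines above report failing.
func probeDriver(driverPath string) (*driverStatus, error) {
	out, err := exec.Command(driverPath, "init").CombinedOutput()
	if err != nil {
		// fails when the driver binary is absent (cf. the "$PATH" error in the log)
		fmt.Printf("driver call failed: %v, output: %q\n", err, string(out))
	}
	var st driverStatus
	if jsonErr := json.Unmarshal(out, &st); jsonErr != nil {
		// empty output decodes to "unexpected end of JSON input"
		return nil, fmt.Errorf("failed to unmarshal output for command: init, output: %q, error: %v", string(out), jsonErr)
	}
	return &st, nil
}

func main() {
	// Same path as in the log; on this node the binary does not exist yet.
	if _, err := probeDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"); err != nil {
		fmt.Println(err)
	}
}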
Dec 16 13:11:17.795708 kubelet[3051]: E1216 13:11:17.795344 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9"
Dec 16 13:11:18.710759 containerd[1770]: time="2025-12-16T13:11:18.710692713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 13:11:18.713364 containerd[1770]: time="2025-12-16T13:11:18.713332463Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628"
Dec 16 13:11:18.714971 containerd[1770]: time="2025-12-16T13:11:18.714937795Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 13:11:18.717193 containerd[1770]: time="2025-12-16T13:11:18.717117953Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 13:11:18.717857 containerd[1770]: time="2025-12-16T13:11:18.717476402Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.530632724s"
Dec 16 13:11:18.717857 containerd[1770]: time="2025-12-16T13:11:18.717508995Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\""
Dec 16 13:11:18.719708 containerd[1770]: time="2025-12-16T13:11:18.719333399Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\""
Dec 16 13:11:18.729441 containerd[1770]: time="2025-12-16T13:11:18.729411080Z" level=info msg="CreateContainer within sandbox \"52a2d0c00998b2db58653c2be14507f2efb6d82cde7dd59c9a41064c49103103\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Dec 16 13:11:18.739512 containerd[1770]: time="2025-12-16T13:11:18.739436126Z" level=info msg="Container 9cc1528219f32a69a850312d59c8be6bae5002841cfd5b8aeff65518e6bf53ce: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:11:18.750097 containerd[1770]: time="2025-12-16T13:11:18.750044196Z" level=info msg="CreateContainer within sandbox \"52a2d0c00998b2db58653c2be14507f2efb6d82cde7dd59c9a41064c49103103\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"9cc1528219f32a69a850312d59c8be6bae5002841cfd5b8aeff65518e6bf53ce\""
Dec 16 13:11:18.751108 containerd[1770]: time="2025-12-16T13:11:18.750661469Z" level=info msg="StartContainer for \"9cc1528219f32a69a850312d59c8be6bae5002841cfd5b8aeff65518e6bf53ce\""
Dec 16 13:11:18.751638 containerd[1770]: time="2025-12-16T13:11:18.751618696Z" level=info msg="connecting to shim 9cc1528219f32a69a850312d59c8be6bae5002841cfd5b8aeff65518e6bf53ce" address="unix:///run/containerd/s/70d80bbd625e99ddcaed66035bcc85d020884b5bd19d6553b0551065d64fa266" protocol=ttrpc version=3
Dec 16 13:11:18.779492 systemd[1]: Started cri-containerd-9cc1528219f32a69a850312d59c8be6bae5002841cfd5b8aeff65518e6bf53ce.scope - libcontainer container 9cc1528219f32a69a850312d59c8be6bae5002841cfd5b8aeff65518e6bf53ce.
Dec 16 13:11:18.827504 containerd[1770]: time="2025-12-16T13:11:18.827460726Z" level=info msg="StartContainer for \"9cc1528219f32a69a850312d59c8be6bae5002841cfd5b8aeff65518e6bf53ce\" returns successfully"
Dec 16 13:11:18.867264 kubelet[3051]: I1216 13:11:18.867206 3051 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5996c95d7d-tmzv4" podStartSLOduration=1.334445372 podStartE2EDuration="3.867192383s" podCreationTimestamp="2025-12-16 13:11:15 +0000 UTC" firstStartedPulling="2025-12-16 13:11:16.186469358 +0000 UTC m=+19.499091901" lastFinishedPulling="2025-12-16 13:11:18.719216368 +0000 UTC m=+22.031838912" observedRunningTime="2025-12-16 13:11:18.867091704 +0000 UTC m=+22.179714290" watchObservedRunningTime="2025-12-16 13:11:18.867192383 +0000 UTC m=+22.179814949"
Dec 16 13:11:18.915806 kubelet[3051]: E1216 13:11:18.915775 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 13:11:18.915806 kubelet[3051]: W1216 13:11:18.915799 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 13:11:18.915806 kubelet[3051]: E1216 13:11:18.915819 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
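The containerd entries above are the CRI image-pull and container-start path: the "Pulled image ... in 2.530632724s" line is the wall-clock pull of ghcr.io/flatcar/calico/typha:v3.30.4, and that pull window (lastFinishedPulling minus firstStartedPulling, about 2.53 s) is exactly the gap between podStartE2EDuration (3.867 s) and podStartSLOduration (1.334 s) in the pod_startup_latency_tracker line. The following is a rough Go sketch of pulling the same image directly with the public containerd client, assuming access to the node's containerd socket and the k8s.io namespace; it is illustrative, not how the CRI plugin itself is wired.

package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Connect to the node-local containerd socket (assumed default path).
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Kubelet/CRI images live in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	ref := "ghcr.io/flatcar/calico/typha:v3.30.4"
	start := time.Now()
	img, err := client.Pull(ctx, ref, containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	size, err := img.Size(ctx)
	if err != nil {
		log.Fatal(err)
	}
	// Roughly the same information the "Pulled image ..." log line reports.
	fmt.Printf("pulled %s (%s, %d bytes) in %s\n", img.Name(), img.Target().Digest, size, time.Since(start))
}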
[... the kubelet's driver-call.go:262 / driver-call.go:149 / plugins.go:697 FlexVolume probe messages repeat with identical content from Dec 16 13:11:18.917423 through Dec 16 13:11:18.934223 ...]
Dec 16 13:11:19.795661 kubelet[3051]: E1216 13:11:19.795479 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9"
Dec 16 13:11:19.859896 kubelet[3051]: I1216 13:11:19.859872 3051 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 16 13:11:19.928989 kubelet[3051]: E1216 13:11:19.928950 3051 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 13:11:19.928989 kubelet[3051]: W1216 13:11:19.928984 3051 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 13:11:19.929384 kubelet[3051]: E1216 13:11:19.929013 3051 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[... the same driver-call.go:262 / driver-call.go:149 / plugins.go:697 message sequence repeats with identical content through Dec 16 13:11:19.947397 ...]
Error: unexpected end of JSON input" Dec 16 13:11:20.350614 containerd[1770]: time="2025-12-16T13:11:20.350507520Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:11:20.352852 containerd[1770]: time="2025-12-16T13:11:20.352814421Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754" Dec 16 13:11:20.355030 containerd[1770]: time="2025-12-16T13:11:20.354988603Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:11:20.357539 containerd[1770]: time="2025-12-16T13:11:20.357451287Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:11:20.357957 containerd[1770]: time="2025-12-16T13:11:20.357858028Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.638499807s" Dec 16 13:11:20.357957 containerd[1770]: time="2025-12-16T13:11:20.357890396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Dec 16 13:11:20.363315 containerd[1770]: time="2025-12-16T13:11:20.363254474Z" level=info msg="CreateContainer within sandbox \"c292fe828bcbb00615c372a1fe5594d8a9193821251933e77e29d90715d44e84\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 13:11:20.378484 containerd[1770]: time="2025-12-16T13:11:20.378265222Z" level=info msg="Container df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:11:20.391261 containerd[1770]: time="2025-12-16T13:11:20.391182269Z" level=info msg="CreateContainer within sandbox \"c292fe828bcbb00615c372a1fe5594d8a9193821251933e77e29d90715d44e84\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785\"" Dec 16 13:11:20.391743 containerd[1770]: time="2025-12-16T13:11:20.391722410Z" level=info msg="StartContainer for \"df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785\"" Dec 16 13:11:20.393002 containerd[1770]: time="2025-12-16T13:11:20.392958703Z" level=info msg="connecting to shim df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785" address="unix:///run/containerd/s/129ab271380e0a1cb8df0a0a6c1868897f084e5d865bcbf56f669ac4229be9dd" protocol=ttrpc version=3 Dec 16 13:11:20.421630 systemd[1]: Started cri-containerd-df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785.scope - libcontainer container df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785. 
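The burst of kubelet warnings above is the FlexVolume dynamic probe: the kubelet scans /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, runs each driver binary with the `init` argument, and expects a JSON status object on stdout. The `nodeagent~uds/uds` binary is not on the node yet (the `flexvol-driver` init container created from the pod2daemon-flexvol image just above is what deploys it), so the exec fails, stdout stays empty, and decoding the empty reply yields "unexpected end of JSON input". A minimal Go sketch of that probe, using the driver path from the log; this is an illustration, not the kubelet's actual driver-call.go:

```go
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// DriverStatus mirrors the JSON object a FlexVolume driver must print on stdout.
type DriverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

// probeDriver runs "<driver> init" the way the dynamic plugin probe does and
// decodes the reply. With the binary missing the exec itself fails; with an
// empty reply json.Unmarshal fails with "unexpected end of JSON input" -- the
// two errors interleaved in the log above.
func probeDriver(path string) (*DriverStatus, error) {
	out, err := exec.Command(path, "init").CombinedOutput()
	if err != nil {
		return nil, fmt.Errorf("driver call failed: %w, output: %q", err, out)
	}
	var st DriverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		return nil, fmt.Errorf("failed to unmarshal output %q: %w", out, err)
	}
	return &st, nil
}

func main() {
	st, err := probeDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds")
	fmt.Println(st, err)
}
```

Once the init container has copied the uds binary into that directory, the same probe should return a `{"status":"Success",...}` object and the repeated warnings stop.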
Dec 16 13:11:20.516259 containerd[1770]: time="2025-12-16T13:11:20.516022602Z" level=info msg="StartContainer for \"df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785\" returns successfully" Dec 16 13:11:20.524883 systemd[1]: cri-containerd-df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785.scope: Deactivated successfully. Dec 16 13:11:20.528160 containerd[1770]: time="2025-12-16T13:11:20.528106315Z" level=info msg="received container exit event container_id:\"df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785\" id:\"df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785\" pid:3815 exited_at:{seconds:1765890680 nanos:527579739}" Dec 16 13:11:20.554225 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785-rootfs.mount: Deactivated successfully. Dec 16 13:11:20.866905 containerd[1770]: time="2025-12-16T13:11:20.865773105Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 16 13:11:21.795958 kubelet[3051]: E1216 13:11:21.795872 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:11:23.796490 kubelet[3051]: E1216 13:11:23.796321 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:11:24.441905 containerd[1770]: time="2025-12-16T13:11:24.441864557Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:11:24.443382 containerd[1770]: time="2025-12-16T13:11:24.443201337Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859" Dec 16 13:11:24.445377 containerd[1770]: time="2025-12-16T13:11:24.445357416Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:11:24.448073 containerd[1770]: time="2025-12-16T13:11:24.448049134Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:11:24.448535 containerd[1770]: time="2025-12-16T13:11:24.448515516Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 3.582703027s" Dec 16 13:11:24.448600 containerd[1770]: time="2025-12-16T13:11:24.448589510Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Dec 16 13:11:24.453363 containerd[1770]: time="2025-12-16T13:11:24.453049685Z" level=info msg="CreateContainer within sandbox 
\"c292fe828bcbb00615c372a1fe5594d8a9193821251933e77e29d90715d44e84\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 16 13:11:24.469135 containerd[1770]: time="2025-12-16T13:11:24.469097148Z" level=info msg="Container 0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:11:24.481573 containerd[1770]: time="2025-12-16T13:11:24.481525739Z" level=info msg="CreateContainer within sandbox \"c292fe828bcbb00615c372a1fe5594d8a9193821251933e77e29d90715d44e84\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724\"" Dec 16 13:11:24.482041 containerd[1770]: time="2025-12-16T13:11:24.481996785Z" level=info msg="StartContainer for \"0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724\"" Dec 16 13:11:24.485412 containerd[1770]: time="2025-12-16T13:11:24.485371259Z" level=info msg="connecting to shim 0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724" address="unix:///run/containerd/s/129ab271380e0a1cb8df0a0a6c1868897f084e5d865bcbf56f669ac4229be9dd" protocol=ttrpc version=3 Dec 16 13:11:24.518527 systemd[1]: Started cri-containerd-0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724.scope - libcontainer container 0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724. Dec 16 13:11:24.610248 containerd[1770]: time="2025-12-16T13:11:24.610213256Z" level=info msg="StartContainer for \"0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724\" returns successfully" Dec 16 13:11:25.049984 containerd[1770]: time="2025-12-16T13:11:25.049882232Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 13:11:25.051993 systemd[1]: cri-containerd-0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724.scope: Deactivated successfully. Dec 16 13:11:25.052378 systemd[1]: cri-containerd-0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724.scope: Consumed 596ms CPU time, 190.8M memory peak, 171.3M written to disk. Dec 16 13:11:25.053247 containerd[1770]: time="2025-12-16T13:11:25.053191083Z" level=info msg="received container exit event container_id:\"0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724\" id:\"0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724\" pid:3880 exited_at:{seconds:1765890685 nanos:52877825}" Dec 16 13:11:25.068196 kubelet[3051]: I1216 13:11:25.068170 3051 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Dec 16 13:11:25.082979 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724-rootfs.mount: Deactivated successfully. Dec 16 13:11:25.141148 systemd[1]: Created slice kubepods-burstable-pod4fc441bf_8af0_4751_bda4_15b327d19263.slice - libcontainer container kubepods-burstable-pod4fc441bf_8af0_4751_bda4_15b327d19263.slice. Dec 16 13:11:25.150501 systemd[1]: Created slice kubepods-besteffort-podcfbcb130_7336_4c20_a673_9988ffd0b461.slice - libcontainer container kubepods-besteffort-podcfbcb130_7336_4c20_a673_9988ffd0b461.slice. 
Dec 16 13:11:25.158028 systemd[1]: Created slice kubepods-besteffort-pod107a6fd4_b5f5_4db7_901c_63c4db9f0b7f.slice - libcontainer container kubepods-besteffort-pod107a6fd4_b5f5_4db7_901c_63c4db9f0b7f.slice. Dec 16 13:11:25.164898 systemd[1]: Created slice kubepods-burstable-podad65a649_0849_4bfa_8148_c5f6ec6669c2.slice - libcontainer container kubepods-burstable-podad65a649_0849_4bfa_8148_c5f6ec6669c2.slice. Dec 16 13:11:25.171655 systemd[1]: Created slice kubepods-besteffort-podf1946944_0457_47c9_82f5_a0302117750a.slice - libcontainer container kubepods-besteffort-podf1946944_0457_47c9_82f5_a0302117750a.slice. Dec 16 13:11:25.176766 systemd[1]: Created slice kubepods-besteffort-pod7fc3c8ab_a97b_45a2_9167_06f605649e74.slice - libcontainer container kubepods-besteffort-pod7fc3c8ab_a97b_45a2_9167_06f605649e74.slice. Dec 16 13:11:25.180398 systemd[1]: Created slice kubepods-besteffort-poddb81499e_70ee_42e1_9fb9_a69e20146fbb.slice - libcontainer container kubepods-besteffort-poddb81499e_70ee_42e1_9fb9_a69e20146fbb.slice. Dec 16 13:11:25.182375 kubelet[3051]: I1216 13:11:25.182347 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gstqx\" (UniqueName: \"kubernetes.io/projected/cfbcb130-7336-4c20-a673-9988ffd0b461-kube-api-access-gstqx\") pod \"calico-apiserver-685f7c88c8-zm2j4\" (UID: \"cfbcb130-7336-4c20-a673-9988ffd0b461\") " pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" Dec 16 13:11:25.182522 kubelet[3051]: I1216 13:11:25.182387 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7jr8\" (UniqueName: \"kubernetes.io/projected/4fc441bf-8af0-4751-bda4-15b327d19263-kube-api-access-x7jr8\") pod \"coredns-66bc5c9577-2kbtr\" (UID: \"4fc441bf-8af0-4751-bda4-15b327d19263\") " pod="kube-system/coredns-66bc5c9577-2kbtr" Dec 16 13:11:25.182522 kubelet[3051]: I1216 13:11:25.182407 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad65a649-0849-4bfa-8148-c5f6ec6669c2-config-volume\") pod \"coredns-66bc5c9577-7dkx7\" (UID: \"ad65a649-0849-4bfa-8148-c5f6ec6669c2\") " pod="kube-system/coredns-66bc5c9577-7dkx7" Dec 16 13:11:25.182522 kubelet[3051]: I1216 13:11:25.182447 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7fc3c8ab-a97b-45a2-9167-06f605649e74-calico-apiserver-certs\") pod \"calico-apiserver-685f7c88c8-9xxz2\" (UID: \"7fc3c8ab-a97b-45a2-9167-06f605649e74\") " pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" Dec 16 13:11:25.182522 kubelet[3051]: I1216 13:11:25.182480 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwgfb\" (UniqueName: \"kubernetes.io/projected/7fc3c8ab-a97b-45a2-9167-06f605649e74-kube-api-access-pwgfb\") pod \"calico-apiserver-685f7c88c8-9xxz2\" (UID: \"7fc3c8ab-a97b-45a2-9167-06f605649e74\") " pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" Dec 16 13:11:25.182522 kubelet[3051]: I1216 13:11:25.182500 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db81499e-70ee-42e1-9fb9-a69e20146fbb-config\") pod \"goldmane-7c778bb748-sfvmc\" (UID: \"db81499e-70ee-42e1-9fb9-a69e20146fbb\") " pod="calico-system/goldmane-7c778bb748-sfvmc" Dec 16 
13:11:25.182708 kubelet[3051]: I1216 13:11:25.182523 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1946944-0457-47c9-82f5-a0302117750a-tigera-ca-bundle\") pod \"calico-kube-controllers-5d8969c677-gf2lj\" (UID: \"f1946944-0457-47c9-82f5-a0302117750a\") " pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" Dec 16 13:11:25.182708 kubelet[3051]: I1216 13:11:25.182537 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnk5w\" (UniqueName: \"kubernetes.io/projected/f1946944-0457-47c9-82f5-a0302117750a-kube-api-access-jnk5w\") pod \"calico-kube-controllers-5d8969c677-gf2lj\" (UID: \"f1946944-0457-47c9-82f5-a0302117750a\") " pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" Dec 16 13:11:25.182708 kubelet[3051]: I1216 13:11:25.182564 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2shpc\" (UniqueName: \"kubernetes.io/projected/db81499e-70ee-42e1-9fb9-a69e20146fbb-kube-api-access-2shpc\") pod \"goldmane-7c778bb748-sfvmc\" (UID: \"db81499e-70ee-42e1-9fb9-a69e20146fbb\") " pod="calico-system/goldmane-7c778bb748-sfvmc" Dec 16 13:11:25.182708 kubelet[3051]: I1216 13:11:25.182605 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/cfbcb130-7336-4c20-a673-9988ffd0b461-calico-apiserver-certs\") pod \"calico-apiserver-685f7c88c8-zm2j4\" (UID: \"cfbcb130-7336-4c20-a673-9988ffd0b461\") " pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" Dec 16 13:11:25.182708 kubelet[3051]: I1216 13:11:25.182628 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/107a6fd4-b5f5-4db7-901c-63c4db9f0b7f-whisker-ca-bundle\") pod \"whisker-78b6d648c5-tk6sx\" (UID: \"107a6fd4-b5f5-4db7-901c-63c4db9f0b7f\") " pod="calico-system/whisker-78b6d648c5-tk6sx" Dec 16 13:11:25.182852 kubelet[3051]: I1216 13:11:25.182644 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzfmb\" (UniqueName: \"kubernetes.io/projected/107a6fd4-b5f5-4db7-901c-63c4db9f0b7f-kube-api-access-zzfmb\") pod \"whisker-78b6d648c5-tk6sx\" (UID: \"107a6fd4-b5f5-4db7-901c-63c4db9f0b7f\") " pod="calico-system/whisker-78b6d648c5-tk6sx" Dec 16 13:11:25.182852 kubelet[3051]: I1216 13:11:25.182662 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fc441bf-8af0-4751-bda4-15b327d19263-config-volume\") pod \"coredns-66bc5c9577-2kbtr\" (UID: \"4fc441bf-8af0-4751-bda4-15b327d19263\") " pod="kube-system/coredns-66bc5c9577-2kbtr" Dec 16 13:11:25.182852 kubelet[3051]: I1216 13:11:25.182680 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/db81499e-70ee-42e1-9fb9-a69e20146fbb-goldmane-key-pair\") pod \"goldmane-7c778bb748-sfvmc\" (UID: \"db81499e-70ee-42e1-9fb9-a69e20146fbb\") " pod="calico-system/goldmane-7c778bb748-sfvmc" Dec 16 13:11:25.182852 kubelet[3051]: I1216 13:11:25.182702 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" 
(UniqueName: \"kubernetes.io/secret/107a6fd4-b5f5-4db7-901c-63c4db9f0b7f-whisker-backend-key-pair\") pod \"whisker-78b6d648c5-tk6sx\" (UID: \"107a6fd4-b5f5-4db7-901c-63c4db9f0b7f\") " pod="calico-system/whisker-78b6d648c5-tk6sx" Dec 16 13:11:25.182852 kubelet[3051]: I1216 13:11:25.182717 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvh59\" (UniqueName: \"kubernetes.io/projected/ad65a649-0849-4bfa-8148-c5f6ec6669c2-kube-api-access-bvh59\") pod \"coredns-66bc5c9577-7dkx7\" (UID: \"ad65a649-0849-4bfa-8148-c5f6ec6669c2\") " pod="kube-system/coredns-66bc5c9577-7dkx7" Dec 16 13:11:25.182989 kubelet[3051]: I1216 13:11:25.182729 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db81499e-70ee-42e1-9fb9-a69e20146fbb-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-sfvmc\" (UID: \"db81499e-70ee-42e1-9fb9-a69e20146fbb\") " pod="calico-system/goldmane-7c778bb748-sfvmc" Dec 16 13:11:25.449288 containerd[1770]: time="2025-12-16T13:11:25.449187818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-2kbtr,Uid:4fc441bf-8af0-4751-bda4-15b327d19263,Namespace:kube-system,Attempt:0,}" Dec 16 13:11:25.460119 containerd[1770]: time="2025-12-16T13:11:25.460079624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-685f7c88c8-zm2j4,Uid:cfbcb130-7336-4c20-a673-9988ffd0b461,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:11:25.464894 containerd[1770]: time="2025-12-16T13:11:25.464863146Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78b6d648c5-tk6sx,Uid:107a6fd4-b5f5-4db7-901c-63c4db9f0b7f,Namespace:calico-system,Attempt:0,}" Dec 16 13:11:25.471440 containerd[1770]: time="2025-12-16T13:11:25.471387910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-7dkx7,Uid:ad65a649-0849-4bfa-8148-c5f6ec6669c2,Namespace:kube-system,Attempt:0,}" Dec 16 13:11:25.477924 containerd[1770]: time="2025-12-16T13:11:25.477890794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d8969c677-gf2lj,Uid:f1946944-0457-47c9-82f5-a0302117750a,Namespace:calico-system,Attempt:0,}" Dec 16 13:11:25.485556 containerd[1770]: time="2025-12-16T13:11:25.485520472Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-685f7c88c8-9xxz2,Uid:7fc3c8ab-a97b-45a2-9167-06f605649e74,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:11:25.490784 containerd[1770]: time="2025-12-16T13:11:25.490609207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sfvmc,Uid:db81499e-70ee-42e1-9fb9-a69e20146fbb,Namespace:calico-system,Attempt:0,}" Dec 16 13:11:25.525553 containerd[1770]: time="2025-12-16T13:11:25.525507862Z" level=error msg="Failed to destroy network for sandbox \"efc6c1d256c4d3448026ede5b0f75eee6f17f1b826ea62633073f962169f7e9e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.527404 containerd[1770]: time="2025-12-16T13:11:25.527368212Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-2kbtr,Uid:4fc441bf-8af0-4751-bda4-15b327d19263,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"efc6c1d256c4d3448026ede5b0f75eee6f17f1b826ea62633073f962169f7e9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.527776 kubelet[3051]: E1216 13:11:25.527737 3051 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efc6c1d256c4d3448026ede5b0f75eee6f17f1b826ea62633073f962169f7e9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.528058 kubelet[3051]: E1216 13:11:25.527882 3051 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efc6c1d256c4d3448026ede5b0f75eee6f17f1b826ea62633073f962169f7e9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-2kbtr" Dec 16 13:11:25.528058 kubelet[3051]: E1216 13:11:25.527904 3051 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efc6c1d256c4d3448026ede5b0f75eee6f17f1b826ea62633073f962169f7e9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-2kbtr" Dec 16 13:11:25.528058 kubelet[3051]: E1216 13:11:25.527968 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-2kbtr_kube-system(4fc441bf-8af0-4751-bda4-15b327d19263)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-2kbtr_kube-system(4fc441bf-8af0-4751-bda4-15b327d19263)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"efc6c1d256c4d3448026ede5b0f75eee6f17f1b826ea62633073f962169f7e9e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-2kbtr" podUID="4fc441bf-8af0-4751-bda4-15b327d19263" Dec 16 13:11:25.537184 containerd[1770]: time="2025-12-16T13:11:25.537137922Z" level=error msg="Failed to destroy network for sandbox \"c958c9a1afdaf30c26aed1273b7c40cb146a24c5ac8c25f55ae6fb31cfe6eb3d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.539011 containerd[1770]: time="2025-12-16T13:11:25.538949668Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-685f7c88c8-zm2j4,Uid:cfbcb130-7336-4c20-a673-9988ffd0b461,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c958c9a1afdaf30c26aed1273b7c40cb146a24c5ac8c25f55ae6fb31cfe6eb3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.539305 containerd[1770]: time="2025-12-16T13:11:25.539132256Z" level=error msg="Failed 
to destroy network for sandbox \"67b40b80a88334fc6d934f1c0d59e7ee8dabcb49d542840b853f4532bc17637f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.539350 kubelet[3051]: E1216 13:11:25.539213 3051 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c958c9a1afdaf30c26aed1273b7c40cb146a24c5ac8c25f55ae6fb31cfe6eb3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.539350 kubelet[3051]: E1216 13:11:25.539264 3051 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c958c9a1afdaf30c26aed1273b7c40cb146a24c5ac8c25f55ae6fb31cfe6eb3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" Dec 16 13:11:25.539350 kubelet[3051]: E1216 13:11:25.539285 3051 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c958c9a1afdaf30c26aed1273b7c40cb146a24c5ac8c25f55ae6fb31cfe6eb3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" Dec 16 13:11:25.539521 kubelet[3051]: E1216 13:11:25.539496 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-685f7c88c8-zm2j4_calico-apiserver(cfbcb130-7336-4c20-a673-9988ffd0b461)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-685f7c88c8-zm2j4_calico-apiserver(cfbcb130-7336-4c20-a673-9988ffd0b461)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c958c9a1afdaf30c26aed1273b7c40cb146a24c5ac8c25f55ae6fb31cfe6eb3d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:11:25.540758 containerd[1770]: time="2025-12-16T13:11:25.540727833Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-7dkx7,Uid:ad65a649-0849-4bfa-8148-c5f6ec6669c2,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"67b40b80a88334fc6d934f1c0d59e7ee8dabcb49d542840b853f4532bc17637f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.540992 kubelet[3051]: E1216 13:11:25.540899 3051 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67b40b80a88334fc6d934f1c0d59e7ee8dabcb49d542840b853f4532bc17637f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Dec 16 13:11:25.540992 kubelet[3051]: E1216 13:11:25.540932 3051 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67b40b80a88334fc6d934f1c0d59e7ee8dabcb49d542840b853f4532bc17637f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-7dkx7" Dec 16 13:11:25.540992 kubelet[3051]: E1216 13:11:25.540947 3051 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67b40b80a88334fc6d934f1c0d59e7ee8dabcb49d542840b853f4532bc17637f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-7dkx7" Dec 16 13:11:25.541107 containerd[1770]: time="2025-12-16T13:11:25.541066459Z" level=error msg="Failed to destroy network for sandbox \"43a91616ac945ddc1b9e66ef493f857257ac022061c6fd0c1e34adf8c3ea823b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.541178 kubelet[3051]: E1216 13:11:25.540982 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-7dkx7_kube-system(ad65a649-0849-4bfa-8148-c5f6ec6669c2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-7dkx7_kube-system(ad65a649-0849-4bfa-8148-c5f6ec6669c2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"67b40b80a88334fc6d934f1c0d59e7ee8dabcb49d542840b853f4532bc17637f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-7dkx7" podUID="ad65a649-0849-4bfa-8148-c5f6ec6669c2" Dec 16 13:11:25.542849 containerd[1770]: time="2025-12-16T13:11:25.542818477Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78b6d648c5-tk6sx,Uid:107a6fd4-b5f5-4db7-901c-63c4db9f0b7f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"43a91616ac945ddc1b9e66ef493f857257ac022061c6fd0c1e34adf8c3ea823b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.543387 kubelet[3051]: E1216 13:11:25.543270 3051 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43a91616ac945ddc1b9e66ef493f857257ac022061c6fd0c1e34adf8c3ea823b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.543387 kubelet[3051]: E1216 13:11:25.543331 3051 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43a91616ac945ddc1b9e66ef493f857257ac022061c6fd0c1e34adf8c3ea823b\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-78b6d648c5-tk6sx" Dec 16 13:11:25.543387 kubelet[3051]: E1216 13:11:25.543350 3051 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43a91616ac945ddc1b9e66ef493f857257ac022061c6fd0c1e34adf8c3ea823b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-78b6d648c5-tk6sx" Dec 16 13:11:25.543490 kubelet[3051]: E1216 13:11:25.543402 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-78b6d648c5-tk6sx_calico-system(107a6fd4-b5f5-4db7-901c-63c4db9f0b7f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-78b6d648c5-tk6sx_calico-system(107a6fd4-b5f5-4db7-901c-63c4db9f0b7f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"43a91616ac945ddc1b9e66ef493f857257ac022061c6fd0c1e34adf8c3ea823b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-78b6d648c5-tk6sx" podUID="107a6fd4-b5f5-4db7-901c-63c4db9f0b7f" Dec 16 13:11:25.559025 containerd[1770]: time="2025-12-16T13:11:25.558964120Z" level=error msg="Failed to destroy network for sandbox \"2dc1ec825fbb80840a8da0067880e75074ddfb6b94bd17025a624d3dbb6f4939\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.559151 containerd[1770]: time="2025-12-16T13:11:25.558999435Z" level=error msg="Failed to destroy network for sandbox \"5284fa9a159918e41b14a183a0b67e201def2b8fde139a16067605262f2e2164\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.559755 containerd[1770]: time="2025-12-16T13:11:25.559725698Z" level=error msg="Failed to destroy network for sandbox \"81808d58e4707a7106a63270c642276cd05d78f2467f7f486a5684f254eaa17f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.561675 containerd[1770]: time="2025-12-16T13:11:25.561621081Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sfvmc,Uid:db81499e-70ee-42e1-9fb9-a69e20146fbb,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dc1ec825fbb80840a8da0067880e75074ddfb6b94bd17025a624d3dbb6f4939\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.561970 kubelet[3051]: E1216 13:11:25.561930 3051 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dc1ec825fbb80840a8da0067880e75074ddfb6b94bd17025a624d3dbb6f4939\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.562066 kubelet[3051]: E1216 13:11:25.562053 3051 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dc1ec825fbb80840a8da0067880e75074ddfb6b94bd17025a624d3dbb6f4939\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-sfvmc" Dec 16 13:11:25.562165 kubelet[3051]: E1216 13:11:25.562115 3051 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dc1ec825fbb80840a8da0067880e75074ddfb6b94bd17025a624d3dbb6f4939\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-sfvmc" Dec 16 13:11:25.562250 kubelet[3051]: E1216 13:11:25.562233 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-sfvmc_calico-system(db81499e-70ee-42e1-9fb9-a69e20146fbb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-sfvmc_calico-system(db81499e-70ee-42e1-9fb9-a69e20146fbb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2dc1ec825fbb80840a8da0067880e75074ddfb6b94bd17025a624d3dbb6f4939\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:11:25.563081 containerd[1770]: time="2025-12-16T13:11:25.563043464Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d8969c677-gf2lj,Uid:f1946944-0457-47c9-82f5-a0302117750a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5284fa9a159918e41b14a183a0b67e201def2b8fde139a16067605262f2e2164\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.563226 kubelet[3051]: E1216 13:11:25.563195 3051 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5284fa9a159918e41b14a183a0b67e201def2b8fde139a16067605262f2e2164\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.563275 kubelet[3051]: E1216 13:11:25.563233 3051 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5284fa9a159918e41b14a183a0b67e201def2b8fde139a16067605262f2e2164\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" Dec 16 13:11:25.563275 kubelet[3051]: E1216 13:11:25.563248 3051 
kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5284fa9a159918e41b14a183a0b67e201def2b8fde139a16067605262f2e2164\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" Dec 16 13:11:25.563367 kubelet[3051]: E1216 13:11:25.563285 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5d8969c677-gf2lj_calico-system(f1946944-0457-47c9-82f5-a0302117750a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5d8969c677-gf2lj_calico-system(f1946944-0457-47c9-82f5-a0302117750a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5284fa9a159918e41b14a183a0b67e201def2b8fde139a16067605262f2e2164\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:11:25.564494 containerd[1770]: time="2025-12-16T13:11:25.564427605Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-685f7c88c8-9xxz2,Uid:7fc3c8ab-a97b-45a2-9167-06f605649e74,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"81808d58e4707a7106a63270c642276cd05d78f2467f7f486a5684f254eaa17f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.564633 kubelet[3051]: E1216 13:11:25.564606 3051 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81808d58e4707a7106a63270c642276cd05d78f2467f7f486a5684f254eaa17f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.564663 kubelet[3051]: E1216 13:11:25.564638 3051 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81808d58e4707a7106a63270c642276cd05d78f2467f7f486a5684f254eaa17f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" Dec 16 13:11:25.564663 kubelet[3051]: E1216 13:11:25.564652 3051 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81808d58e4707a7106a63270c642276cd05d78f2467f7f486a5684f254eaa17f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" Dec 16 13:11:25.564705 kubelet[3051]: E1216 13:11:25.564688 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-685f7c88c8-9xxz2_calico-apiserver(7fc3c8ab-a97b-45a2-9167-06f605649e74)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-685f7c88c8-9xxz2_calico-apiserver(7fc3c8ab-a97b-45a2-9167-06f605649e74)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"81808d58e4707a7106a63270c642276cd05d78f2467f7f486a5684f254eaa17f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:11:25.808797 systemd[1]: Created slice kubepods-besteffort-pod49b26efe_3f65_4cf1_9ebd_006f8e8a22f9.slice - libcontainer container kubepods-besteffort-pod49b26efe_3f65_4cf1_9ebd_006f8e8a22f9.slice. Dec 16 13:11:25.818001 containerd[1770]: time="2025-12-16T13:11:25.817755762Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mvzk2,Uid:49b26efe-3f65-4cf1-9ebd-006f8e8a22f9,Namespace:calico-system,Attempt:0,}" Dec 16 13:11:25.881955 containerd[1770]: time="2025-12-16T13:11:25.881903874Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 13:11:25.894495 containerd[1770]: time="2025-12-16T13:11:25.894421756Z" level=error msg="Failed to destroy network for sandbox \"2d370bd44a2eebb943796823241d4c22bde57540eedcc056f4b993ad74779521\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.896667 containerd[1770]: time="2025-12-16T13:11:25.896621101Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mvzk2,Uid:49b26efe-3f65-4cf1-9ebd-006f8e8a22f9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d370bd44a2eebb943796823241d4c22bde57540eedcc056f4b993ad74779521\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.896893 kubelet[3051]: E1216 13:11:25.896849 3051 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d370bd44a2eebb943796823241d4c22bde57540eedcc056f4b993ad74779521\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:11:25.896940 kubelet[3051]: E1216 13:11:25.896907 3051 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d370bd44a2eebb943796823241d4c22bde57540eedcc056f4b993ad74779521\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mvzk2" Dec 16 13:11:25.896966 kubelet[3051]: E1216 13:11:25.896930 3051 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d370bd44a2eebb943796823241d4c22bde57540eedcc056f4b993ad74779521\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mvzk2" Dec 16 13:11:25.897031 kubelet[3051]: E1216 13:11:25.897004 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2d370bd44a2eebb943796823241d4c22bde57540eedcc056f4b993ad74779521\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:11:26.470911 systemd[1]: run-netns-cni\x2d9cd6076c\x2d2d1f\x2db205\x2de085\x2d76ca0a64761f.mount: Deactivated successfully. Dec 16 13:11:26.471004 systemd[1]: run-netns-cni\x2ddf81454f\x2d57e1\x2dbb0d\x2d690f\x2d3f96e5f80b35.mount: Deactivated successfully. Dec 16 13:11:26.471053 systemd[1]: run-netns-cni\x2d04e6cce5\x2dcdb5\x2dc654\x2d4d86\x2d9bf34ff8d5af.mount: Deactivated successfully. Dec 16 13:11:26.471098 systemd[1]: run-netns-cni\x2d99437f4b\x2d1a00\x2d9d84\x2d1936\x2d94ec4c787dde.mount: Deactivated successfully. Dec 16 13:11:26.471146 systemd[1]: run-netns-cni\x2d31956479\x2d532c\x2dad69\x2d6cd1\x2d9aa9496ebdee.mount: Deactivated successfully. Dec 16 13:11:26.471192 systemd[1]: run-netns-cni\x2d2c49f521\x2dd92c\x2de279\x2d28c3\x2dff49c91aea09.mount: Deactivated successfully. Dec 16 13:11:26.471235 systemd[1]: run-netns-cni\x2d8086f57e\x2da877\x2d4002\x2d274b\x2d2f9c0909909b.mount: Deactivated successfully. Dec 16 13:11:33.791128 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount226662100.mount: Deactivated successfully. 
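Every RunPodSandbox failure from 13:11:25 above shares one root cause: the Calico CNI plugin learns which Calico node it is running on by reading /var/lib/calico/nodename, and that file is written by calico-node when it starts. At 13:11:25 only the flexvol and install-cni init containers have run; the calico/node image is still being pulled (it starts at 13:11:33 below), so every sandbox add and delete for coredns, the calico-apiserver pods, goldmane, whisker, kube-controllers and the CSI node driver fails with the same stat error, and systemd then cleans up the leaked per-sandbox network namespaces (the run-netns-cni-*.mount units just above). A minimal sketch of that lookup, not Calico's actual code:

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

const nodenameFile = "/var/lib/calico/nodename"

// calicoNodename reads the node name that calico-node writes on startup.
// Until calico-node has run at least once on the host, the file does not
// exist and every CNI add/delete fails in the way the log above repeats.
func calicoNodename() (string, error) {
	data, err := os.ReadFile(nodenameFile)
	if err != nil {
		return "", fmt.Errorf("%w: check that the calico/node container is running and has mounted /var/lib/calico/", err)
	}
	return strings.TrimSpace(string(data)), nil
}

func main() {
	name, err := calicoNodename()
	fmt.Println(name, err)
}
```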
Dec 16 13:11:33.816499 containerd[1770]: time="2025-12-16T13:11:33.816439179Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:11:33.819482 containerd[1770]: time="2025-12-16T13:11:33.819452252Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Dec 16 13:11:33.821313 containerd[1770]: time="2025-12-16T13:11:33.821229427Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:11:33.824164 containerd[1770]: time="2025-12-16T13:11:33.824131715Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:11:33.824796 containerd[1770]: time="2025-12-16T13:11:33.824769968Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 7.942815715s" Dec 16 13:11:33.824830 containerd[1770]: time="2025-12-16T13:11:33.824801792Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Dec 16 13:11:33.846077 containerd[1770]: time="2025-12-16T13:11:33.846011610Z" level=info msg="CreateContainer within sandbox \"c292fe828bcbb00615c372a1fe5594d8a9193821251933e77e29d90715d44e84\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 13:11:33.862372 containerd[1770]: time="2025-12-16T13:11:33.861708545Z" level=info msg="Container e1b4e8fed3ab448a4c1f59af45f7f79fe5dbff3ff60b7cc6f765698e380bdd08: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:11:33.877232 containerd[1770]: time="2025-12-16T13:11:33.877165807Z" level=info msg="CreateContainer within sandbox \"c292fe828bcbb00615c372a1fe5594d8a9193821251933e77e29d90715d44e84\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"e1b4e8fed3ab448a4c1f59af45f7f79fe5dbff3ff60b7cc6f765698e380bdd08\"" Dec 16 13:11:33.877864 containerd[1770]: time="2025-12-16T13:11:33.877803669Z" level=info msg="StartContainer for \"e1b4e8fed3ab448a4c1f59af45f7f79fe5dbff3ff60b7cc6f765698e380bdd08\"" Dec 16 13:11:33.879211 containerd[1770]: time="2025-12-16T13:11:33.879137321Z" level=info msg="connecting to shim e1b4e8fed3ab448a4c1f59af45f7f79fe5dbff3ff60b7cc6f765698e380bdd08" address="unix:///run/containerd/s/129ab271380e0a1cb8df0a0a6c1868897f084e5d865bcbf56f669ac4229be9dd" protocol=ttrpc version=3 Dec 16 13:11:33.905605 systemd[1]: Started cri-containerd-e1b4e8fed3ab448a4c1f59af45f7f79fe5dbff3ff60b7cc6f765698e380bdd08.scope - libcontainer container e1b4e8fed3ab448a4c1f59af45f7f79fe5dbff3ff60b7cc6f765698e380bdd08. Dec 16 13:11:34.003126 containerd[1770]: time="2025-12-16T13:11:34.003073034Z" level=info msg="StartContainer for \"e1b4e8fed3ab448a4c1f59af45f7f79fe5dbff3ff60b7cc6f765698e380bdd08\" returns successfully" Dec 16 13:11:34.101175 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 16 13:11:34.101386 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Dec 16 13:11:34.243511 kubelet[3051]: I1216 13:11:34.243475 3051 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzfmb\" (UniqueName: \"kubernetes.io/projected/107a6fd4-b5f5-4db7-901c-63c4db9f0b7f-kube-api-access-zzfmb\") pod \"107a6fd4-b5f5-4db7-901c-63c4db9f0b7f\" (UID: \"107a6fd4-b5f5-4db7-901c-63c4db9f0b7f\") " Dec 16 13:11:34.244261 kubelet[3051]: I1216 13:11:34.243537 3051 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/107a6fd4-b5f5-4db7-901c-63c4db9f0b7f-whisker-backend-key-pair\") pod \"107a6fd4-b5f5-4db7-901c-63c4db9f0b7f\" (UID: \"107a6fd4-b5f5-4db7-901c-63c4db9f0b7f\") " Dec 16 13:11:34.244261 kubelet[3051]: I1216 13:11:34.243561 3051 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/107a6fd4-b5f5-4db7-901c-63c4db9f0b7f-whisker-ca-bundle\") pod \"107a6fd4-b5f5-4db7-901c-63c4db9f0b7f\" (UID: \"107a6fd4-b5f5-4db7-901c-63c4db9f0b7f\") " Dec 16 13:11:34.244261 kubelet[3051]: I1216 13:11:34.243890 3051 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107a6fd4-b5f5-4db7-901c-63c4db9f0b7f-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "107a6fd4-b5f5-4db7-901c-63c4db9f0b7f" (UID: "107a6fd4-b5f5-4db7-901c-63c4db9f0b7f"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 13:11:34.246409 kubelet[3051]: I1216 13:11:34.246365 3051 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107a6fd4-b5f5-4db7-901c-63c4db9f0b7f-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "107a6fd4-b5f5-4db7-901c-63c4db9f0b7f" (UID: "107a6fd4-b5f5-4db7-901c-63c4db9f0b7f"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 13:11:34.246660 kubelet[3051]: I1216 13:11:34.246627 3051 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107a6fd4-b5f5-4db7-901c-63c4db9f0b7f-kube-api-access-zzfmb" (OuterVolumeSpecName: "kube-api-access-zzfmb") pod "107a6fd4-b5f5-4db7-901c-63c4db9f0b7f" (UID: "107a6fd4-b5f5-4db7-901c-63c4db9f0b7f"). InnerVolumeSpecName "kube-api-access-zzfmb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 13:11:34.294497 kubelet[3051]: I1216 13:11:34.294293 3051 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:11:34.344560 kubelet[3051]: I1216 13:11:34.344507 3051 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/107a6fd4-b5f5-4db7-901c-63c4db9f0b7f-whisker-backend-key-pair\") on node \"ci-4459-2-2-9-79ca1ea2c9\" DevicePath \"\"" Dec 16 13:11:34.344560 kubelet[3051]: I1216 13:11:34.344547 3051 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/107a6fd4-b5f5-4db7-901c-63c4db9f0b7f-whisker-ca-bundle\") on node \"ci-4459-2-2-9-79ca1ea2c9\" DevicePath \"\"" Dec 16 13:11:34.344560 kubelet[3051]: I1216 13:11:34.344556 3051 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zzfmb\" (UniqueName: \"kubernetes.io/projected/107a6fd4-b5f5-4db7-901c-63c4db9f0b7f-kube-api-access-zzfmb\") on node \"ci-4459-2-2-9-79ca1ea2c9\" DevicePath \"\"" Dec 16 13:11:34.794524 systemd[1]: var-lib-kubelet-pods-107a6fd4\x2db5f5\x2d4db7\x2d901c\x2d63c4db9f0b7f-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 16 13:11:34.794706 systemd[1]: var-lib-kubelet-pods-107a6fd4\x2db5f5\x2d4db7\x2d901c\x2d63c4db9f0b7f-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzzfmb.mount: Deactivated successfully. Dec 16 13:11:34.805855 systemd[1]: Removed slice kubepods-besteffort-pod107a6fd4_b5f5_4db7_901c_63c4db9f0b7f.slice - libcontainer container kubepods-besteffort-pod107a6fd4_b5f5_4db7_901c_63c4db9f0b7f.slice. Dec 16 13:11:34.940996 kubelet[3051]: I1216 13:11:34.940888 3051 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-x8b4q" podStartSLOduration=2.444843043 podStartE2EDuration="19.940856392s" podCreationTimestamp="2025-12-16 13:11:15 +0000 UTC" firstStartedPulling="2025-12-16 13:11:16.329546412 +0000 UTC m=+19.642168954" lastFinishedPulling="2025-12-16 13:11:33.825559759 +0000 UTC m=+37.138182303" observedRunningTime="2025-12-16 13:11:34.940242105 +0000 UTC m=+38.252864711" watchObservedRunningTime="2025-12-16 13:11:34.940856392 +0000 UTC m=+38.253479057" Dec 16 13:11:35.007581 systemd[1]: Created slice kubepods-besteffort-podbe9b0dbc_c111_4079_904c_bc275a2adf0f.slice - libcontainer container kubepods-besteffort-podbe9b0dbc_c111_4079_904c_bc275a2adf0f.slice. 
Dec 16 13:11:35.048990 kubelet[3051]: I1216 13:11:35.048853 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j27gf\" (UniqueName: \"kubernetes.io/projected/be9b0dbc-c111-4079-904c-bc275a2adf0f-kube-api-access-j27gf\") pod \"whisker-f47d785c5-qmfjz\" (UID: \"be9b0dbc-c111-4079-904c-bc275a2adf0f\") " pod="calico-system/whisker-f47d785c5-qmfjz" Dec 16 13:11:35.048990 kubelet[3051]: I1216 13:11:35.048897 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/be9b0dbc-c111-4079-904c-bc275a2adf0f-whisker-backend-key-pair\") pod \"whisker-f47d785c5-qmfjz\" (UID: \"be9b0dbc-c111-4079-904c-bc275a2adf0f\") " pod="calico-system/whisker-f47d785c5-qmfjz" Dec 16 13:11:35.048990 kubelet[3051]: I1216 13:11:35.048938 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be9b0dbc-c111-4079-904c-bc275a2adf0f-whisker-ca-bundle\") pod \"whisker-f47d785c5-qmfjz\" (UID: \"be9b0dbc-c111-4079-904c-bc275a2adf0f\") " pod="calico-system/whisker-f47d785c5-qmfjz" Dec 16 13:11:35.313967 containerd[1770]: time="2025-12-16T13:11:35.313847141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f47d785c5-qmfjz,Uid:be9b0dbc-c111-4079-904c-bc275a2adf0f,Namespace:calico-system,Attempt:0,}" Dec 16 13:11:35.447001 systemd-networkd[1674]: cali3db55bf7f08: Link UP Dec 16 13:11:35.447236 systemd-networkd[1674]: cali3db55bf7f08: Gained carrier Dec 16 13:11:35.459335 containerd[1770]: 2025-12-16 13:11:35.347 [INFO][4304] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 13:11:35.459335 containerd[1770]: 2025-12-16 13:11:35.362 [INFO][4304] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-eth0 whisker-f47d785c5- calico-system be9b0dbc-c111-4079-904c-bc275a2adf0f 877 0 2025-12-16 13:11:34 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:f47d785c5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-9-79ca1ea2c9 whisker-f47d785c5-qmfjz eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali3db55bf7f08 [] [] }} ContainerID="8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" Namespace="calico-system" Pod="whisker-f47d785c5-qmfjz" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-" Dec 16 13:11:35.459335 containerd[1770]: 2025-12-16 13:11:35.363 [INFO][4304] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" Namespace="calico-system" Pod="whisker-f47d785c5-qmfjz" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-eth0" Dec 16 13:11:35.459335 containerd[1770]: 2025-12-16 13:11:35.396 [INFO][4426] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" HandleID="k8s-pod-network.8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-eth0" Dec 16 13:11:35.459550 containerd[1770]: 2025-12-16 13:11:35.396 [INFO][4426] ipam/ipam_plugin.go 275: Auto 
assigning IP ContainerID="8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" HandleID="k8s-pod-network.8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f510), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-9-79ca1ea2c9", "pod":"whisker-f47d785c5-qmfjz", "timestamp":"2025-12-16 13:11:35.396146469 +0000 UTC"}, Hostname:"ci-4459-2-2-9-79ca1ea2c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:11:35.459550 containerd[1770]: 2025-12-16 13:11:35.396 [INFO][4426] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:11:35.459550 containerd[1770]: 2025-12-16 13:11:35.396 [INFO][4426] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 13:11:35.459550 containerd[1770]: 2025-12-16 13:11:35.396 [INFO][4426] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-9-79ca1ea2c9' Dec 16 13:11:35.459550 containerd[1770]: 2025-12-16 13:11:35.407 [INFO][4426] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.459550 containerd[1770]: 2025-12-16 13:11:35.412 [INFO][4426] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.459550 containerd[1770]: 2025-12-16 13:11:35.418 [INFO][4426] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.459550 containerd[1770]: 2025-12-16 13:11:35.419 [INFO][4426] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.459550 containerd[1770]: 2025-12-16 13:11:35.421 [INFO][4426] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.459757 containerd[1770]: 2025-12-16 13:11:35.421 [INFO][4426] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.459757 containerd[1770]: 2025-12-16 13:11:35.423 [INFO][4426] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4 Dec 16 13:11:35.459757 containerd[1770]: 2025-12-16 13:11:35.431 [INFO][4426] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.459757 containerd[1770]: 2025-12-16 13:11:35.437 [INFO][4426] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.114.193/26] block=192.168.114.192/26 handle="k8s-pod-network.8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.459757 containerd[1770]: 2025-12-16 13:11:35.437 [INFO][4426] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.193/26] handle="k8s-pod-network.8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.459757 containerd[1770]: 2025-12-16 
13:11:35.437 [INFO][4426] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 13:11:35.459757 containerd[1770]: 2025-12-16 13:11:35.437 [INFO][4426] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.193/26] IPv6=[] ContainerID="8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" HandleID="k8s-pod-network.8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-eth0" Dec 16 13:11:35.459888 containerd[1770]: 2025-12-16 13:11:35.441 [INFO][4304] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" Namespace="calico-system" Pod="whisker-f47d785c5-qmfjz" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-eth0", GenerateName:"whisker-f47d785c5-", Namespace:"calico-system", SelfLink:"", UID:"be9b0dbc-c111-4079-904c-bc275a2adf0f", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"f47d785c5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"", Pod:"whisker-f47d785c5-qmfjz", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.114.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3db55bf7f08", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:35.459888 containerd[1770]: 2025-12-16 13:11:35.441 [INFO][4304] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.193/32] ContainerID="8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" Namespace="calico-system" Pod="whisker-f47d785c5-qmfjz" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-eth0" Dec 16 13:11:35.459959 containerd[1770]: 2025-12-16 13:11:35.441 [INFO][4304] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3db55bf7f08 ContainerID="8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" Namespace="calico-system" Pod="whisker-f47d785c5-qmfjz" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-eth0" Dec 16 13:11:35.459959 containerd[1770]: 2025-12-16 13:11:35.447 [INFO][4304] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" Namespace="calico-system" Pod="whisker-f47d785c5-qmfjz" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-eth0" Dec 16 13:11:35.459996 containerd[1770]: 2025-12-16 13:11:35.447 [INFO][4304] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" Namespace="calico-system" Pod="whisker-f47d785c5-qmfjz" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-eth0", GenerateName:"whisker-f47d785c5-", Namespace:"calico-system", SelfLink:"", UID:"be9b0dbc-c111-4079-904c-bc275a2adf0f", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"f47d785c5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4", Pod:"whisker-f47d785c5-qmfjz", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.114.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3db55bf7f08", MAC:"1a:50:b3:f8:35:de", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:35.460046 containerd[1770]: 2025-12-16 13:11:35.456 [INFO][4304] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" Namespace="calico-system" Pod="whisker-f47d785c5-qmfjz" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-whisker--f47d785c5--qmfjz-eth0" Dec 16 13:11:35.492204 containerd[1770]: time="2025-12-16T13:11:35.492156491Z" level=info msg="connecting to shim 8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4" address="unix:///run/containerd/s/86aafd0d5b9b4ed863b72314750e9447dab50d681fbe10122dd2472ad150c71b" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:11:35.512524 systemd[1]: Started cri-containerd-8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4.scope - libcontainer container 8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4. 
Dec 16 13:11:35.559856 containerd[1770]: time="2025-12-16T13:11:35.559803687Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f47d785c5-qmfjz,Uid:be9b0dbc-c111-4079-904c-bc275a2adf0f,Namespace:calico-system,Attempt:0,} returns sandbox id \"8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4\"" Dec 16 13:11:35.561506 containerd[1770]: time="2025-12-16T13:11:35.561454821Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:11:35.687084 systemd-networkd[1674]: vxlan.calico: Link UP Dec 16 13:11:35.687094 systemd-networkd[1674]: vxlan.calico: Gained carrier Dec 16 13:11:35.799435 containerd[1770]: time="2025-12-16T13:11:35.799375173Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-2kbtr,Uid:4fc441bf-8af0-4751-bda4-15b327d19263,Namespace:kube-system,Attempt:0,}" Dec 16 13:11:35.904623 systemd-networkd[1674]: calic9d69ebaeb2: Link UP Dec 16 13:11:35.905024 systemd-networkd[1674]: calic9d69ebaeb2: Gained carrier Dec 16 13:11:35.906513 containerd[1770]: time="2025-12-16T13:11:35.906477274Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:35.908343 containerd[1770]: time="2025-12-16T13:11:35.908269715Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:11:35.908343 containerd[1770]: time="2025-12-16T13:11:35.908316072Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:11:35.908560 kubelet[3051]: E1216 13:11:35.908521 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:11:35.909210 kubelet[3051]: E1216 13:11:35.908581 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:11:35.909210 kubelet[3051]: E1216 13:11:35.908663 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-f47d785c5-qmfjz_calico-system(be9b0dbc-c111-4079-904c-bc275a2adf0f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:35.909773 containerd[1770]: time="2025-12-16T13:11:35.909743150Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:11:35.915052 kubelet[3051]: I1216 13:11:35.915029 3051 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:11:35.921205 containerd[1770]: 2025-12-16 13:11:35.839 [INFO][4580] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-eth0 coredns-66bc5c9577- kube-system 4fc441bf-8af0-4751-bda4-15b327d19263 797 0 2025-12-16 13:11:03 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-9-79ca1ea2c9 coredns-66bc5c9577-2kbtr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic9d69ebaeb2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" Namespace="kube-system" Pod="coredns-66bc5c9577-2kbtr" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-" Dec 16 13:11:35.921205 containerd[1770]: 2025-12-16 13:11:35.839 [INFO][4580] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" Namespace="kube-system" Pod="coredns-66bc5c9577-2kbtr" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-eth0" Dec 16 13:11:35.921205 containerd[1770]: 2025-12-16 13:11:35.862 [INFO][4596] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" HandleID="k8s-pod-network.b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-eth0" Dec 16 13:11:35.921419 containerd[1770]: 2025-12-16 13:11:35.862 [INFO][4596] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" HandleID="k8s-pod-network.b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0005a89b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-9-79ca1ea2c9", "pod":"coredns-66bc5c9577-2kbtr", "timestamp":"2025-12-16 13:11:35.862200606 +0000 UTC"}, Hostname:"ci-4459-2-2-9-79ca1ea2c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:11:35.921419 containerd[1770]: 2025-12-16 13:11:35.862 [INFO][4596] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:11:35.921419 containerd[1770]: 2025-12-16 13:11:35.863 [INFO][4596] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:11:35.921419 containerd[1770]: 2025-12-16 13:11:35.863 [INFO][4596] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-9-79ca1ea2c9' Dec 16 13:11:35.921419 containerd[1770]: 2025-12-16 13:11:35.872 [INFO][4596] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.921419 containerd[1770]: 2025-12-16 13:11:35.878 [INFO][4596] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.921419 containerd[1770]: 2025-12-16 13:11:35.883 [INFO][4596] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.921419 containerd[1770]: 2025-12-16 13:11:35.885 [INFO][4596] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.921419 containerd[1770]: 2025-12-16 13:11:35.887 [INFO][4596] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.921605 containerd[1770]: 2025-12-16 13:11:35.887 [INFO][4596] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.921605 containerd[1770]: 2025-12-16 13:11:35.889 [INFO][4596] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672 Dec 16 13:11:35.921605 containerd[1770]: 2025-12-16 13:11:35.894 [INFO][4596] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.921605 containerd[1770]: 2025-12-16 13:11:35.900 [INFO][4596] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.114.194/26] block=192.168.114.192/26 handle="k8s-pod-network.b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.921605 containerd[1770]: 2025-12-16 13:11:35.900 [INFO][4596] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.194/26] handle="k8s-pod-network.b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:35.921605 containerd[1770]: 2025-12-16 13:11:35.900 [INFO][4596] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:11:35.921605 containerd[1770]: 2025-12-16 13:11:35.900 [INFO][4596] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.194/26] IPv6=[] ContainerID="b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" HandleID="k8s-pod-network.b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-eth0" Dec 16 13:11:35.921736 containerd[1770]: 2025-12-16 13:11:35.902 [INFO][4580] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" Namespace="kube-system" Pod="coredns-66bc5c9577-2kbtr" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"4fc441bf-8af0-4751-bda4-15b327d19263", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"", Pod:"coredns-66bc5c9577-2kbtr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.114.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic9d69ebaeb2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:35.921736 containerd[1770]: 2025-12-16 13:11:35.902 [INFO][4580] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.194/32] ContainerID="b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" Namespace="kube-system" Pod="coredns-66bc5c9577-2kbtr" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-eth0" Dec 16 13:11:35.921736 containerd[1770]: 2025-12-16 13:11:35.902 [INFO][4580] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic9d69ebaeb2 ContainerID="b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" Namespace="kube-system" Pod="coredns-66bc5c9577-2kbtr" 
WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-eth0" Dec 16 13:11:35.921736 containerd[1770]: 2025-12-16 13:11:35.905 [INFO][4580] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" Namespace="kube-system" Pod="coredns-66bc5c9577-2kbtr" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-eth0" Dec 16 13:11:35.921736 containerd[1770]: 2025-12-16 13:11:35.906 [INFO][4580] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" Namespace="kube-system" Pod="coredns-66bc5c9577-2kbtr" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"4fc441bf-8af0-4751-bda4-15b327d19263", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672", Pod:"coredns-66bc5c9577-2kbtr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.114.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic9d69ebaeb2", MAC:"c6:40:b5:8d:96:81", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:35.921924 containerd[1770]: 2025-12-16 13:11:35.919 [INFO][4580] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" Namespace="kube-system" Pod="coredns-66bc5c9577-2kbtr" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--2kbtr-eth0" Dec 16 13:11:35.949062 containerd[1770]: time="2025-12-16T13:11:35.948961831Z" level=info msg="connecting to shim b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672" 
address="unix:///run/containerd/s/c69f98f4c2ea979ffd1851fca0ae5438dfe1e05841128d323d69f4db4c93c354" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:11:35.979530 systemd[1]: Started cri-containerd-b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672.scope - libcontainer container b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672. Dec 16 13:11:36.022030 containerd[1770]: time="2025-12-16T13:11:36.021995382Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-2kbtr,Uid:4fc441bf-8af0-4751-bda4-15b327d19263,Namespace:kube-system,Attempt:0,} returns sandbox id \"b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672\"" Dec 16 13:11:36.029087 containerd[1770]: time="2025-12-16T13:11:36.029049714Z" level=info msg="CreateContainer within sandbox \"b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 13:11:36.045029 containerd[1770]: time="2025-12-16T13:11:36.044504021Z" level=info msg="Container ec9b3f66485c048d43db19e024f11f058e1b4501d341981a395798879af6caf7: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:11:36.052428 containerd[1770]: time="2025-12-16T13:11:36.052388937Z" level=info msg="CreateContainer within sandbox \"b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ec9b3f66485c048d43db19e024f11f058e1b4501d341981a395798879af6caf7\"" Dec 16 13:11:36.053612 containerd[1770]: time="2025-12-16T13:11:36.053086213Z" level=info msg="StartContainer for \"ec9b3f66485c048d43db19e024f11f058e1b4501d341981a395798879af6caf7\"" Dec 16 13:11:36.054216 containerd[1770]: time="2025-12-16T13:11:36.054178002Z" level=info msg="connecting to shim ec9b3f66485c048d43db19e024f11f058e1b4501d341981a395798879af6caf7" address="unix:///run/containerd/s/c69f98f4c2ea979ffd1851fca0ae5438dfe1e05841128d323d69f4db4c93c354" protocol=ttrpc version=3 Dec 16 13:11:36.075532 systemd[1]: Started cri-containerd-ec9b3f66485c048d43db19e024f11f058e1b4501d341981a395798879af6caf7.scope - libcontainer container ec9b3f66485c048d43db19e024f11f058e1b4501d341981a395798879af6caf7. 
Dec 16 13:11:36.104713 containerd[1770]: time="2025-12-16T13:11:36.104674274Z" level=info msg="StartContainer for \"ec9b3f66485c048d43db19e024f11f058e1b4501d341981a395798879af6caf7\" returns successfully" Dec 16 13:11:36.244893 containerd[1770]: time="2025-12-16T13:11:36.244608439Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:36.246713 containerd[1770]: time="2025-12-16T13:11:36.246623491Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:11:36.246868 containerd[1770]: time="2025-12-16T13:11:36.246712348Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:11:36.246998 kubelet[3051]: E1216 13:11:36.246935 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:11:36.247106 kubelet[3051]: E1216 13:11:36.247006 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:11:36.247184 kubelet[3051]: E1216 13:11:36.247113 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-f47d785c5-qmfjz_calico-system(be9b0dbc-c111-4079-904c-bc275a2adf0f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:36.247269 kubelet[3051]: E1216 13:11:36.247174 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:11:36.799598 kubelet[3051]: I1216 13:11:36.799545 3051 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107a6fd4-b5f5-4db7-901c-63c4db9f0b7f" path="/var/lib/kubelet/pods/107a6fd4-b5f5-4db7-901c-63c4db9f0b7f/volumes" Dec 16 13:11:36.801255 containerd[1770]: 
time="2025-12-16T13:11:36.801201456Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-685f7c88c8-zm2j4,Uid:cfbcb130-7336-4c20-a673-9988ffd0b461,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:11:36.872846 systemd-networkd[1674]: vxlan.calico: Gained IPv6LL Dec 16 13:11:36.899718 systemd-networkd[1674]: cali253eca3e4b4: Link UP Dec 16 13:11:36.900376 systemd-networkd[1674]: cali253eca3e4b4: Gained carrier Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.836 [INFO][4736] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-eth0 calico-apiserver-685f7c88c8- calico-apiserver cfbcb130-7336-4c20-a673-9988ffd0b461 798 0 2025-12-16 13:11:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:685f7c88c8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-9-79ca1ea2c9 calico-apiserver-685f7c88c8-zm2j4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali253eca3e4b4 [] [] }} ContainerID="0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-zm2j4" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-" Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.836 [INFO][4736] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-zm2j4" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-eth0" Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.861 [INFO][4755] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" HandleID="k8s-pod-network.0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-eth0" Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.861 [INFO][4755] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" HandleID="k8s-pod-network.0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000184af0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-9-79ca1ea2c9", "pod":"calico-apiserver-685f7c88c8-zm2j4", "timestamp":"2025-12-16 13:11:36.861284211 +0000 UTC"}, Hostname:"ci-4459-2-2-9-79ca1ea2c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.861 [INFO][4755] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.861 [INFO][4755] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.861 [INFO][4755] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-9-79ca1ea2c9' Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.869 [INFO][4755] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.874 [INFO][4755] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.878 [INFO][4755] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.880 [INFO][4755] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.882 [INFO][4755] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.882 [INFO][4755] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.884 [INFO][4755] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5 Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.888 [INFO][4755] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.895 [INFO][4755] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.114.195/26] block=192.168.114.192/26 handle="k8s-pod-network.0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.895 [INFO][4755] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.195/26] handle="k8s-pod-network.0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.895 [INFO][4755] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:11:36.913596 containerd[1770]: 2025-12-16 13:11:36.895 [INFO][4755] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.195/26] IPv6=[] ContainerID="0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" HandleID="k8s-pod-network.0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-eth0" Dec 16 13:11:36.914110 containerd[1770]: 2025-12-16 13:11:36.897 [INFO][4736] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-zm2j4" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-eth0", GenerateName:"calico-apiserver-685f7c88c8-", Namespace:"calico-apiserver", SelfLink:"", UID:"cfbcb130-7336-4c20-a673-9988ffd0b461", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"685f7c88c8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"", Pod:"calico-apiserver-685f7c88c8-zm2j4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.114.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali253eca3e4b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:36.914110 containerd[1770]: 2025-12-16 13:11:36.897 [INFO][4736] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.195/32] ContainerID="0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-zm2j4" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-eth0" Dec 16 13:11:36.914110 containerd[1770]: 2025-12-16 13:11:36.897 [INFO][4736] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali253eca3e4b4 ContainerID="0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-zm2j4" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-eth0" Dec 16 13:11:36.914110 containerd[1770]: 2025-12-16 13:11:36.900 [INFO][4736] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-zm2j4" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-eth0" Dec 16 13:11:36.914110 containerd[1770]: 2025-12-16 
13:11:36.901 [INFO][4736] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-zm2j4" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-eth0", GenerateName:"calico-apiserver-685f7c88c8-", Namespace:"calico-apiserver", SelfLink:"", UID:"cfbcb130-7336-4c20-a673-9988ffd0b461", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"685f7c88c8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5", Pod:"calico-apiserver-685f7c88c8-zm2j4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.114.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali253eca3e4b4", MAC:"6a:f0:c9:b3:c1:f0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:36.914110 containerd[1770]: 2025-12-16 13:11:36.911 [INFO][4736] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-zm2j4" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--zm2j4-eth0" Dec 16 13:11:36.920511 kubelet[3051]: E1216 13:11:36.920468 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:11:36.937270 systemd-networkd[1674]: calic9d69ebaeb2: Gained IPv6LL Dec 16 13:11:36.939384 kubelet[3051]: I1216 13:11:36.939338 3051 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-2kbtr" podStartSLOduration=33.939322825 podStartE2EDuration="33.939322825s" podCreationTimestamp="2025-12-16 13:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:11:36.93887532 +0000 UTC m=+40.251497885" watchObservedRunningTime="2025-12-16 13:11:36.939322825 +0000 UTC m=+40.251945389" Dec 16 13:11:36.955069 containerd[1770]: time="2025-12-16T13:11:36.955008450Z" level=info msg="connecting to shim 0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5" address="unix:///run/containerd/s/688a4ba594dd53094c52773d9fedd2a0087eb465bfd6cee9ddbc93b9cc57f227" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:11:36.981589 systemd[1]: Started cri-containerd-0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5.scope - libcontainer container 0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5. Dec 16 13:11:37.023689 containerd[1770]: time="2025-12-16T13:11:37.023643462Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-685f7c88c8-zm2j4,Uid:cfbcb130-7336-4c20-a673-9988ffd0b461,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5\"" Dec 16 13:11:37.025142 containerd[1770]: time="2025-12-16T13:11:37.025027497Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:11:37.386105 containerd[1770]: time="2025-12-16T13:11:37.386039333Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:37.387857 containerd[1770]: time="2025-12-16T13:11:37.387723496Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:11:37.387857 containerd[1770]: time="2025-12-16T13:11:37.387751029Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:11:37.388234 kubelet[3051]: E1216 13:11:37.388010 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:11:37.388234 kubelet[3051]: E1216 13:11:37.388067 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:11:37.388234 kubelet[3051]: E1216 13:11:37.388152 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-685f7c88c8-zm2j4_calico-apiserver(cfbcb130-7336-4c20-a673-9988ffd0b461): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:37.388234 kubelet[3051]: E1216 13:11:37.388187 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:11:37.448574 systemd-networkd[1674]: cali3db55bf7f08: Gained IPv6LL Dec 16 13:11:37.800245 containerd[1770]: time="2025-12-16T13:11:37.800129094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-685f7c88c8-9xxz2,Uid:7fc3c8ab-a97b-45a2-9167-06f605649e74,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:11:37.895697 systemd-networkd[1674]: cali485194a4edd: Link UP Dec 16 13:11:37.896220 systemd-networkd[1674]: cali485194a4edd: Gained carrier Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.835 [INFO][4824] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-eth0 calico-apiserver-685f7c88c8- calico-apiserver 7fc3c8ab-a97b-45a2-9167-06f605649e74 799 0 2025-12-16 13:11:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:685f7c88c8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-9-79ca1ea2c9 calico-apiserver-685f7c88c8-9xxz2 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali485194a4edd [] [] }} ContainerID="23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-9xxz2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-" Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.835 [INFO][4824] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-9xxz2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-eth0" Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.857 [INFO][4841] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" HandleID="k8s-pod-network.23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-eth0" Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.857 [INFO][4841] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" HandleID="k8s-pod-network.23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000333390), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-9-79ca1ea2c9", "pod":"calico-apiserver-685f7c88c8-9xxz2", 
"timestamp":"2025-12-16 13:11:37.857181655 +0000 UTC"}, Hostname:"ci-4459-2-2-9-79ca1ea2c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.857 [INFO][4841] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.857 [INFO][4841] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.857 [INFO][4841] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-9-79ca1ea2c9' Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.863 [INFO][4841] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.870 [INFO][4841] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.876 [INFO][4841] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.878 [INFO][4841] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.880 [INFO][4841] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.880 [INFO][4841] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.881 [INFO][4841] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.885 [INFO][4841] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.892 [INFO][4841] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.114.196/26] block=192.168.114.192/26 handle="k8s-pod-network.23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.892 [INFO][4841] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.196/26] handle="k8s-pod-network.23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.892 [INFO][4841] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:11:37.906671 containerd[1770]: 2025-12-16 13:11:37.892 [INFO][4841] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.196/26] IPv6=[] ContainerID="23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" HandleID="k8s-pod-network.23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-eth0" Dec 16 13:11:37.907557 containerd[1770]: 2025-12-16 13:11:37.893 [INFO][4824] cni-plugin/k8s.go 418: Populated endpoint ContainerID="23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-9xxz2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-eth0", GenerateName:"calico-apiserver-685f7c88c8-", Namespace:"calico-apiserver", SelfLink:"", UID:"7fc3c8ab-a97b-45a2-9167-06f605649e74", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"685f7c88c8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"", Pod:"calico-apiserver-685f7c88c8-9xxz2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.114.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali485194a4edd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:37.907557 containerd[1770]: 2025-12-16 13:11:37.893 [INFO][4824] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.196/32] ContainerID="23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-9xxz2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-eth0" Dec 16 13:11:37.907557 containerd[1770]: 2025-12-16 13:11:37.894 [INFO][4824] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali485194a4edd ContainerID="23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-9xxz2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-eth0" Dec 16 13:11:37.907557 containerd[1770]: 2025-12-16 13:11:37.896 [INFO][4824] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-9xxz2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-eth0" Dec 16 13:11:37.907557 containerd[1770]: 2025-12-16 
13:11:37.897 [INFO][4824] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-9xxz2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-eth0", GenerateName:"calico-apiserver-685f7c88c8-", Namespace:"calico-apiserver", SelfLink:"", UID:"7fc3c8ab-a97b-45a2-9167-06f605649e74", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"685f7c88c8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f", Pod:"calico-apiserver-685f7c88c8-9xxz2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.114.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali485194a4edd", MAC:"0e:4a:c2:e9:4e:15", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:37.907557 containerd[1770]: 2025-12-16 13:11:37.905 [INFO][4824] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" Namespace="calico-apiserver" Pod="calico-apiserver-685f7c88c8-9xxz2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--apiserver--685f7c88c8--9xxz2-eth0" Dec 16 13:11:37.920433 kubelet[3051]: E1216 13:11:37.920372 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:11:37.934658 containerd[1770]: time="2025-12-16T13:11:37.934605552Z" level=info msg="connecting to shim 23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f" address="unix:///run/containerd/s/add898257e2388b03a061df7d675b962838b027f3eb72baf7fd1854220947068" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:11:37.965478 systemd[1]: Started cri-containerd-23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f.scope - libcontainer container 23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f. 
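The containerd entries above trace one full pass through Calico's per-pod IPAM: take the host-wide lock, confirm this node's affinity for block 192.168.114.192/26, load the block, claim the next free address (192.168.114.196 here), write the block back, and release the lock. As a rough illustration of the claim step only, here is a minimal Go sketch; the Block type, its bitmap, and the pre-seeded claims are invented for the example and are not Calico's actual ipam.go.

package main

import (
	"fmt"
	"net/netip"
	"sync"
)

// Block models an IPAM block such as 192.168.114.192/26: a base prefix plus an
// allocation bitmap with one slot per address (hypothetical type, not Calico's
// internal representation).
type Block struct {
	CIDR      netip.Prefix
	allocated []bool // index i => base address + i is taken
}

func NewBlock(cidr string) *Block {
	p := netip.MustParsePrefix(cidr)
	size := 1 << (32 - p.Bits())
	return &Block{CIDR: p, allocated: make([]bool, size)}
}

// Claim picks the first free ordinal in the block, mirroring the
// "Attempting to assign 1 addresses from block" step in the log.
func (b *Block) Claim() (netip.Addr, bool) {
	addr := b.CIDR.Addr()
	for i := range b.allocated {
		if !b.allocated[i] {
			b.allocated[i] = true
			return addr, true
		}
		addr = addr.Next()
	}
	return netip.Addr{}, false
}

var hostWideLock sync.Mutex // stands in for the "host-wide IPAM lock" in the log

func main() {
	block := NewBlock("192.168.114.192/26")
	// Pretend .192-.195 were handed out earlier in the boot (assumption for the example).
	for i := 0; i < 4; i++ {
		block.Claim()
	}

	hostWideLock.Lock()
	ip, ok := block.Claim() // next claim yields 192.168.114.196
	hostWideLock.Unlock()

	fmt.Println(ip, ok) // 192.168.114.196 true
}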
Dec 16 13:11:38.011828 containerd[1770]: time="2025-12-16T13:11:38.011784981Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-685f7c88c8-9xxz2,Uid:7fc3c8ab-a97b-45a2-9167-06f605649e74,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f\"" Dec 16 13:11:38.013209 containerd[1770]: time="2025-12-16T13:11:38.013041607Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:11:38.344476 systemd-networkd[1674]: cali253eca3e4b4: Gained IPv6LL Dec 16 13:11:38.355481 containerd[1770]: time="2025-12-16T13:11:38.355436027Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:38.360060 containerd[1770]: time="2025-12-16T13:11:38.360015057Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:11:38.360163 containerd[1770]: time="2025-12-16T13:11:38.360092588Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:11:38.360334 kubelet[3051]: E1216 13:11:38.360266 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:11:38.360586 kubelet[3051]: E1216 13:11:38.360355 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:11:38.360586 kubelet[3051]: E1216 13:11:38.360461 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-685f7c88c8-9xxz2_calico-apiserver(7fc3c8ab-a97b-45a2-9167-06f605649e74): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:38.360586 kubelet[3051]: E1216 13:11:38.360507 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:11:38.801829 containerd[1770]: time="2025-12-16T13:11:38.801724865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d8969c677-gf2lj,Uid:f1946944-0457-47c9-82f5-a0302117750a,Namespace:calico-system,Attempt:0,}" Dec 16 13:11:38.805199 containerd[1770]: time="2025-12-16T13:11:38.805145242Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-7dkx7,Uid:ad65a649-0849-4bfa-8148-c5f6ec6669c2,Namespace:kube-system,Attempt:0,}" Dec 16 13:11:38.902731 systemd-networkd[1674]: calic5ee69fce03: Link UP Dec 16 13:11:38.902916 systemd-networkd[1674]: calic5ee69fce03: Gained carrier Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.841 [INFO][4906] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-eth0 calico-kube-controllers-5d8969c677- calico-system f1946944-0457-47c9-82f5-a0302117750a 802 0 2025-12-16 13:11:16 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5d8969c677 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-9-79ca1ea2c9 calico-kube-controllers-5d8969c677-gf2lj eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic5ee69fce03 [] [] }} ContainerID="11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" Namespace="calico-system" Pod="calico-kube-controllers-5d8969c677-gf2lj" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-" Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.841 [INFO][4906] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" Namespace="calico-system" Pod="calico-kube-controllers-5d8969c677-gf2lj" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-eth0" Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.871 [INFO][4943] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" HandleID="k8s-pod-network.11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-eth0" Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.871 [INFO][4943] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" HandleID="k8s-pod-network.11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000138380), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-9-79ca1ea2c9", "pod":"calico-kube-controllers-5d8969c677-gf2lj", "timestamp":"2025-12-16 13:11:38.871269269 +0000 UTC"}, Hostname:"ci-4459-2-2-9-79ca1ea2c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.871 [INFO][4943] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.871 [INFO][4943] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.871 [INFO][4943] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-9-79ca1ea2c9' Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.877 [INFO][4943] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.880 [INFO][4943] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.884 [INFO][4943] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.885 [INFO][4943] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.887 [INFO][4943] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.887 [INFO][4943] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.889 [INFO][4943] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374 Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.892 [INFO][4943] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.900 [INFO][4943] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.114.197/26] block=192.168.114.192/26 handle="k8s-pod-network.11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.900 [INFO][4943] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.197/26] handle="k8s-pod-network.11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.900 [INFO][4943] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:11:38.914546 containerd[1770]: 2025-12-16 13:11:38.900 [INFO][4943] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.197/26] IPv6=[] ContainerID="11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" HandleID="k8s-pod-network.11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-eth0" Dec 16 13:11:38.915406 containerd[1770]: 2025-12-16 13:11:38.901 [INFO][4906] cni-plugin/k8s.go 418: Populated endpoint ContainerID="11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" Namespace="calico-system" Pod="calico-kube-controllers-5d8969c677-gf2lj" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-eth0", GenerateName:"calico-kube-controllers-5d8969c677-", Namespace:"calico-system", SelfLink:"", UID:"f1946944-0457-47c9-82f5-a0302117750a", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5d8969c677", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"", Pod:"calico-kube-controllers-5d8969c677-gf2lj", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.114.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic5ee69fce03", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:38.915406 containerd[1770]: 2025-12-16 13:11:38.901 [INFO][4906] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.197/32] ContainerID="11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" Namespace="calico-system" Pod="calico-kube-controllers-5d8969c677-gf2lj" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-eth0" Dec 16 13:11:38.915406 containerd[1770]: 2025-12-16 13:11:38.901 [INFO][4906] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic5ee69fce03 ContainerID="11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" Namespace="calico-system" Pod="calico-kube-controllers-5d8969c677-gf2lj" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-eth0" Dec 16 13:11:38.915406 containerd[1770]: 2025-12-16 13:11:38.903 [INFO][4906] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" Namespace="calico-system" Pod="calico-kube-controllers-5d8969c677-gf2lj" 
WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-eth0" Dec 16 13:11:38.915406 containerd[1770]: 2025-12-16 13:11:38.903 [INFO][4906] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" Namespace="calico-system" Pod="calico-kube-controllers-5d8969c677-gf2lj" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-eth0", GenerateName:"calico-kube-controllers-5d8969c677-", Namespace:"calico-system", SelfLink:"", UID:"f1946944-0457-47c9-82f5-a0302117750a", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5d8969c677", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374", Pod:"calico-kube-controllers-5d8969c677-gf2lj", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.114.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic5ee69fce03", MAC:"5a:0f:0b:65:c1:c4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:38.915406 containerd[1770]: 2025-12-16 13:11:38.912 [INFO][4906] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" Namespace="calico-system" Pod="calico-kube-controllers-5d8969c677-gf2lj" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-calico--kube--controllers--5d8969c677--gf2lj-eth0" Dec 16 13:11:38.923653 kubelet[3051]: E1216 13:11:38.923537 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:11:38.923807 kubelet[3051]: E1216 13:11:38.923772 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:11:38.950773 containerd[1770]: time="2025-12-16T13:11:38.950721433Z" level=info msg="connecting to shim 11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374" address="unix:///run/containerd/s/a5c4d124ab9b9f15c6a4c31aceb78da2c0d66ddda1f754db275d36ccd7b08855" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:11:38.977542 systemd[1]: Started cri-containerd-11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374.scope - libcontainer container 11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374. Dec 16 13:11:39.009694 systemd-networkd[1674]: cali3a37323cd37: Link UP Dec 16 13:11:39.009883 systemd-networkd[1674]: cali3a37323cd37: Gained carrier Dec 16 13:11:39.023410 containerd[1770]: time="2025-12-16T13:11:39.022960241Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d8969c677-gf2lj,Uid:f1946944-0457-47c9-82f5-a0302117750a,Namespace:calico-system,Attempt:0,} returns sandbox id \"11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374\"" Dec 16 13:11:39.026335 containerd[1770]: time="2025-12-16T13:11:39.025513765Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.853 [INFO][4918] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-eth0 coredns-66bc5c9577- kube-system ad65a649-0849-4bfa-8148-c5f6ec6669c2 801 0 2025-12-16 13:11:03 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-9-79ca1ea2c9 coredns-66bc5c9577-7dkx7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3a37323cd37 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" Namespace="kube-system" Pod="coredns-66bc5c9577-7dkx7" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-" Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.853 [INFO][4918] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" Namespace="kube-system" Pod="coredns-66bc5c9577-7dkx7" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-eth0" Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.883 [INFO][4953] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" HandleID="k8s-pod-network.8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-eth0" Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.883 [INFO][4953] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" HandleID="k8s-pod-network.8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" 
Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000592a40), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-9-79ca1ea2c9", "pod":"coredns-66bc5c9577-7dkx7", "timestamp":"2025-12-16 13:11:38.88323093 +0000 UTC"}, Hostname:"ci-4459-2-2-9-79ca1ea2c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.883 [INFO][4953] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.900 [INFO][4953] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.900 [INFO][4953] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-9-79ca1ea2c9' Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.979 [INFO][4953] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.983 [INFO][4953] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.988 [INFO][4953] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.989 [INFO][4953] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.992 [INFO][4953] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.992 [INFO][4953] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.994 [INFO][4953] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025 Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:38.998 [INFO][4953] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:39.006 [INFO][4953] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.114.198/26] block=192.168.114.192/26 handle="k8s-pod-network.8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:39.006 [INFO][4953] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.198/26] handle="k8s-pod-network.8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:39.006 [INFO][4953] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:11:39.026472 containerd[1770]: 2025-12-16 13:11:39.006 [INFO][4953] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.198/26] IPv6=[] ContainerID="8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" HandleID="k8s-pod-network.8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-eth0" Dec 16 13:11:39.026932 containerd[1770]: 2025-12-16 13:11:39.007 [INFO][4918] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" Namespace="kube-system" Pod="coredns-66bc5c9577-7dkx7" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"ad65a649-0849-4bfa-8148-c5f6ec6669c2", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"", Pod:"coredns-66bc5c9577-7dkx7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.114.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3a37323cd37", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:39.026932 containerd[1770]: 2025-12-16 13:11:39.007 [INFO][4918] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.198/32] ContainerID="8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" Namespace="kube-system" Pod="coredns-66bc5c9577-7dkx7" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-eth0" Dec 16 13:11:39.026932 containerd[1770]: 2025-12-16 13:11:39.008 [INFO][4918] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3a37323cd37 ContainerID="8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" Namespace="kube-system" Pod="coredns-66bc5c9577-7dkx7" 
WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-eth0" Dec 16 13:11:39.026932 containerd[1770]: 2025-12-16 13:11:39.009 [INFO][4918] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" Namespace="kube-system" Pod="coredns-66bc5c9577-7dkx7" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-eth0" Dec 16 13:11:39.026932 containerd[1770]: 2025-12-16 13:11:39.011 [INFO][4918] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" Namespace="kube-system" Pod="coredns-66bc5c9577-7dkx7" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"ad65a649-0849-4bfa-8148-c5f6ec6669c2", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025", Pod:"coredns-66bc5c9577-7dkx7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.114.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3a37323cd37", MAC:"b6:2f:7e:4b:d6:12", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:39.027106 containerd[1770]: 2025-12-16 13:11:39.024 [INFO][4918] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" Namespace="kube-system" Pod="coredns-66bc5c9577-7dkx7" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-coredns--66bc5c9577--7dkx7-eth0" Dec 16 13:11:39.047922 containerd[1770]: time="2025-12-16T13:11:39.047879537Z" level=info msg="connecting to shim 8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025" 
address="unix:///run/containerd/s/1aa1f194a91cbb24224e9f9bb701268099178f40852045f0e73cb2a52cb351b2" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:11:39.077474 systemd[1]: Started cri-containerd-8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025.scope - libcontainer container 8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025. Dec 16 13:11:39.125662 containerd[1770]: time="2025-12-16T13:11:39.125599789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-7dkx7,Uid:ad65a649-0849-4bfa-8148-c5f6ec6669c2,Namespace:kube-system,Attempt:0,} returns sandbox id \"8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025\"" Dec 16 13:11:39.131807 containerd[1770]: time="2025-12-16T13:11:39.131759246Z" level=info msg="CreateContainer within sandbox \"8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 13:11:39.146142 containerd[1770]: time="2025-12-16T13:11:39.146090214Z" level=info msg="Container 93162a156ed4dc75934262353827398939d0cd2912b94f9e04d2bc5363dba7b6: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:11:39.156036 containerd[1770]: time="2025-12-16T13:11:39.155989910Z" level=info msg="CreateContainer within sandbox \"8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"93162a156ed4dc75934262353827398939d0cd2912b94f9e04d2bc5363dba7b6\"" Dec 16 13:11:39.156499 containerd[1770]: time="2025-12-16T13:11:39.156461916Z" level=info msg="StartContainer for \"93162a156ed4dc75934262353827398939d0cd2912b94f9e04d2bc5363dba7b6\"" Dec 16 13:11:39.157189 containerd[1770]: time="2025-12-16T13:11:39.157151768Z" level=info msg="connecting to shim 93162a156ed4dc75934262353827398939d0cd2912b94f9e04d2bc5363dba7b6" address="unix:///run/containerd/s/1aa1f194a91cbb24224e9f9bb701268099178f40852045f0e73cb2a52cb351b2" protocol=ttrpc version=3 Dec 16 13:11:39.173507 systemd[1]: Started cri-containerd-93162a156ed4dc75934262353827398939d0cd2912b94f9e04d2bc5363dba7b6.scope - libcontainer container 93162a156ed4dc75934262353827398939d0cd2912b94f9e04d2bc5363dba7b6. 
Dec 16 13:11:39.177440 systemd-networkd[1674]: cali485194a4edd: Gained IPv6LL Dec 16 13:11:39.199535 containerd[1770]: time="2025-12-16T13:11:39.199483700Z" level=info msg="StartContainer for \"93162a156ed4dc75934262353827398939d0cd2912b94f9e04d2bc5363dba7b6\" returns successfully" Dec 16 13:11:39.357766 containerd[1770]: time="2025-12-16T13:11:39.357677111Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:39.359537 containerd[1770]: time="2025-12-16T13:11:39.359458020Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:11:39.359667 containerd[1770]: time="2025-12-16T13:11:39.359534160Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:11:39.359743 kubelet[3051]: E1216 13:11:39.359663 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:11:39.359743 kubelet[3051]: E1216 13:11:39.359703 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:11:39.359867 kubelet[3051]: E1216 13:11:39.359777 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5d8969c677-gf2lj_calico-system(f1946944-0457-47c9-82f5-a0302117750a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:39.359867 kubelet[3051]: E1216 13:11:39.359806 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:11:39.926147 kubelet[3051]: E1216 13:11:39.926099 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:11:39.928157 kubelet[3051]: E1216 13:11:39.928100 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:11:39.944601 systemd-networkd[1674]: calic5ee69fce03: Gained IPv6LL Dec 16 13:11:39.952759 kubelet[3051]: I1216 13:11:39.952663 3051 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-7dkx7" podStartSLOduration=36.952630737 podStartE2EDuration="36.952630737s" podCreationTimestamp="2025-12-16 13:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:11:39.95169942 +0000 UTC m=+43.264322076" watchObservedRunningTime="2025-12-16 13:11:39.952630737 +0000 UTC m=+43.265253423" Dec 16 13:11:40.585528 systemd-networkd[1674]: cali3a37323cd37: Gained IPv6LL Dec 16 13:11:40.800837 containerd[1770]: time="2025-12-16T13:11:40.800765935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sfvmc,Uid:db81499e-70ee-42e1-9fb9-a69e20146fbb,Namespace:calico-system,Attempt:0,}" Dec 16 13:11:40.805588 containerd[1770]: time="2025-12-16T13:11:40.805500669Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mvzk2,Uid:49b26efe-3f65-4cf1-9ebd-006f8e8a22f9,Namespace:calico-system,Attempt:0,}" Dec 16 13:11:40.931139 kubelet[3051]: E1216 13:11:40.931082 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:11:40.944228 systemd-networkd[1674]: cali16ab3b1c874: Link UP Dec 16 13:11:40.945033 systemd-networkd[1674]: cali16ab3b1c874: Gained carrier Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.874 [INFO][5116] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-eth0 goldmane-7c778bb748- calico-system db81499e-70ee-42e1-9fb9-a69e20146fbb 803 0 2025-12-16 13:11:13 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-9-79ca1ea2c9 goldmane-7c778bb748-sfvmc eth0 goldmane [] [] 
[kns.calico-system ksa.calico-system.goldmane] cali16ab3b1c874 [] [] }} ContainerID="576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" Namespace="calico-system" Pod="goldmane-7c778bb748-sfvmc" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-" Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.874 [INFO][5116] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" Namespace="calico-system" Pod="goldmane-7c778bb748-sfvmc" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-eth0" Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.898 [INFO][5152] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" HandleID="k8s-pod-network.576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-eth0" Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.898 [INFO][5152] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" HandleID="k8s-pod-network.576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0005b3da0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-9-79ca1ea2c9", "pod":"goldmane-7c778bb748-sfvmc", "timestamp":"2025-12-16 13:11:40.898778559 +0000 UTC"}, Hostname:"ci-4459-2-2-9-79ca1ea2c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.899 [INFO][5152] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.899 [INFO][5152] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.899 [INFO][5152] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-9-79ca1ea2c9' Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.910 [INFO][5152] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.915 [INFO][5152] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.919 [INFO][5152] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.921 [INFO][5152] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.923 [INFO][5152] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.923 [INFO][5152] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.924 [INFO][5152] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.929 [INFO][5152] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.939 [INFO][5152] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.114.199/26] block=192.168.114.192/26 handle="k8s-pod-network.576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.939 [INFO][5152] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.199/26] handle="k8s-pod-network.576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.939 [INFO][5152] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:11:40.963698 containerd[1770]: 2025-12-16 13:11:40.939 [INFO][5152] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.199/26] IPv6=[] ContainerID="576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" HandleID="k8s-pod-network.576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-eth0" Dec 16 13:11:40.964442 containerd[1770]: 2025-12-16 13:11:40.941 [INFO][5116] cni-plugin/k8s.go 418: Populated endpoint ContainerID="576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" Namespace="calico-system" Pod="goldmane-7c778bb748-sfvmc" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"db81499e-70ee-42e1-9fb9-a69e20146fbb", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"", Pod:"goldmane-7c778bb748-sfvmc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.114.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali16ab3b1c874", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:40.964442 containerd[1770]: 2025-12-16 13:11:40.941 [INFO][5116] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.199/32] ContainerID="576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" Namespace="calico-system" Pod="goldmane-7c778bb748-sfvmc" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-eth0" Dec 16 13:11:40.964442 containerd[1770]: 2025-12-16 13:11:40.942 [INFO][5116] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali16ab3b1c874 ContainerID="576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" Namespace="calico-system" Pod="goldmane-7c778bb748-sfvmc" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-eth0" Dec 16 13:11:40.964442 containerd[1770]: 2025-12-16 13:11:40.947 [INFO][5116] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" Namespace="calico-system" Pod="goldmane-7c778bb748-sfvmc" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-eth0" Dec 16 13:11:40.964442 containerd[1770]: 2025-12-16 13:11:40.949 [INFO][5116] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" Namespace="calico-system" Pod="goldmane-7c778bb748-sfvmc" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"db81499e-70ee-42e1-9fb9-a69e20146fbb", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd", Pod:"goldmane-7c778bb748-sfvmc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.114.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali16ab3b1c874", MAC:"fe:72:71:f0:34:0f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:40.964442 containerd[1770]: 2025-12-16 13:11:40.961 [INFO][5116] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" Namespace="calico-system" Pod="goldmane-7c778bb748-sfvmc" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-goldmane--7c778bb748--sfvmc-eth0" Dec 16 13:11:40.995064 containerd[1770]: time="2025-12-16T13:11:40.995018485Z" level=info msg="connecting to shim 576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd" address="unix:///run/containerd/s/34c40b4f7e65a46e0d7f6e7108cd15be2f438a61db7c19a86edac2a9fb76f9f8" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:11:41.021514 systemd[1]: Started cri-containerd-576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd.scope - libcontainer container 576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd. 
Dec 16 13:11:41.041421 systemd-networkd[1674]: calia1857b844be: Link UP Dec 16 13:11:41.041984 systemd-networkd[1674]: calia1857b844be: Gained carrier Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:40.871 [INFO][5118] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-eth0 csi-node-driver- calico-system 49b26efe-3f65-4cf1-9ebd-006f8e8a22f9 699 0 2025-12-16 13:11:16 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-2-9-79ca1ea2c9 csi-node-driver-mvzk2 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calia1857b844be [] [] }} ContainerID="8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" Namespace="calico-system" Pod="csi-node-driver-mvzk2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-" Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:40.871 [INFO][5118] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" Namespace="calico-system" Pod="csi-node-driver-mvzk2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-eth0" Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:40.911 [INFO][5158] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" HandleID="k8s-pod-network.8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-eth0" Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:40.911 [INFO][5158] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" HandleID="k8s-pod-network.8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d50f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-9-79ca1ea2c9", "pod":"csi-node-driver-mvzk2", "timestamp":"2025-12-16 13:11:40.911190678 +0000 UTC"}, Hostname:"ci-4459-2-2-9-79ca1ea2c9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:40.911 [INFO][5158] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:40.939 [INFO][5158] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:40.939 [INFO][5158] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-9-79ca1ea2c9' Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:41.011 [INFO][5158] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:41.016 [INFO][5158] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:41.023 [INFO][5158] ipam/ipam.go 511: Trying affinity for 192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:41.024 [INFO][5158] ipam/ipam.go 158: Attempting to load block cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:41.026 [INFO][5158] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.114.192/26 host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:41.026 [INFO][5158] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.114.192/26 handle="k8s-pod-network.8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:41.028 [INFO][5158] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40 Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:41.031 [INFO][5158] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.114.192/26 handle="k8s-pod-network.8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:41.038 [INFO][5158] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.114.200/26] block=192.168.114.192/26 handle="k8s-pod-network.8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:41.038 [INFO][5158] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.114.200/26] handle="k8s-pod-network.8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" host="ci-4459-2-2-9-79ca1ea2c9" Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:41.038 [INFO][5158] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
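The IPAM records above show the node's affinity for the 192.168.114.192/26 block being confirmed and 192.168.114.200 being claimed from it for csi-node-driver-mvzk2; a /26 block spans 64 addresses (192.168.114.192–192.168.114.255). A tiny standard-library Go check of that containment, purely illustrative — the prefix and address are the ones in the records above, nothing else here comes from the log:

```go
// block-check.go — illustrative only: confirms that the address Calico claimed
// falls inside the node's affine /26 block, and shows how many addresses such
// a block holds.
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.114.192/26") // block with host affinity, per the log
	ip := netip.MustParseAddr("192.168.114.200")         // address claimed for csi-node-driver-mvzk2

	fmt.Printf("%s inside %s: %v\n", ip, block, block.Contains(ip)) // true
	fmt.Printf("addresses per /26 block: %d\n", 1<<(32-block.Bits())) // 2^(32-26) = 64
}
```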
Dec 16 13:11:41.054109 containerd[1770]: 2025-12-16 13:11:41.038 [INFO][5158] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.114.200/26] IPv6=[] ContainerID="8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" HandleID="k8s-pod-network.8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" Workload="ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-eth0" Dec 16 13:11:41.054634 containerd[1770]: 2025-12-16 13:11:41.039 [INFO][5118] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" Namespace="calico-system" Pod="csi-node-driver-mvzk2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"49b26efe-3f65-4cf1-9ebd-006f8e8a22f9", ResourceVersion:"699", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"", Pod:"csi-node-driver-mvzk2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.114.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia1857b844be", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:41.054634 containerd[1770]: 2025-12-16 13:11:41.040 [INFO][5118] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.114.200/32] ContainerID="8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" Namespace="calico-system" Pod="csi-node-driver-mvzk2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-eth0" Dec 16 13:11:41.054634 containerd[1770]: 2025-12-16 13:11:41.040 [INFO][5118] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia1857b844be ContainerID="8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" Namespace="calico-system" Pod="csi-node-driver-mvzk2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-eth0" Dec 16 13:11:41.054634 containerd[1770]: 2025-12-16 13:11:41.041 [INFO][5118] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" Namespace="calico-system" Pod="csi-node-driver-mvzk2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-eth0" Dec 16 13:11:41.054634 containerd[1770]: 2025-12-16 13:11:41.042 [INFO][5118] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" Namespace="calico-system" Pod="csi-node-driver-mvzk2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"49b26efe-3f65-4cf1-9ebd-006f8e8a22f9", ResourceVersion:"699", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 11, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-9-79ca1ea2c9", ContainerID:"8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40", Pod:"csi-node-driver-mvzk2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.114.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia1857b844be", MAC:"56:be:7e:d2:54:99", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:11:41.054634 containerd[1770]: 2025-12-16 13:11:41.052 [INFO][5118] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" Namespace="calico-system" Pod="csi-node-driver-mvzk2" WorkloadEndpoint="ci--4459--2--2--9--79ca1ea2c9-k8s-csi--node--driver--mvzk2-eth0" Dec 16 13:11:41.079132 containerd[1770]: time="2025-12-16T13:11:41.079096543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-sfvmc,Uid:db81499e-70ee-42e1-9fb9-a69e20146fbb,Namespace:calico-system,Attempt:0,} returns sandbox id \"576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd\"" Dec 16 13:11:41.081125 containerd[1770]: time="2025-12-16T13:11:41.081082360Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:11:41.088583 containerd[1770]: time="2025-12-16T13:11:41.088554829Z" level=info msg="connecting to shim 8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40" address="unix:///run/containerd/s/7d9edefc6b8b4c3c316cf2f506582bd32726165933071dbcd386113a7e6ee777" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:11:41.115454 systemd[1]: Started cri-containerd-8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40.scope - libcontainer container 8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40. 
Dec 16 13:11:41.138192 containerd[1770]: time="2025-12-16T13:11:41.138124289Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mvzk2,Uid:49b26efe-3f65-4cf1-9ebd-006f8e8a22f9,Namespace:calico-system,Attempt:0,} returns sandbox id \"8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40\"" Dec 16 13:11:41.462698 containerd[1770]: time="2025-12-16T13:11:41.462612638Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:41.464984 containerd[1770]: time="2025-12-16T13:11:41.464885504Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:11:41.465089 containerd[1770]: time="2025-12-16T13:11:41.464943290Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:11:41.465446 kubelet[3051]: E1216 13:11:41.465363 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:11:41.465550 kubelet[3051]: E1216 13:11:41.465451 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:11:41.465695 kubelet[3051]: E1216 13:11:41.465655 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sfvmc_calico-system(db81499e-70ee-42e1-9fb9-a69e20146fbb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:41.465746 kubelet[3051]: E1216 13:11:41.465700 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:11:41.467354 containerd[1770]: time="2025-12-16T13:11:41.465911665Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:11:41.806573 containerd[1770]: time="2025-12-16T13:11:41.806353416Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:41.808126 containerd[1770]: time="2025-12-16T13:11:41.808070235Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:11:41.808291 containerd[1770]: time="2025-12-16T13:11:41.808150698Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:11:41.808382 kubelet[3051]: E1216 13:11:41.808288 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:11:41.808382 kubelet[3051]: E1216 13:11:41.808352 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:11:41.808501 kubelet[3051]: E1216 13:11:41.808430 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:41.809772 containerd[1770]: time="2025-12-16T13:11:41.809726632Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:11:41.936335 kubelet[3051]: E1216 13:11:41.936272 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:11:42.136552 containerd[1770]: time="2025-12-16T13:11:42.136424289Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:42.138992 containerd[1770]: time="2025-12-16T13:11:42.138900444Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:11:42.139077 containerd[1770]: time="2025-12-16T13:11:42.138992756Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:11:42.139620 kubelet[3051]: E1216 13:11:42.139358 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:11:42.139620 
kubelet[3051]: E1216 13:11:42.139420 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:11:42.139620 kubelet[3051]: E1216 13:11:42.139514 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:42.139821 kubelet[3051]: E1216 13:11:42.139566 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:11:42.632699 systemd-networkd[1674]: cali16ab3b1c874: Gained IPv6LL Dec 16 13:11:42.696580 systemd-networkd[1674]: calia1857b844be: Gained IPv6LL Dec 16 13:11:42.939357 kubelet[3051]: E1216 13:11:42.939201 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:11:42.939357 kubelet[3051]: E1216 13:11:42.939349 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:11:43.476131 kubelet[3051]: I1216 13:11:43.476038 3051 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:11:49.799316 containerd[1770]: time="2025-12-16T13:11:49.798716883Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:11:50.116324 containerd[1770]: time="2025-12-16T13:11:50.115509379Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:50.118810 containerd[1770]: time="2025-12-16T13:11:50.118743486Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:11:50.118929 containerd[1770]: time="2025-12-16T13:11:50.118882600Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:11:50.119185 kubelet[3051]: E1216 13:11:50.119126 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:11:50.119673 kubelet[3051]: E1216 13:11:50.119205 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:11:50.119673 kubelet[3051]: E1216 13:11:50.119601 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-f47d785c5-qmfjz_calico-system(be9b0dbc-c111-4079-904c-bc275a2adf0f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:50.122328 containerd[1770]: time="2025-12-16T13:11:50.122133065Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:11:50.495101 containerd[1770]: time="2025-12-16T13:11:50.494730146Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:50.497111 containerd[1770]: time="2025-12-16T13:11:50.497039432Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:11:50.497293 containerd[1770]: time="2025-12-16T13:11:50.497113987Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:11:50.497416 kubelet[3051]: E1216 13:11:50.497375 3051 log.go:32] "PullImage from image 
service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:11:50.497512 kubelet[3051]: E1216 13:11:50.497428 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:11:50.497623 kubelet[3051]: E1216 13:11:50.497529 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-f47d785c5-qmfjz_calico-system(be9b0dbc-c111-4079-904c-bc275a2adf0f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:50.497623 kubelet[3051]: E1216 13:11:50.497597 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:11:51.796836 containerd[1770]: time="2025-12-16T13:11:51.796730030Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:11:52.132026 containerd[1770]: time="2025-12-16T13:11:52.131810805Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:52.134228 containerd[1770]: time="2025-12-16T13:11:52.134147977Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:11:52.134228 containerd[1770]: time="2025-12-16T13:11:52.134204564Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:11:52.134880 kubelet[3051]: E1216 13:11:52.134708 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 
13:11:52.134880 kubelet[3051]: E1216 13:11:52.134750 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:11:52.134880 kubelet[3051]: E1216 13:11:52.134823 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5d8969c677-gf2lj_calico-system(f1946944-0457-47c9-82f5-a0302117750a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:52.134880 kubelet[3051]: E1216 13:11:52.134855 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:11:53.796788 containerd[1770]: time="2025-12-16T13:11:53.796712729Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:11:54.162774 containerd[1770]: time="2025-12-16T13:11:54.162712462Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:54.165700 containerd[1770]: time="2025-12-16T13:11:54.165642368Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:11:54.165804 containerd[1770]: time="2025-12-16T13:11:54.165729288Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:11:54.165958 kubelet[3051]: E1216 13:11:54.165898 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:11:54.166259 kubelet[3051]: E1216 13:11:54.165969 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:11:54.166259 kubelet[3051]: E1216 13:11:54.166071 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-685f7c88c8-zm2j4_calico-apiserver(cfbcb130-7336-4c20-a673-9988ffd0b461): ErrImagePull: rpc 
error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:54.166259 kubelet[3051]: E1216 13:11:54.166104 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:11:54.796743 containerd[1770]: time="2025-12-16T13:11:54.796684212Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:11:55.125280 containerd[1770]: time="2025-12-16T13:11:55.125208895Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:55.129400 containerd[1770]: time="2025-12-16T13:11:55.129266684Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:11:55.129400 containerd[1770]: time="2025-12-16T13:11:55.129323765Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:11:55.129798 kubelet[3051]: E1216 13:11:55.129688 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:11:55.129798 kubelet[3051]: E1216 13:11:55.129787 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:11:55.130087 kubelet[3051]: E1216 13:11:55.129938 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-685f7c88c8-9xxz2_calico-apiserver(7fc3c8ab-a97b-45a2-9167-06f605649e74): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:55.130087 kubelet[3051]: E1216 13:11:55.130000 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" 
podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:11:55.797180 containerd[1770]: time="2025-12-16T13:11:55.797146435Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:11:56.144376 containerd[1770]: time="2025-12-16T13:11:56.144328718Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:56.146471 containerd[1770]: time="2025-12-16T13:11:56.146422721Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:11:56.146603 containerd[1770]: time="2025-12-16T13:11:56.146472473Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:11:56.146683 kubelet[3051]: E1216 13:11:56.146617 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:11:56.146683 kubelet[3051]: E1216 13:11:56.146658 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:11:56.147200 kubelet[3051]: E1216 13:11:56.146723 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sfvmc_calico-system(db81499e-70ee-42e1-9fb9-a69e20146fbb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:56.147200 kubelet[3051]: E1216 13:11:56.146749 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:11:57.797377 containerd[1770]: time="2025-12-16T13:11:57.797266775Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:11:58.140919 containerd[1770]: time="2025-12-16T13:11:58.140750771Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:58.143176 containerd[1770]: time="2025-12-16T13:11:58.142980973Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:11:58.143176 containerd[1770]: time="2025-12-16T13:11:58.143027178Z" level=info msg="stop 
pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:11:58.143293 kubelet[3051]: E1216 13:11:58.143183 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:11:58.143293 kubelet[3051]: E1216 13:11:58.143224 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:11:58.143766 kubelet[3051]: E1216 13:11:58.143313 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:58.144405 containerd[1770]: time="2025-12-16T13:11:58.144378741Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:11:58.499816 containerd[1770]: time="2025-12-16T13:11:58.499551913Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:11:58.502178 containerd[1770]: time="2025-12-16T13:11:58.502022243Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:11:58.502178 containerd[1770]: time="2025-12-16T13:11:58.502125591Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:11:58.502604 kubelet[3051]: E1216 13:11:58.502537 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:11:58.502743 kubelet[3051]: E1216 13:11:58.502620 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:11:58.503029 kubelet[3051]: E1216 13:11:58.502795 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:11:58.503029 kubelet[3051]: E1216 13:11:58.502975 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:12:02.797432 kubelet[3051]: E1216 13:12:02.797388 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:12:03.797384 kubelet[3051]: E1216 13:12:03.797072 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:12:05.796426 kubelet[3051]: E1216 13:12:05.796376 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:12:10.796647 kubelet[3051]: E1216 13:12:10.796600 3051 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:12:10.797906 kubelet[3051]: E1216 13:12:10.796613 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:12:11.796648 kubelet[3051]: E1216 13:12:11.796599 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:12:14.797878 containerd[1770]: time="2025-12-16T13:12:14.797794107Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:12:15.159742 containerd[1770]: time="2025-12-16T13:12:15.159682132Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:12:15.161456 containerd[1770]: time="2025-12-16T13:12:15.161404514Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:12:15.161516 containerd[1770]: time="2025-12-16T13:12:15.161494118Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:12:15.161729 kubelet[3051]: E1216 13:12:15.161676 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 
13:12:15.162009 kubelet[3051]: E1216 13:12:15.161743 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:12:15.162009 kubelet[3051]: E1216 13:12:15.161847 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-f47d785c5-qmfjz_calico-system(be9b0dbc-c111-4079-904c-bc275a2adf0f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:12:15.163259 containerd[1770]: time="2025-12-16T13:12:15.163224303Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:12:15.521519 containerd[1770]: time="2025-12-16T13:12:15.521407643Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:12:15.523793 containerd[1770]: time="2025-12-16T13:12:15.523754642Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:12:15.523876 containerd[1770]: time="2025-12-16T13:12:15.523812018Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:12:15.524015 kubelet[3051]: E1216 13:12:15.523980 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:12:15.524054 kubelet[3051]: E1216 13:12:15.524033 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:12:15.524129 kubelet[3051]: E1216 13:12:15.524112 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-f47d785c5-qmfjz_calico-system(be9b0dbc-c111-4079-904c-bc275a2adf0f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:12:15.524178 kubelet[3051]: E1216 13:12:15.524154 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:12:16.797457 containerd[1770]: time="2025-12-16T13:12:16.797397369Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:12:17.124929 containerd[1770]: time="2025-12-16T13:12:17.124860644Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:12:17.126827 containerd[1770]: time="2025-12-16T13:12:17.126767274Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:12:17.126931 containerd[1770]: time="2025-12-16T13:12:17.126813430Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:12:17.127021 kubelet[3051]: E1216 13:12:17.126979 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:12:17.127365 kubelet[3051]: E1216 13:12:17.127023 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:12:17.127365 kubelet[3051]: E1216 13:12:17.127094 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5d8969c677-gf2lj_calico-system(f1946944-0457-47c9-82f5-a0302117750a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:12:17.127365 kubelet[3051]: E1216 13:12:17.127123 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:12:19.796649 containerd[1770]: 
time="2025-12-16T13:12:19.796581054Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:12:20.124741 containerd[1770]: time="2025-12-16T13:12:20.124623107Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:12:20.126287 containerd[1770]: time="2025-12-16T13:12:20.126258146Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:12:20.126342 containerd[1770]: time="2025-12-16T13:12:20.126291306Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:12:20.126553 kubelet[3051]: E1216 13:12:20.126504 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:12:20.126801 kubelet[3051]: E1216 13:12:20.126559 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:12:20.126801 kubelet[3051]: E1216 13:12:20.126673 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-685f7c88c8-zm2j4_calico-apiserver(cfbcb130-7336-4c20-a673-9988ffd0b461): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:12:20.126801 kubelet[3051]: E1216 13:12:20.126742 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:12:21.796732 containerd[1770]: time="2025-12-16T13:12:21.796689898Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:12:22.375569 containerd[1770]: time="2025-12-16T13:12:22.375481692Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:12:22.377589 containerd[1770]: time="2025-12-16T13:12:22.377505797Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:12:22.377666 containerd[1770]: time="2025-12-16T13:12:22.377598701Z" level=info msg="stop 
pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:12:22.378192 kubelet[3051]: E1216 13:12:22.378130 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:12:22.378192 kubelet[3051]: E1216 13:12:22.378180 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:12:22.378581 kubelet[3051]: E1216 13:12:22.378260 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sfvmc_calico-system(db81499e-70ee-42e1-9fb9-a69e20146fbb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:12:22.378581 kubelet[3051]: E1216 13:12:22.378292 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:12:22.797251 containerd[1770]: time="2025-12-16T13:12:22.796942825Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:12:23.119000 containerd[1770]: time="2025-12-16T13:12:23.118934458Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:12:23.120600 containerd[1770]: time="2025-12-16T13:12:23.120552375Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:12:23.120665 containerd[1770]: time="2025-12-16T13:12:23.120593214Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:12:23.120860 kubelet[3051]: E1216 13:12:23.120806 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:12:23.120912 kubelet[3051]: E1216 13:12:23.120870 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:12:23.121372 kubelet[3051]: E1216 13:12:23.121351 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-685f7c88c8-9xxz2_calico-apiserver(7fc3c8ab-a97b-45a2-9167-06f605649e74): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:12:23.121432 kubelet[3051]: E1216 13:12:23.121394 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:12:25.798013 kubelet[3051]: E1216 13:12:25.797886 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:12:26.799130 containerd[1770]: time="2025-12-16T13:12:26.799031174Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:12:27.130946 containerd[1770]: time="2025-12-16T13:12:27.130856959Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:12:27.132864 containerd[1770]: time="2025-12-16T13:12:27.132780147Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:12:27.132953 containerd[1770]: time="2025-12-16T13:12:27.132907378Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:12:27.133194 kubelet[3051]: E1216 13:12:27.133153 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:12:27.133666 kubelet[3051]: E1216 13:12:27.133597 3051 
kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:12:27.133779 kubelet[3051]: E1216 13:12:27.133751 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:12:27.134854 containerd[1770]: time="2025-12-16T13:12:27.134796810Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:12:27.487776 containerd[1770]: time="2025-12-16T13:12:27.487551810Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:12:27.491586 containerd[1770]: time="2025-12-16T13:12:27.491426943Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:12:27.491586 containerd[1770]: time="2025-12-16T13:12:27.491536698Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:12:27.491886 kubelet[3051]: E1216 13:12:27.491809 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:12:27.492072 kubelet[3051]: E1216 13:12:27.491894 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:12:27.492278 kubelet[3051]: E1216 13:12:27.492058 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:12:27.492278 kubelet[3051]: E1216 13:12:27.492156 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed 
to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:12:27.797828 kubelet[3051]: E1216 13:12:27.797644 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:12:33.796415 kubelet[3051]: E1216 13:12:33.796359 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:12:34.797691 kubelet[3051]: E1216 13:12:34.797640 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:12:36.797716 kubelet[3051]: E1216 13:12:36.797661 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:12:39.796894 kubelet[3051]: E1216 13:12:39.796693 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:12:40.803349 kubelet[3051]: E1216 13:12:40.802927 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:12:40.804403 kubelet[3051]: E1216 13:12:40.804220 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:12:45.797513 kubelet[3051]: E1216 13:12:45.797408 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:12:47.796977 kubelet[3051]: E1216 13:12:47.796914 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:12:51.797441 kubelet[3051]: E1216 13:12:51.797381 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:12:51.798151 kubelet[3051]: E1216 13:12:51.797470 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:12:51.798151 kubelet[3051]: E1216 13:12:51.798086 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:12:54.797663 kubelet[3051]: E1216 13:12:54.797578 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" 
podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:12:59.797399 kubelet[3051]: E1216 13:12:59.797336 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:13:00.798401 kubelet[3051]: E1216 13:13:00.798286 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:13:03.796426 containerd[1770]: time="2025-12-16T13:13:03.796283518Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:13:04.138737 containerd[1770]: time="2025-12-16T13:13:04.138622240Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:13:04.140732 containerd[1770]: time="2025-12-16T13:13:04.140625719Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:13:04.140896 containerd[1770]: time="2025-12-16T13:13:04.140785969Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:13:04.141194 kubelet[3051]: E1216 13:13:04.141109 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:13:04.141737 kubelet[3051]: E1216 13:13:04.141198 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:13:04.141737 kubelet[3051]: E1216 13:13:04.141520 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5d8969c677-gf2lj_calico-system(f1946944-0457-47c9-82f5-a0302117750a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve 
reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:13:04.141737 kubelet[3051]: E1216 13:13:04.141626 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:13:04.142212 containerd[1770]: time="2025-12-16T13:13:04.142101624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:13:04.507220 containerd[1770]: time="2025-12-16T13:13:04.506981978Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:13:04.509051 containerd[1770]: time="2025-12-16T13:13:04.509015840Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:13:04.509208 containerd[1770]: time="2025-12-16T13:13:04.509094109Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:13:04.509286 kubelet[3051]: E1216 13:13:04.509253 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:13:04.509946 kubelet[3051]: E1216 13:13:04.509314 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:13:04.509946 kubelet[3051]: E1216 13:13:04.509406 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-f47d785c5-qmfjz_calico-system(be9b0dbc-c111-4079-904c-bc275a2adf0f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:13:04.511117 containerd[1770]: time="2025-12-16T13:13:04.511091576Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:13:04.842275 containerd[1770]: time="2025-12-16T13:13:04.842171344Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:13:04.844687 containerd[1770]: time="2025-12-16T13:13:04.844466194Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:13:04.844687 containerd[1770]: time="2025-12-16T13:13:04.844532648Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:13:04.844864 kubelet[3051]: E1216 13:13:04.844762 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:13:04.844864 kubelet[3051]: E1216 13:13:04.844802 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:13:04.844988 kubelet[3051]: E1216 13:13:04.844884 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-f47d785c5-qmfjz_calico-system(be9b0dbc-c111-4079-904c-bc275a2adf0f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:13:04.844988 kubelet[3051]: E1216 13:13:04.844929 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:13:06.797740 containerd[1770]: time="2025-12-16T13:13:06.797695228Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:13:07.120961 containerd[1770]: time="2025-12-16T13:13:07.120850368Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:13:07.123095 containerd[1770]: time="2025-12-16T13:13:07.122996534Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:13:07.123369 containerd[1770]: time="2025-12-16T13:13:07.123079223Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:13:07.123539 kubelet[3051]: E1216 13:13:07.123456 3051 log.go:32] "PullImage from image 
service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:13:07.123539 kubelet[3051]: E1216 13:13:07.123525 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:13:07.124718 kubelet[3051]: E1216 13:13:07.123624 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sfvmc_calico-system(db81499e-70ee-42e1-9fb9-a69e20146fbb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:13:07.124718 kubelet[3051]: E1216 13:13:07.123672 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:13:09.796422 containerd[1770]: time="2025-12-16T13:13:09.796364330Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:13:10.149485 containerd[1770]: time="2025-12-16T13:13:10.149434392Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:13:10.151425 containerd[1770]: time="2025-12-16T13:13:10.151387097Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:13:10.151516 containerd[1770]: time="2025-12-16T13:13:10.151460633Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:13:10.151759 kubelet[3051]: E1216 13:13:10.151701 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:13:10.151996 kubelet[3051]: E1216 13:13:10.151778 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:13:10.151996 kubelet[3051]: E1216 13:13:10.151918 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod 
csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:13:10.155509 containerd[1770]: time="2025-12-16T13:13:10.155476328Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:13:10.515276 containerd[1770]: time="2025-12-16T13:13:10.514910467Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:13:10.517656 containerd[1770]: time="2025-12-16T13:13:10.517582339Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:13:10.517702 containerd[1770]: time="2025-12-16T13:13:10.517634572Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:13:10.517988 kubelet[3051]: E1216 13:13:10.517946 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:13:10.518031 kubelet[3051]: E1216 13:13:10.517997 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:13:10.518105 kubelet[3051]: E1216 13:13:10.518080 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:13:10.518167 kubelet[3051]: E1216 13:13:10.518137 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not 
found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:13:14.798115 containerd[1770]: time="2025-12-16T13:13:14.797712686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:13:15.160811 containerd[1770]: time="2025-12-16T13:13:15.160598133Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:13:15.164156 containerd[1770]: time="2025-12-16T13:13:15.164031043Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:13:15.164459 containerd[1770]: time="2025-12-16T13:13:15.164069264Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:13:15.164652 kubelet[3051]: E1216 13:13:15.164565 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:13:15.165811 kubelet[3051]: E1216 13:13:15.164665 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:13:15.165811 kubelet[3051]: E1216 13:13:15.164970 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-685f7c88c8-zm2j4_calico-apiserver(cfbcb130-7336-4c20-a673-9988ffd0b461): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:13:15.165811 kubelet[3051]: E1216 13:13:15.165075 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:13:15.166142 containerd[1770]: time="2025-12-16T13:13:15.165240534Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:13:15.524457 containerd[1770]: time="2025-12-16T13:13:15.524317582Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:13:15.526763 containerd[1770]: time="2025-12-16T13:13:15.526644077Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:13:15.526867 containerd[1770]: time="2025-12-16T13:13:15.526667031Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:13:15.527007 kubelet[3051]: E1216 13:13:15.526964 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:13:15.527007 kubelet[3051]: E1216 13:13:15.527004 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:13:15.527141 kubelet[3051]: E1216 13:13:15.527080 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-685f7c88c8-9xxz2_calico-apiserver(7fc3c8ab-a97b-45a2-9167-06f605649e74): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:13:15.527141 kubelet[3051]: E1216 13:13:15.527111 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:13:15.796823 kubelet[3051]: E1216 13:13:15.796655 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:13:17.796783 kubelet[3051]: E1216 13:13:17.796734 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:13:19.798021 kubelet[3051]: E1216 13:13:19.797940 3051 pod_workers.go:1324] "Error syncing pod, skipping" 
err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:13:22.798724 kubelet[3051]: E1216 13:13:22.798643 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:13:26.796957 kubelet[3051]: E1216 13:13:26.796880 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:13:27.798185 kubelet[3051]: E1216 13:13:27.798078 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:13:29.796336 kubelet[3051]: E1216 13:13:29.796274 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:13:30.797562 kubelet[3051]: E1216 13:13:30.797474 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:13:32.798499 kubelet[3051]: E1216 13:13:32.798405 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:13:33.796928 kubelet[3051]: E1216 13:13:33.796849 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:13:40.797740 kubelet[3051]: E1216 13:13:40.797663 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": 
failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:13:41.797193 kubelet[3051]: E1216 13:13:41.797048 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:13:41.797800 kubelet[3051]: E1216 13:13:41.797734 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:13:43.796781 kubelet[3051]: E1216 13:13:43.796680 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:13:44.798969 kubelet[3051]: E1216 13:13:44.798883 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:13:47.796678 kubelet[3051]: E1216 13:13:47.796622 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:13:53.797875 kubelet[3051]: E1216 13:13:53.797756 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:13:54.796264 kubelet[3051]: E1216 13:13:54.796212 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:13:55.797409 kubelet[3051]: E1216 13:13:55.797333 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:13:57.797729 kubelet[3051]: E1216 13:13:57.797674 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:13:57.799007 kubelet[3051]: E1216 13:13:57.798939 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:14:01.799415 kubelet[3051]: E1216 13:14:01.799208 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:14:05.796953 kubelet[3051]: E1216 13:14:05.796892 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:14:08.798326 kubelet[3051]: E1216 13:14:08.797801 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:14:08.799275 kubelet[3051]: E1216 13:14:08.798737 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:14:10.797594 kubelet[3051]: E1216 13:14:10.797521 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:14:11.798192 kubelet[3051]: E1216 13:14:11.798092 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:14:13.797147 kubelet[3051]: E1216 13:14:13.797091 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:14:16.798656 kubelet[3051]: E1216 13:14:16.798259 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:14:21.797681 kubelet[3051]: E1216 13:14:21.797627 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:14:21.798730 kubelet[3051]: E1216 13:14:21.797745 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:14:21.798730 kubelet[3051]: E1216 13:14:21.797745 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:14:26.798929 kubelet[3051]: E1216 13:14:26.798834 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:14:27.799993 containerd[1770]: time="2025-12-16T13:14:27.799919236Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:14:28.350933 containerd[1770]: time="2025-12-16T13:14:28.350825564Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:14:28.353049 containerd[1770]: 
time="2025-12-16T13:14:28.352981160Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:14:28.353164 containerd[1770]: time="2025-12-16T13:14:28.353062193Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:14:28.353442 kubelet[3051]: E1216 13:14:28.353361 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:14:28.353886 kubelet[3051]: E1216 13:14:28.353449 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:14:28.353886 kubelet[3051]: E1216 13:14:28.353569 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-f47d785c5-qmfjz_calico-system(be9b0dbc-c111-4079-904c-bc275a2adf0f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:14:28.354757 containerd[1770]: time="2025-12-16T13:14:28.354727746Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:14:28.709985 containerd[1770]: time="2025-12-16T13:14:28.709476158Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:14:28.712719 containerd[1770]: time="2025-12-16T13:14:28.712653986Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:14:28.712802 containerd[1770]: time="2025-12-16T13:14:28.712733335Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:14:28.713290 kubelet[3051]: E1216 13:14:28.713025 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:14:28.713290 kubelet[3051]: E1216 13:14:28.713089 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:14:28.713290 kubelet[3051]: E1216 13:14:28.713191 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-f47d785c5-qmfjz_calico-system(be9b0dbc-c111-4079-904c-bc275a2adf0f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:14:28.713411 kubelet[3051]: E1216 13:14:28.713241 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:14:28.798080 kubelet[3051]: E1216 13:14:28.798014 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:14:35.798210 containerd[1770]: time="2025-12-16T13:14:35.798077497Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:14:36.135796 containerd[1770]: time="2025-12-16T13:14:36.135727402Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:14:36.139463 containerd[1770]: time="2025-12-16T13:14:36.139408452Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:14:36.139620 containerd[1770]: time="2025-12-16T13:14:36.139514067Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:14:36.139751 kubelet[3051]: E1216 13:14:36.139707 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:14:36.142002 kubelet[3051]: E1216 13:14:36.139755 3051 kuberuntime_image.go:43] "Failed to pull 
image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:14:36.142002 kubelet[3051]: E1216 13:14:36.139894 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sfvmc_calico-system(db81499e-70ee-42e1-9fb9-a69e20146fbb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:14:36.142002 kubelet[3051]: E1216 13:14:36.139924 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:14:36.142283 containerd[1770]: time="2025-12-16T13:14:36.140705718Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:14:36.485417 containerd[1770]: time="2025-12-16T13:14:36.484689638Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:14:36.487584 containerd[1770]: time="2025-12-16T13:14:36.487497823Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:14:36.487584 containerd[1770]: time="2025-12-16T13:14:36.487552956Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:14:36.487882 kubelet[3051]: E1216 13:14:36.487841 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:14:36.487937 kubelet[3051]: E1216 13:14:36.487892 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:14:36.488004 kubelet[3051]: E1216 13:14:36.487987 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5d8969c677-gf2lj_calico-system(f1946944-0457-47c9-82f5-a0302117750a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve 
reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:14:36.488319 kubelet[3051]: E1216 13:14:36.488288 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:14:36.799336 containerd[1770]: time="2025-12-16T13:14:36.798619467Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:14:37.158527 containerd[1770]: time="2025-12-16T13:14:37.158388074Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:14:37.161263 containerd[1770]: time="2025-12-16T13:14:37.160875125Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:14:37.161263 containerd[1770]: time="2025-12-16T13:14:37.161179460Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:14:37.161520 kubelet[3051]: E1216 13:14:37.161429 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:14:37.161520 kubelet[3051]: E1216 13:14:37.161505 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:14:37.162211 kubelet[3051]: E1216 13:14:37.161665 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-685f7c88c8-zm2j4_calico-apiserver(cfbcb130-7336-4c20-a673-9988ffd0b461): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:14:37.162211 kubelet[3051]: E1216 13:14:37.161733 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:14:39.797333 
containerd[1770]: time="2025-12-16T13:14:39.797245246Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:14:40.140255 containerd[1770]: time="2025-12-16T13:14:40.140156040Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:14:40.142903 containerd[1770]: time="2025-12-16T13:14:40.142773857Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:14:40.142903 containerd[1770]: time="2025-12-16T13:14:40.142799190Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:14:40.143125 kubelet[3051]: E1216 13:14:40.143078 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:14:40.143125 kubelet[3051]: E1216 13:14:40.143122 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:14:40.143671 kubelet[3051]: E1216 13:14:40.143187 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:14:40.144361 containerd[1770]: time="2025-12-16T13:14:40.144090466Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:14:40.484629 containerd[1770]: time="2025-12-16T13:14:40.484102379Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:14:40.486030 containerd[1770]: time="2025-12-16T13:14:40.485954518Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:14:40.486174 containerd[1770]: time="2025-12-16T13:14:40.486046929Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:14:40.486292 kubelet[3051]: E1216 13:14:40.486242 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:14:40.486622 kubelet[3051]: 
E1216 13:14:40.486316 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:14:40.486622 kubelet[3051]: E1216 13:14:40.486431 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:14:40.486622 kubelet[3051]: E1216 13:14:40.486529 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:14:43.797524 containerd[1770]: time="2025-12-16T13:14:43.797391072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:14:43.798229 kubelet[3051]: E1216 13:14:43.797976 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:14:44.154589 containerd[1770]: time="2025-12-16T13:14:44.154514996Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:14:44.156494 containerd[1770]: time="2025-12-16T13:14:44.156399796Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:14:44.156578 containerd[1770]: time="2025-12-16T13:14:44.156519825Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:14:44.156884 kubelet[3051]: E1216 13:14:44.156812 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:14:44.156936 kubelet[3051]: E1216 13:14:44.156905 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:14:44.157114 kubelet[3051]: E1216 13:14:44.157072 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-685f7c88c8-9xxz2_calico-apiserver(7fc3c8ab-a97b-45a2-9167-06f605649e74): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:14:44.157201 kubelet[3051]: E1216 13:14:44.157154 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:14:47.796515 kubelet[3051]: E1216 13:14:47.796479 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:14:48.801418 kubelet[3051]: E1216 13:14:48.801326 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:14:50.798117 kubelet[3051]: E1216 13:14:50.798044 3051 pod_workers.go:1324] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:14:53.798104 kubelet[3051]: E1216 13:14:53.798005 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:14:55.797930 kubelet[3051]: E1216 13:14:55.797679 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:14:56.798209 kubelet[3051]: E1216 13:14:56.798159 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:15:00.798176 kubelet[3051]: E1216 13:15:00.798056 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:15:02.797975 kubelet[3051]: E1216 13:15:02.797931 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:15:03.798085 kubelet[3051]: E1216 13:15:03.797611 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:15:06.801578 kubelet[3051]: E1216 13:15:06.801474 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:15:07.798135 kubelet[3051]: E1216 13:15:07.798049 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:15:10.799764 kubelet[3051]: E1216 13:15:10.799620 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed 
to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:15:12.796995 kubelet[3051]: E1216 13:15:12.796884 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:15:14.965901 systemd[1]: Started sshd@7-10.0.25.207:22-152.42.140.187:32776.service - OpenSSH per-connection server daemon (152.42.140.187:32776). Dec 16 13:15:15.048590 sshd[5670]: Connection closed by 152.42.140.187 port 32776 Dec 16 13:15:15.051332 systemd[1]: sshd@7-10.0.25.207:22-152.42.140.187:32776.service: Deactivated successfully. 
Dec 16 13:15:16.798171 kubelet[3051]: E1216 13:15:16.798105 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:15:17.798588 kubelet[3051]: E1216 13:15:17.798523 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:15:17.801457 kubelet[3051]: E1216 13:15:17.801339 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:15:22.798128 kubelet[3051]: E1216 13:15:22.798078 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:15:23.797264 kubelet[3051]: E1216 13:15:23.797200 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to 
\"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:15:26.797983 kubelet[3051]: E1216 13:15:26.797288 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:15:29.798177 kubelet[3051]: E1216 13:15:29.798051 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:15:31.797472 kubelet[3051]: E1216 13:15:31.797424 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:15:31.797472 kubelet[3051]: E1216 13:15:31.797452 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:15:35.798229 
kubelet[3051]: E1216 13:15:35.797754 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:15:37.797649 kubelet[3051]: E1216 13:15:37.797595 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:15:41.797471 kubelet[3051]: E1216 13:15:41.797352 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:15:42.798648 kubelet[3051]: E1216 13:15:42.798570 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:15:42.799243 kubelet[3051]: E1216 13:15:42.798977 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:15:46.798415 kubelet[3051]: E1216 13:15:46.798368 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:15:50.798382 kubelet[3051]: E1216 13:15:50.798267 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:15:51.797320 kubelet[3051]: E1216 13:15:51.797248 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:15:52.749251 containerd[1770]: time="2025-12-16T13:15:52.749126533Z" level=warning msg="container event discarded" container=2967b3c8ead93e0e7671428a1b44d541342ab45110e1863406c4cea1aeaeaf8c type=CONTAINER_CREATED_EVENT Dec 16 13:15:52.749251 containerd[1770]: time="2025-12-16T13:15:52.749198260Z" level=warning msg="container event discarded" container=2967b3c8ead93e0e7671428a1b44d541342ab45110e1863406c4cea1aeaeaf8c type=CONTAINER_STARTED_EVENT Dec 16 13:15:52.762799 containerd[1770]: time="2025-12-16T13:15:52.762678086Z" level=warning msg="container event discarded" container=1e78eedeb6eb60d03631c19dbfe0fb7157739aef37f050f123b381f3c23f1be1 type=CONTAINER_CREATED_EVENT Dec 16 13:15:52.762799 containerd[1770]: 
time="2025-12-16T13:15:52.762760741Z" level=warning msg="container event discarded" container=1e78eedeb6eb60d03631c19dbfe0fb7157739aef37f050f123b381f3c23f1be1 type=CONTAINER_STARTED_EVENT Dec 16 13:15:52.774198 containerd[1770]: time="2025-12-16T13:15:52.774067003Z" level=warning msg="container event discarded" container=5f16cf2a9fcd8cedf0c024e2785f9e1b9823becc62a6a63604c662e32bf3b507 type=CONTAINER_CREATED_EVENT Dec 16 13:15:52.774198 containerd[1770]: time="2025-12-16T13:15:52.774188415Z" level=warning msg="container event discarded" container=5f16cf2a9fcd8cedf0c024e2785f9e1b9823becc62a6a63604c662e32bf3b507 type=CONTAINER_STARTED_EVENT Dec 16 13:15:52.798331 kubelet[3051]: E1216 13:15:52.798224 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:15:52.800948 containerd[1770]: time="2025-12-16T13:15:52.800862178Z" level=warning msg="container event discarded" container=39ca5bbac2b39e8800adb66471e19772fae078f2e2d705939e8400f9e693db6c type=CONTAINER_CREATED_EVENT Dec 16 13:15:52.801092 containerd[1770]: time="2025-12-16T13:15:52.800935731Z" level=warning msg="container event discarded" container=3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c type=CONTAINER_CREATED_EVENT Dec 16 13:15:52.813684 containerd[1770]: time="2025-12-16T13:15:52.813559176Z" level=warning msg="container event discarded" container=eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1 type=CONTAINER_CREATED_EVENT Dec 16 13:15:52.883364 containerd[1770]: time="2025-12-16T13:15:52.883212082Z" level=warning msg="container event discarded" container=eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1 type=CONTAINER_STARTED_EVENT Dec 16 13:15:52.883364 containerd[1770]: time="2025-12-16T13:15:52.883286088Z" level=warning msg="container event discarded" container=3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c type=CONTAINER_STARTED_EVENT Dec 16 13:15:52.883364 containerd[1770]: time="2025-12-16T13:15:52.883316320Z" level=warning msg="container event discarded" container=39ca5bbac2b39e8800adb66471e19772fae078f2e2d705939e8400f9e693db6c type=CONTAINER_STARTED_EVENT Dec 16 13:15:53.797669 kubelet[3051]: E1216 13:15:53.797572 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:15:54.799690 kubelet[3051]: E1216 13:15:54.799486 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:16:00.797801 kubelet[3051]: E1216 13:16:00.797736 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:16:02.798401 kubelet[3051]: E1216 13:16:02.798288 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:16:03.801252 kubelet[3051]: E1216 13:16:03.801169 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:16:04.146627 containerd[1770]: time="2025-12-16T13:16:04.146502105Z" level=warning msg="container event discarded" container=1754eb70b759939b4e210b06482499d4adf02380f5660363f64f99830c09ee6a type=CONTAINER_CREATED_EVENT Dec 16 13:16:04.146627 containerd[1770]: time="2025-12-16T13:16:04.146613494Z" level=warning msg="container event discarded" 
container=1754eb70b759939b4e210b06482499d4adf02380f5660363f64f99830c09ee6a type=CONTAINER_STARTED_EVENT Dec 16 13:16:04.184919 containerd[1770]: time="2025-12-16T13:16:04.184794017Z" level=warning msg="container event discarded" container=d600640b7a4ca8ac740a4e438ef66e304e42c921e1cd6ba4630519ae9196a51b type=CONTAINER_CREATED_EVENT Dec 16 13:16:04.253472 containerd[1770]: time="2025-12-16T13:16:04.253374004Z" level=warning msg="container event discarded" container=0abbeac5d8ea9ba7fad0432d4cea30fae873e48ba590697c773d466084a3cdf2 type=CONTAINER_CREATED_EVENT Dec 16 13:16:04.253472 containerd[1770]: time="2025-12-16T13:16:04.253460071Z" level=warning msg="container event discarded" container=0abbeac5d8ea9ba7fad0432d4cea30fae873e48ba590697c773d466084a3cdf2 type=CONTAINER_STARTED_EVENT Dec 16 13:16:04.303648 containerd[1770]: time="2025-12-16T13:16:04.303534901Z" level=warning msg="container event discarded" container=d600640b7a4ca8ac740a4e438ef66e304e42c921e1cd6ba4630519ae9196a51b type=CONTAINER_STARTED_EVENT Dec 16 13:16:06.526499 containerd[1770]: time="2025-12-16T13:16:06.526243495Z" level=warning msg="container event discarded" container=db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b type=CONTAINER_CREATED_EVENT Dec 16 13:16:06.592018 containerd[1770]: time="2025-12-16T13:16:06.591812970Z" level=warning msg="container event discarded" container=db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b type=CONTAINER_STARTED_EVENT Dec 16 13:16:06.803694 kubelet[3051]: E1216 13:16:06.803219 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:16:08.797265 kubelet[3051]: E1216 13:16:08.797180 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:16:08.799058 kubelet[3051]: E1216 13:16:08.799021 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:16:14.797940 kubelet[3051]: E1216 13:16:14.797848 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:16:15.797112 kubelet[3051]: E1216 13:16:15.796997 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:16:16.196508 containerd[1770]: time="2025-12-16T13:16:16.196379981Z" level=warning msg="container event discarded" container=52a2d0c00998b2db58653c2be14507f2efb6d82cde7dd59c9a41064c49103103 type=CONTAINER_CREATED_EVENT Dec 16 13:16:16.196508 containerd[1770]: time="2025-12-16T13:16:16.196481394Z" level=warning msg="container event discarded" container=52a2d0c00998b2db58653c2be14507f2efb6d82cde7dd59c9a41064c49103103 type=CONTAINER_STARTED_EVENT Dec 16 13:16:16.338807 containerd[1770]: time="2025-12-16T13:16:16.338742951Z" level=warning msg="container event discarded" container=c292fe828bcbb00615c372a1fe5594d8a9193821251933e77e29d90715d44e84 type=CONTAINER_CREATED_EVENT Dec 16 13:16:16.338807 containerd[1770]: time="2025-12-16T13:16:16.338802346Z" level=warning msg="container event discarded" container=c292fe828bcbb00615c372a1fe5594d8a9193821251933e77e29d90715d44e84 type=CONTAINER_STARTED_EVENT Dec 16 13:16:16.804578 kubelet[3051]: E1216 13:16:16.803878 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 
16 13:16:18.759839 containerd[1770]: time="2025-12-16T13:16:18.759738345Z" level=warning msg="container event discarded" container=9cc1528219f32a69a850312d59c8be6bae5002841cfd5b8aeff65518e6bf53ce type=CONTAINER_CREATED_EVENT Dec 16 13:16:18.834751 containerd[1770]: time="2025-12-16T13:16:18.834667524Z" level=warning msg="container event discarded" container=9cc1528219f32a69a850312d59c8be6bae5002841cfd5b8aeff65518e6bf53ce type=CONTAINER_STARTED_EVENT Dec 16 13:16:19.798230 kubelet[3051]: E1216 13:16:19.798050 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:16:20.401380 containerd[1770]: time="2025-12-16T13:16:20.401143733Z" level=warning msg="container event discarded" container=df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785 type=CONTAINER_CREATED_EVENT Dec 16 13:16:20.523957 containerd[1770]: time="2025-12-16T13:16:20.523857304Z" level=warning msg="container event discarded" container=df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785 type=CONTAINER_STARTED_EVENT Dec 16 13:16:20.763513 containerd[1770]: time="2025-12-16T13:16:20.763073640Z" level=warning msg="container event discarded" container=df86f6918a3714b58bf76d3730727490fd28226c0874adb1232dc25073a7f785 type=CONTAINER_STOPPED_EVENT Dec 16 13:16:21.781684 systemd[1]: Started sshd@8-10.0.25.207:22-152.42.140.187:36996.service - OpenSSH per-connection server daemon (152.42.140.187:36996). Dec 16 13:16:21.797879 kubelet[3051]: E1216 13:16:21.797755 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:16:22.596146 sshd[5758]: Connection closed by authenticating user root 152.42.140.187 port 36996 [preauth] Dec 16 13:16:22.600376 systemd[1]: sshd@8-10.0.25.207:22-152.42.140.187:36996.service: Deactivated successfully. 
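The entries above repeat the same pattern for every Calico component on this node: kubelet backs off, containerd reports the ghcr.io/flatcar/calico/*:v3.30.4 reference as NotFound, and the pod_workers error returns a few seconds later for whisker, whisker-backend, goldmane, csi, node-driver-registrar, apiserver and kube-controllers. A minimal sketch for tallying those back-offs per image from a capture like this one; the script and the journalctl pipe are illustrative assumptions, not something that was run on this host:

import re
import sys
from collections import Counter

# Match the escaped image reference inside kubelet's
# "Back-off pulling image \"...\"" messages as they appear in this capture,
# where the quotes around the reference arrive as \\\" in the journal text.
IMAGE_RE = re.compile(r'Back-off pulling image \\+"([^"\\]+)')

counts = Counter()
for line in sys.stdin:
    for image in IMAGE_RE.findall(line):
        counts[image] += 1

for image, n in counts.most_common():
    print(f"{n:4d}  {image}")

Fed with something like journalctl -u kubelet piped into the script (the file name tally_backoffs.py would be arbitrary), this would list the seven ghcr.io/flatcar/calico images at v3.30.4, all failing with the same not-found error.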
Dec 16 13:16:22.799708 kubelet[3051]: E1216 13:16:22.799612 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:16:24.491507 containerd[1770]: time="2025-12-16T13:16:24.491372289Z" level=warning msg="container event discarded" container=0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724 type=CONTAINER_CREATED_EVENT Dec 16 13:16:24.618856 containerd[1770]: time="2025-12-16T13:16:24.618739246Z" level=warning msg="container event discarded" container=0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724 type=CONTAINER_STARTED_EVENT Dec 16 13:16:25.172151 containerd[1770]: time="2025-12-16T13:16:25.172026304Z" level=warning msg="container event discarded" container=0af94392054758423909a8b11400fc393941cec86fa3f7ac19f467f1143fe724 type=CONTAINER_STOPPED_EVENT Dec 16 13:16:27.797811 kubelet[3051]: E1216 13:16:27.797738 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:16:28.798802 kubelet[3051]: E1216 13:16:28.798722 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:16:30.797347 kubelet[3051]: E1216 13:16:30.797247 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", 
failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:16:33.886481 containerd[1770]: time="2025-12-16T13:16:33.886330416Z" level=warning msg="container event discarded" container=e1b4e8fed3ab448a4c1f59af45f7f79fe5dbff3ff60b7cc6f765698e380bdd08 type=CONTAINER_CREATED_EVENT Dec 16 13:16:34.009981 containerd[1770]: time="2025-12-16T13:16:34.009896812Z" level=warning msg="container event discarded" container=e1b4e8fed3ab448a4c1f59af45f7f79fe5dbff3ff60b7cc6f765698e380bdd08 type=CONTAINER_STARTED_EVENT Dec 16 13:16:34.798149 kubelet[3051]: E1216 13:16:34.797981 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:16:35.570050 containerd[1770]: time="2025-12-16T13:16:35.569910774Z" level=warning msg="container event discarded" container=8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4 type=CONTAINER_CREATED_EVENT Dec 16 13:16:35.570050 containerd[1770]: time="2025-12-16T13:16:35.569995873Z" level=warning msg="container event discarded" container=8513cc8bcab87ef8fd56c61fa01f74433bcfc1cb867a0904638602c145ff97e4 type=CONTAINER_STARTED_EVENT Dec 16 13:16:35.797459 kubelet[3051]: E1216 13:16:35.797388 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:16:36.032218 containerd[1770]: time="2025-12-16T13:16:36.032107871Z" level=warning msg="container event discarded" container=b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672 type=CONTAINER_CREATED_EVENT Dec 16 13:16:36.032218 containerd[1770]: time="2025-12-16T13:16:36.032206608Z" level=warning msg="container event discarded" container=b86669bfcd092dd804d53f37bf496abd0500f1a614e8a3025576970c99fcc672 type=CONTAINER_STARTED_EVENT Dec 16 13:16:36.062663 containerd[1770]: time="2025-12-16T13:16:36.062527498Z" level=warning msg="container event discarded" container=ec9b3f66485c048d43db19e024f11f058e1b4501d341981a395798879af6caf7 type=CONTAINER_CREATED_EVENT Dec 16 13:16:36.114900 containerd[1770]: time="2025-12-16T13:16:36.114809909Z" level=warning msg="container event discarded" 
container=ec9b3f66485c048d43db19e024f11f058e1b4501d341981a395798879af6caf7 type=CONTAINER_STARTED_EVENT Dec 16 13:16:37.034232 containerd[1770]: time="2025-12-16T13:16:37.034129032Z" level=warning msg="container event discarded" container=0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5 type=CONTAINER_CREATED_EVENT Dec 16 13:16:37.034232 containerd[1770]: time="2025-12-16T13:16:37.034198508Z" level=warning msg="container event discarded" container=0f8ea655211ef40ece3f3f50f40be33eb00addc88ba01fa10f42360b233e1ba5 type=CONTAINER_STARTED_EVENT Dec 16 13:16:37.797724 kubelet[3051]: E1216 13:16:37.797661 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:16:38.022588 containerd[1770]: time="2025-12-16T13:16:38.022482192Z" level=warning msg="container event discarded" container=23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f type=CONTAINER_CREATED_EVENT Dec 16 13:16:38.022588 containerd[1770]: time="2025-12-16T13:16:38.022556885Z" level=warning msg="container event discarded" container=23688a2633b05881d191690e497a1a023e52da7f8feda1eee945a45e06ac706f type=CONTAINER_STARTED_EVENT Dec 16 13:16:39.033217 containerd[1770]: time="2025-12-16T13:16:39.033119926Z" level=warning msg="container event discarded" container=11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374 type=CONTAINER_CREATED_EVENT Dec 16 13:16:39.033217 containerd[1770]: time="2025-12-16T13:16:39.033176086Z" level=warning msg="container event discarded" container=11beee305c25fb428b3f45e298c29d3cfa6154bcc4b007d1c34a75008a8b4374 type=CONTAINER_STARTED_EVENT Dec 16 13:16:39.136751 containerd[1770]: time="2025-12-16T13:16:39.136635054Z" level=warning msg="container event discarded" container=8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025 type=CONTAINER_CREATED_EVENT Dec 16 13:16:39.136751 containerd[1770]: time="2025-12-16T13:16:39.136737858Z" level=warning msg="container event discarded" container=8c80bfd347e6acbdc9d8392330ed476164da09be0a526773c13e41b376004025 type=CONTAINER_STARTED_EVENT Dec 16 13:16:39.166114 containerd[1770]: time="2025-12-16T13:16:39.166030967Z" level=warning msg="container event discarded" container=93162a156ed4dc75934262353827398939d0cd2912b94f9e04d2bc5363dba7b6 type=CONTAINER_CREATED_EVENT Dec 16 13:16:39.209588 containerd[1770]: time="2025-12-16T13:16:39.209461760Z" level=warning msg="container event discarded" container=93162a156ed4dc75934262353827398939d0cd2912b94f9e04d2bc5363dba7b6 type=CONTAINER_STARTED_EVENT Dec 16 13:16:41.089757 containerd[1770]: time="2025-12-16T13:16:41.089697476Z" level=warning 
msg="container event discarded" container=576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd type=CONTAINER_CREATED_EVENT Dec 16 13:16:41.090188 containerd[1770]: time="2025-12-16T13:16:41.090153582Z" level=warning msg="container event discarded" container=576fba02889c50bb0682802d1fe6cbb5c9d370c9ccbc26e7827400c3815f44cd type=CONTAINER_STARTED_EVENT Dec 16 13:16:41.148622 containerd[1770]: time="2025-12-16T13:16:41.148553893Z" level=warning msg="container event discarded" container=8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40 type=CONTAINER_CREATED_EVENT Dec 16 13:16:41.148622 containerd[1770]: time="2025-12-16T13:16:41.148597293Z" level=warning msg="container event discarded" container=8f9d117ed0148ebc2567cc7f3e55846707ebf9755992549c09c842bf0efd7c40 type=CONTAINER_STARTED_EVENT Dec 16 13:16:41.797839 kubelet[3051]: E1216 13:16:41.797772 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:16:42.796907 kubelet[3051]: E1216 13:16:42.796858 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:16:43.798638 kubelet[3051]: E1216 13:16:43.798589 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:16:48.797618 kubelet[3051]: E1216 13:16:48.797551 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve 
reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:16:48.798813 kubelet[3051]: E1216 13:16:48.797764 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:16:49.796817 kubelet[3051]: E1216 13:16:49.796743 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:16:52.798028 kubelet[3051]: E1216 13:16:52.797926 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:16:53.796569 kubelet[3051]: E1216 13:16:53.796504 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:16:57.798680 kubelet[3051]: E1216 13:16:57.798580 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:17:01.271450 systemd[1]: Started sshd@9-10.0.25.207:22-152.42.140.187:53732.service - OpenSSH per-connection server daemon (152.42.140.187:53732). Dec 16 13:17:01.470830 sshd[5806]: Connection closed by authenticating user root 152.42.140.187 port 53732 [preauth] Dec 16 13:17:01.473957 systemd[1]: sshd@9-10.0.25.207:22-152.42.140.187:53732.service: Deactivated successfully. Dec 16 13:17:01.796697 kubelet[3051]: E1216 13:17:01.796537 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:17:01.797787 kubelet[3051]: E1216 13:17:01.797692 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:17:04.797344 kubelet[3051]: E1216 13:17:04.797257 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:17:04.798145 kubelet[3051]: E1216 13:17:04.798044 3051 pod_workers.go:1324] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:17:07.797000 kubelet[3051]: E1216 13:17:07.796920 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:17:12.798042 containerd[1770]: time="2025-12-16T13:17:12.797969409Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:17:13.231846 containerd[1770]: time="2025-12-16T13:17:13.231806097Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:13.233889 containerd[1770]: time="2025-12-16T13:17:13.233813579Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:17:13.234606 containerd[1770]: time="2025-12-16T13:17:13.233895912Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:17:13.236382 kubelet[3051]: E1216 13:17:13.236326 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:17:13.236382 kubelet[3051]: E1216 13:17:13.236373 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:17:13.236896 kubelet[3051]: E1216 13:17:13.236444 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-f47d785c5-qmfjz_calico-system(be9b0dbc-c111-4079-904c-bc275a2adf0f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:13.237155 containerd[1770]: time="2025-12-16T13:17:13.237110709Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" 
Dec 16 13:17:13.568179 containerd[1770]: time="2025-12-16T13:17:13.568058010Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:13.578494 containerd[1770]: time="2025-12-16T13:17:13.575557847Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:17:13.578494 containerd[1770]: time="2025-12-16T13:17:13.575664280Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:17:13.578732 kubelet[3051]: E1216 13:17:13.576428 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:17:13.578732 kubelet[3051]: E1216 13:17:13.576476 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:17:13.578732 kubelet[3051]: E1216 13:17:13.576559 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-f47d785c5-qmfjz_calico-system(be9b0dbc-c111-4079-904c-bc275a2adf0f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:13.578897 kubelet[3051]: E1216 13:17:13.576601 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:17:14.797184 kubelet[3051]: E1216 13:17:14.797133 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not 
found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:17:15.796998 kubelet[3051]: E1216 13:17:15.796948 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:17:15.797959 kubelet[3051]: E1216 13:17:15.797878 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:17:19.796771 kubelet[3051]: E1216 13:17:19.796711 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:17:19.797960 containerd[1770]: time="2025-12-16T13:17:19.797433272Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:17:20.127553 containerd[1770]: time="2025-12-16T13:17:20.127494710Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:20.129621 containerd[1770]: time="2025-12-16T13:17:20.129580772Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:17:20.129741 containerd[1770]: time="2025-12-16T13:17:20.129689551Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:17:20.129911 kubelet[3051]: E1216 13:17:20.129872 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:17:20.129967 kubelet[3051]: E1216 13:17:20.129926 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:17:20.130036 kubelet[3051]: E1216 13:17:20.130016 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-685f7c88c8-zm2j4_calico-apiserver(cfbcb130-7336-4c20-a673-9988ffd0b461): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:20.130374 kubelet[3051]: E1216 13:17:20.130055 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:17:23.903623 systemd[1]: Started sshd@10-10.0.25.207:22-147.75.109.163:41188.service - OpenSSH per-connection server daemon (147.75.109.163:41188). Dec 16 13:17:24.864978 sshd[5844]: Accepted publickey for core from 147.75.109.163 port 41188 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:17:24.867361 sshd-session[5844]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:17:24.877587 systemd-logind[1750]: New session 8 of user core. Dec 16 13:17:24.891631 systemd[1]: Started session-8.scope - Session 8 of User core. Dec 16 13:17:25.627717 sshd[5847]: Connection closed by 147.75.109.163 port 41188 Dec 16 13:17:25.628084 sshd-session[5844]: pam_unix(sshd:session): session closed for user core Dec 16 13:17:25.632069 systemd[1]: sshd@10-10.0.25.207:22-147.75.109.163:41188.service: Deactivated successfully. Dec 16 13:17:25.633676 systemd[1]: session-8.scope: Deactivated successfully. Dec 16 13:17:25.634454 systemd-logind[1750]: Session 8 logged out. Waiting for processes to exit. Dec 16 13:17:25.635531 systemd-logind[1750]: Removed session 8. 
Dec 16 13:17:25.798090 kubelet[3051]: E1216 13:17:25.797996 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:17:26.797607 containerd[1770]: time="2025-12-16T13:17:26.797562336Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:17:27.118754 containerd[1770]: time="2025-12-16T13:17:27.118657979Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:27.120911 containerd[1770]: time="2025-12-16T13:17:27.120804835Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:17:27.120911 containerd[1770]: time="2025-12-16T13:17:27.120827111Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:17:27.121224 kubelet[3051]: E1216 13:17:27.121081 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:17:27.121224 kubelet[3051]: E1216 13:17:27.121129 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:17:27.121718 kubelet[3051]: E1216 13:17:27.121476 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:27.121755 containerd[1770]: time="2025-12-16T13:17:27.121463785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:17:27.467512 containerd[1770]: time="2025-12-16T13:17:27.467268021Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:27.469709 containerd[1770]: time="2025-12-16T13:17:27.469619454Z" 
level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:17:27.469850 containerd[1770]: time="2025-12-16T13:17:27.469725437Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:17:27.470154 kubelet[3051]: E1216 13:17:27.470043 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:17:27.470154 kubelet[3051]: E1216 13:17:27.470095 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:17:27.470569 containerd[1770]: time="2025-12-16T13:17:27.470432897Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:17:27.470784 kubelet[3051]: E1216 13:17:27.470721 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-sfvmc_calico-system(db81499e-70ee-42e1-9fb9-a69e20146fbb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:27.470879 kubelet[3051]: E1216 13:17:27.470784 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:17:27.825442 containerd[1770]: time="2025-12-16T13:17:27.825159392Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:27.827558 containerd[1770]: time="2025-12-16T13:17:27.827440248Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:17:27.827721 containerd[1770]: time="2025-12-16T13:17:27.827486652Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:17:27.827929 kubelet[3051]: E1216 13:17:27.827874 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:17:27.828063 kubelet[3051]: E1216 13:17:27.827936 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:17:27.828063 kubelet[3051]: E1216 13:17:27.828034 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-mvzk2_calico-system(49b26efe-3f65-4cf1-9ebd-006f8e8a22f9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:27.828344 kubelet[3051]: E1216 13:17:27.828091 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:17:29.797414 containerd[1770]: time="2025-12-16T13:17:29.797333380Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:17:30.146314 containerd[1770]: time="2025-12-16T13:17:30.146242156Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:30.148475 containerd[1770]: time="2025-12-16T13:17:30.148430575Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:17:30.148475 containerd[1770]: time="2025-12-16T13:17:30.148434386Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:17:30.148712 kubelet[3051]: E1216 13:17:30.148673 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 
13:17:30.148999 kubelet[3051]: E1216 13:17:30.148716 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:17:30.148999 kubelet[3051]: E1216 13:17:30.148788 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5d8969c677-gf2lj_calico-system(f1946944-0457-47c9-82f5-a0302117750a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:30.148999 kubelet[3051]: E1216 13:17:30.148819 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:17:30.797564 containerd[1770]: time="2025-12-16T13:17:30.797506267Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:17:30.809670 systemd[1]: Started sshd@11-10.0.25.207:22-147.75.109.163:41204.service - OpenSSH per-connection server daemon (147.75.109.163:41204). 
Dec 16 13:17:31.143629 containerd[1770]: time="2025-12-16T13:17:31.143578459Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:31.145674 containerd[1770]: time="2025-12-16T13:17:31.145619338Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:17:31.145786 containerd[1770]: time="2025-12-16T13:17:31.145712703Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:17:31.145872 kubelet[3051]: E1216 13:17:31.145828 3051 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:17:31.145935 kubelet[3051]: E1216 13:17:31.145876 3051 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:17:31.145969 kubelet[3051]: E1216 13:17:31.145949 3051 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-685f7c88c8-9xxz2_calico-apiserver(7fc3c8ab-a97b-45a2-9167-06f605649e74): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:31.145999 kubelet[3051]: E1216 13:17:31.145982 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:17:31.803744 sshd[5865]: Accepted publickey for core from 147.75.109.163 port 41204 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:17:31.805487 sshd-session[5865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:17:31.814431 systemd-logind[1750]: New session 9 of user core. Dec 16 13:17:31.825561 systemd[1]: Started session-9.scope - Session 9 of User core. Dec 16 13:17:32.528376 sshd[5868]: Connection closed by 147.75.109.163 port 41204 Dec 16 13:17:32.528539 sshd-session[5865]: pam_unix(sshd:session): session closed for user core Dec 16 13:17:32.532155 systemd[1]: sshd@11-10.0.25.207:22-147.75.109.163:41204.service: Deactivated successfully. Dec 16 13:17:32.534286 systemd[1]: session-9.scope: Deactivated successfully. Dec 16 13:17:32.536010 systemd-logind[1750]: Session 9 logged out. Waiting for processes to exit. 
Dec 16 13:17:32.537173 systemd-logind[1750]: Removed session 9. Dec 16 13:17:32.702565 systemd[1]: Started sshd@12-10.0.25.207:22-147.75.109.163:51550.service - OpenSSH per-connection server daemon (147.75.109.163:51550). Dec 16 13:17:33.696437 sshd[5886]: Accepted publickey for core from 147.75.109.163 port 51550 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:17:33.697528 sshd-session[5886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:17:33.702362 systemd-logind[1750]: New session 10 of user core. Dec 16 13:17:33.710478 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 16 13:17:34.468355 sshd[5889]: Connection closed by 147.75.109.163 port 51550 Dec 16 13:17:34.467844 sshd-session[5886]: pam_unix(sshd:session): session closed for user core Dec 16 13:17:34.471765 systemd[1]: sshd@12-10.0.25.207:22-147.75.109.163:51550.service: Deactivated successfully. Dec 16 13:17:34.474600 systemd[1]: session-10.scope: Deactivated successfully. Dec 16 13:17:34.475695 systemd-logind[1750]: Session 10 logged out. Waiting for processes to exit. Dec 16 13:17:34.477796 systemd-logind[1750]: Removed session 10. Dec 16 13:17:34.667062 systemd[1]: Started sshd@13-10.0.25.207:22-147.75.109.163:51556.service - OpenSSH per-connection server daemon (147.75.109.163:51556). Dec 16 13:17:35.712636 sshd[5907]: Accepted publickey for core from 147.75.109.163 port 51556 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:17:35.713775 sshd-session[5907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:17:35.720573 systemd-logind[1750]: New session 11 of user core. Dec 16 13:17:35.731558 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 16 13:17:35.796593 kubelet[3051]: E1216 13:17:35.796535 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:17:36.485750 sshd[5910]: Connection closed by 147.75.109.163 port 51556 Dec 16 13:17:36.486380 sshd-session[5907]: pam_unix(sshd:session): session closed for user core Dec 16 13:17:36.492626 systemd[1]: sshd@13-10.0.25.207:22-147.75.109.163:51556.service: Deactivated successfully. Dec 16 13:17:36.494548 systemd[1]: session-11.scope: Deactivated successfully. Dec 16 13:17:36.495284 systemd-logind[1750]: Session 11 logged out. Waiting for processes to exit. Dec 16 13:17:36.496863 systemd-logind[1750]: Removed session 11. 
Dec 16 13:17:39.801318 kubelet[3051]: E1216 13:17:39.798722 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:17:39.988897 systemd[1]: Started sshd@14-10.0.25.207:22-152.42.140.187:54152.service - OpenSSH per-connection server daemon (152.42.140.187:54152). Dec 16 13:17:40.565537 sshd[5932]: Connection closed by authenticating user root 152.42.140.187 port 54152 [preauth] Dec 16 13:17:40.567348 systemd[1]: sshd@14-10.0.25.207:22-152.42.140.187:54152.service: Deactivated successfully. Dec 16 13:17:41.651868 systemd[1]: Started sshd@15-10.0.25.207:22-147.75.109.163:51560.service - OpenSSH per-connection server daemon (147.75.109.163:51560). Dec 16 13:17:41.797024 kubelet[3051]: E1216 13:17:41.796928 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:17:42.625747 sshd[5940]: Accepted publickey for core from 147.75.109.163 port 51560 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:17:42.628159 sshd-session[5940]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:17:42.636674 systemd-logind[1750]: New session 12 of user core. Dec 16 13:17:42.650766 systemd[1]: Started session-12.scope - Session 12 of User core. 
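The "[preauth]" disconnect from 152.42.140.187 authenticating as root (seen here and again later in this log) looks like routine SSH brute-force probing, unrelated to the image-pull failures. A minimal sketch for auditing and, if desired, dropping that source; the address is copied from the entry above, and the availability of iptables on the host is an assumption.

    # Count how often this source has hit the node today.
    journalctl --since today | grep -c '152.42.140.187'

    # Drop further traffic from the probing address (assumes iptables is present on the host).
    iptables -A INPUT -s 152.42.140.187 -j DROP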
Dec 16 13:17:42.797466 kubelet[3051]: E1216 13:17:42.797381 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:17:42.798967 kubelet[3051]: E1216 13:17:42.798399 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:17:43.428918 sshd[5943]: Connection closed by 147.75.109.163 port 51560 Dec 16 13:17:43.429290 sshd-session[5940]: pam_unix(sshd:session): session closed for user core Dec 16 13:17:43.435221 systemd[1]: sshd@15-10.0.25.207:22-147.75.109.163:51560.service: Deactivated successfully. Dec 16 13:17:43.439543 systemd[1]: session-12.scope: Deactivated successfully. Dec 16 13:17:43.441261 systemd-logind[1750]: Session 12 logged out. Waiting for processes to exit. Dec 16 13:17:43.443220 systemd-logind[1750]: Removed session 12. Dec 16 13:17:43.797912 kubelet[3051]: E1216 13:17:43.796705 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:17:48.605351 systemd[1]: Started sshd@16-10.0.25.207:22-147.75.109.163:53110.service - OpenSSH per-connection server daemon (147.75.109.163:53110). Dec 16 13:17:49.593221 sshd[6002]: Accepted publickey for core from 147.75.109.163 port 53110 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:17:49.596321 sshd-session[6002]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:17:49.611543 systemd-logind[1750]: New session 13 of user core. Dec 16 13:17:49.621506 systemd[1]: Started session-13.scope - Session 13 of User core. 
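By this point every Calico component on the node is cycling through ImagePullBackOff for the same missing v3.30.4 tags. From a workstation with cluster access, the usual way to see the back-off state and the underlying events in one place is roughly the following; the namespaces and pod names are the ones reported in the log above.

    # Back-off status of the affected workloads.
    kubectl -n calico-system get pods
    kubectl -n calico-apiserver get pods

    # Events and per-container status for one of the stuck pods.
    kubectl -n calico-system describe pod csi-node-driver-mvzk2
    kubectl -n calico-system get events --sort-by=.lastTimestamp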
Dec 16 13:17:50.367211 sshd[6005]: Connection closed by 147.75.109.163 port 53110 Dec 16 13:17:50.367771 sshd-session[6002]: pam_unix(sshd:session): session closed for user core Dec 16 13:17:50.372155 systemd[1]: sshd@16-10.0.25.207:22-147.75.109.163:53110.service: Deactivated successfully. Dec 16 13:17:50.374118 systemd[1]: session-13.scope: Deactivated successfully. Dec 16 13:17:50.375095 systemd-logind[1750]: Session 13 logged out. Waiting for processes to exit. Dec 16 13:17:50.376532 systemd-logind[1750]: Removed session 13. Dec 16 13:17:50.561409 systemd[1]: Started sshd@17-10.0.25.207:22-147.75.109.163:53120.service - OpenSSH per-connection server daemon (147.75.109.163:53120). Dec 16 13:17:50.797822 kubelet[3051]: E1216 13:17:50.797678 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:17:51.640020 sshd[6024]: Accepted publickey for core from 147.75.109.163 port 53120 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:17:51.641609 sshd-session[6024]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:17:51.646424 systemd-logind[1750]: New session 14 of user core. Dec 16 13:17:51.656458 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 16 13:17:52.459964 sshd[6034]: Connection closed by 147.75.109.163 port 53120 Dec 16 13:17:52.460662 sshd-session[6024]: pam_unix(sshd:session): session closed for user core Dec 16 13:17:52.464494 systemd[1]: sshd@17-10.0.25.207:22-147.75.109.163:53120.service: Deactivated successfully. Dec 16 13:17:52.466674 systemd[1]: session-14.scope: Deactivated successfully. Dec 16 13:17:52.469118 systemd-logind[1750]: Session 14 logged out. Waiting for processes to exit. Dec 16 13:17:52.470981 systemd-logind[1750]: Removed session 14. Dec 16 13:17:52.615980 systemd[1]: Started sshd@18-10.0.25.207:22-147.75.109.163:38128.service - OpenSSH per-connection server daemon (147.75.109.163:38128). Dec 16 13:17:53.587502 sshd[6049]: Accepted publickey for core from 147.75.109.163 port 38128 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:17:53.590491 sshd-session[6049]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:17:53.599531 systemd-logind[1750]: New session 15 of user core. Dec 16 13:17:53.617511 systemd[1]: Started session-15.scope - Session 15 of User core. 
Dec 16 13:17:53.798054 kubelet[3051]: E1216 13:17:53.797951 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:17:54.796820 kubelet[3051]: E1216 13:17:54.796545 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:17:54.951172 sshd[6052]: Connection closed by 147.75.109.163 port 38128 Dec 16 13:17:54.951798 sshd-session[6049]: pam_unix(sshd:session): session closed for user core Dec 16 13:17:54.957839 systemd[1]: sshd@18-10.0.25.207:22-147.75.109.163:38128.service: Deactivated successfully. Dec 16 13:17:54.960412 systemd[1]: session-15.scope: Deactivated successfully. Dec 16 13:17:54.961733 systemd-logind[1750]: Session 15 logged out. Waiting for processes to exit. Dec 16 13:17:54.963540 systemd-logind[1750]: Removed session 15. Dec 16 13:17:55.131924 systemd[1]: Started sshd@19-10.0.25.207:22-147.75.109.163:38140.service - OpenSSH per-connection server daemon (147.75.109.163:38140). Dec 16 13:17:56.151633 sshd[6072]: Accepted publickey for core from 147.75.109.163 port 38140 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:17:56.153133 sshd-session[6072]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:17:56.158525 systemd-logind[1750]: New session 16 of user core. Dec 16 13:17:56.166537 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 16 13:17:57.052548 sshd[6077]: Connection closed by 147.75.109.163 port 38140 Dec 16 13:17:57.053458 sshd-session[6072]: pam_unix(sshd:session): session closed for user core Dec 16 13:17:57.060384 systemd[1]: sshd@19-10.0.25.207:22-147.75.109.163:38140.service: Deactivated successfully. Dec 16 13:17:57.064759 systemd[1]: session-16.scope: Deactivated successfully. Dec 16 13:17:57.069458 systemd-logind[1750]: Session 16 logged out. Waiting for processes to exit. Dec 16 13:17:57.071849 systemd-logind[1750]: Removed session 16. Dec 16 13:17:57.242805 systemd[1]: Started sshd@20-10.0.25.207:22-147.75.109.163:38142.service - OpenSSH per-connection server daemon (147.75.109.163:38142). 
Dec 16 13:17:57.796454 kubelet[3051]: E1216 13:17:57.796396 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:17:57.797541 kubelet[3051]: E1216 13:17:57.796643 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:17:57.798474 kubelet[3051]: E1216 13:17:57.798332 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:17:58.273703 sshd[6094]: Accepted publickey for core from 147.75.109.163 port 38142 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:17:58.274850 sshd-session[6094]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:17:58.280208 systemd-logind[1750]: New session 17 of user core. Dec 16 13:17:58.294619 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 16 13:17:59.064804 sshd[6097]: Connection closed by 147.75.109.163 port 38142 Dec 16 13:17:59.065164 sshd-session[6094]: pam_unix(sshd:session): session closed for user core Dec 16 13:17:59.068692 systemd[1]: sshd@20-10.0.25.207:22-147.75.109.163:38142.service: Deactivated successfully. Dec 16 13:17:59.070335 systemd[1]: session-17.scope: Deactivated successfully. Dec 16 13:17:59.072464 systemd-logind[1750]: Session 17 logged out. Waiting for processes to exit. Dec 16 13:17:59.075760 systemd-logind[1750]: Removed session 17. Dec 16 13:18:04.234139 systemd[1]: Started sshd@21-10.0.25.207:22-147.75.109.163:56140.service - OpenSSH per-connection server daemon (147.75.109.163:56140). 
Dec 16 13:18:04.797693 kubelet[3051]: E1216 13:18:04.797589 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:18:05.230003 sshd[6116]: Accepted publickey for core from 147.75.109.163 port 56140 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:18:05.231729 sshd-session[6116]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:18:05.238729 systemd-logind[1750]: New session 18 of user core. Dec 16 13:18:05.259565 systemd[1]: Started session-18.scope - Session 18 of User core. Dec 16 13:18:05.797764 kubelet[3051]: E1216 13:18:05.797700 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:18:05.983484 sshd[6121]: Connection closed by 147.75.109.163 port 56140 Dec 16 13:18:05.983822 sshd-session[6116]: pam_unix(sshd:session): session closed for user core Dec 16 13:18:05.987264 systemd[1]: sshd@21-10.0.25.207:22-147.75.109.163:56140.service: Deactivated successfully. Dec 16 13:18:05.989188 systemd[1]: session-18.scope: Deactivated successfully. Dec 16 13:18:05.991554 systemd-logind[1750]: Session 18 logged out. Waiting for processes to exit. Dec 16 13:18:05.994145 systemd-logind[1750]: Removed session 18. 
Dec 16 13:18:06.798542 kubelet[3051]: E1216 13:18:06.798262 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:18:10.796722 kubelet[3051]: E1216 13:18:10.796676 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:18:10.797726 kubelet[3051]: E1216 13:18:10.797688 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:18:11.155678 systemd[1]: Started sshd@22-10.0.25.207:22-147.75.109.163:56148.service - OpenSSH per-connection server daemon (147.75.109.163:56148). Dec 16 13:18:11.797510 kubelet[3051]: E1216 13:18:11.797432 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:18:12.139534 sshd[6140]: Accepted publickey for core from 147.75.109.163 port 56148 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:18:12.140837 sshd-session[6140]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:18:12.150112 systemd-logind[1750]: New session 19 of user core. 
Dec 16 13:18:12.167551 systemd[1]: Started session-19.scope - Session 19 of User core. Dec 16 13:18:12.879577 sshd[6143]: Connection closed by 147.75.109.163 port 56148 Dec 16 13:18:12.880259 sshd-session[6140]: pam_unix(sshd:session): session closed for user core Dec 16 13:18:12.886179 systemd[1]: sshd@22-10.0.25.207:22-147.75.109.163:56148.service: Deactivated successfully. Dec 16 13:18:12.891061 systemd[1]: session-19.scope: Deactivated successfully. Dec 16 13:18:12.893013 systemd-logind[1750]: Session 19 logged out. Waiting for processes to exit. Dec 16 13:18:12.894856 systemd-logind[1750]: Removed session 19. Dec 16 13:18:17.796848 kubelet[3051]: E1216 13:18:17.796563 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:18:17.797479 kubelet[3051]: E1216 13:18:17.797061 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:18:17.939199 systemd[1]: Started sshd@23-10.0.25.207:22-152.42.140.187:59492.service - OpenSSH per-connection server daemon (152.42.140.187:59492). Dec 16 13:18:18.023939 sshd[6188]: Connection closed by authenticating user root 152.42.140.187 port 59492 [preauth] Dec 16 13:18:18.026194 systemd[1]: sshd@23-10.0.25.207:22-152.42.140.187:59492.service: Deactivated successfully. Dec 16 13:18:18.081922 systemd[1]: Started sshd@24-10.0.25.207:22-147.75.109.163:43216.service - OpenSSH per-connection server daemon (147.75.109.163:43216). Dec 16 13:18:19.136379 sshd[6194]: Accepted publickey for core from 147.75.109.163 port 43216 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:18:19.137526 sshd-session[6194]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:18:19.142230 systemd-logind[1750]: New session 20 of user core. Dec 16 13:18:19.165517 systemd[1]: Started session-20.scope - Session 20 of User core. 
Dec 16 13:18:19.796337 kubelet[3051]: E1216 13:18:19.796284 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:18:19.963950 sshd[6197]: Connection closed by 147.75.109.163 port 43216 Dec 16 13:18:19.964427 sshd-session[6194]: pam_unix(sshd:session): session closed for user core Dec 16 13:18:19.968933 systemd[1]: sshd@24-10.0.25.207:22-147.75.109.163:43216.service: Deactivated successfully. Dec 16 13:18:19.970790 systemd[1]: session-20.scope: Deactivated successfully. Dec 16 13:18:19.971611 systemd-logind[1750]: Session 20 logged out. Waiting for processes to exit. Dec 16 13:18:19.972682 systemd-logind[1750]: Removed session 20. Dec 16 13:18:21.797056 kubelet[3051]: E1216 13:18:21.796993 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:18:22.797144 kubelet[3051]: E1216 13:18:22.797093 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:18:24.796433 kubelet[3051]: E1216 13:18:24.796381 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:18:25.118449 systemd[1]: Started 
sshd@25-10.0.25.207:22-147.75.109.163:57882.service - OpenSSH per-connection server daemon (147.75.109.163:57882). Dec 16 13:18:26.090991 sshd[6215]: Accepted publickey for core from 147.75.109.163 port 57882 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:18:26.092756 sshd-session[6215]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:18:26.100342 systemd-logind[1750]: New session 21 of user core. Dec 16 13:18:26.114638 systemd[1]: Started session-21.scope - Session 21 of User core. Dec 16 13:18:26.801665 sshd[6218]: Connection closed by 147.75.109.163 port 57882 Dec 16 13:18:26.802045 sshd-session[6215]: pam_unix(sshd:session): session closed for user core Dec 16 13:18:26.806290 systemd[1]: sshd@25-10.0.25.207:22-147.75.109.163:57882.service: Deactivated successfully. Dec 16 13:18:26.808120 systemd[1]: session-21.scope: Deactivated successfully. Dec 16 13:18:26.809044 systemd-logind[1750]: Session 21 logged out. Waiting for processes to exit. Dec 16 13:18:26.810146 systemd-logind[1750]: Removed session 21. Dec 16 13:18:30.797617 kubelet[3051]: E1216 13:18:30.797555 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:18:31.797347 kubelet[3051]: E1216 13:18:31.797212 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:18:33.797867 kubelet[3051]: E1216 13:18:33.797799 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:18:34.796629 kubelet[3051]: E1216 13:18:34.796580 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:18:34.796824 kubelet[3051]: E1216 13:18:34.796710 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:18:37.796055 kubelet[3051]: E1216 13:18:37.796017 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:18:43.798602 kubelet[3051]: E1216 13:18:43.798552 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f" Dec 16 13:18:44.796764 kubelet[3051]: E1216 13:18:44.796682 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a" Dec 16 13:18:46.799194 kubelet[3051]: E1216 13:18:46.799133 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9" Dec 16 13:18:48.797487 kubelet[3051]: E1216 13:18:48.797209 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74" Dec 16 13:18:48.798398 kubelet[3051]: E1216 13:18:48.797466 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb" Dec 16 13:18:49.796606 kubelet[3051]: E1216 13:18:49.796520 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461" Dec 16 13:18:54.823731 systemd[1]: Started sshd@26-10.0.25.207:22-152.42.140.187:39600.service - OpenSSH per-connection server daemon 
(152.42.140.187:39600). Dec 16 13:18:54.866848 systemd[1]: cri-containerd-eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1.scope: Deactivated successfully. Dec 16 13:18:54.867523 systemd[1]: cri-containerd-eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1.scope: Consumed 7.432s CPU time, 67.6M memory peak. Dec 16 13:18:54.870424 containerd[1770]: time="2025-12-16T13:18:54.870358818Z" level=info msg="received container exit event container_id:\"eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1\" id:\"eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1\" pid:2886 exit_status:1 exited_at:{seconds:1765891134 nanos:869782417}" Dec 16 13:18:54.900697 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1-rootfs.mount: Deactivated successfully. Dec 16 13:18:54.978776 sshd[6266]: Connection closed by authenticating user root 152.42.140.187 port 39600 [preauth] Dec 16 13:18:54.981368 systemd[1]: sshd@26-10.0.25.207:22-152.42.140.187:39600.service: Deactivated successfully. Dec 16 13:18:55.007234 systemd[1]: cri-containerd-db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b.scope: Deactivated successfully. Dec 16 13:18:55.008180 systemd[1]: cri-containerd-db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b.scope: Consumed 1min 36.047s CPU time, 113.2M memory peak. Dec 16 13:18:55.010595 containerd[1770]: time="2025-12-16T13:18:55.010522495Z" level=info msg="received container exit event container_id:\"db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b\" id:\"db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b\" pid:3399 exit_status:1 exited_at:{seconds:1765891135 nanos:10085735}" Dec 16 13:18:55.042955 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b-rootfs.mount: Deactivated successfully. 
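Two long-running containers (eed1db98… and db5db23f…, which the entries that follow remove and recreate as kube-controller-manager and tigera-operator) have just exited with status 1 after substantial CPU time. A minimal sketch for recovering their final output from the node before kubelet garbage-collects the old container records, assuming crictl is configured for this containerd instance; the IDs are copied from the entries above.

    # Map the exited container IDs back to pods and read their last output.
    crictl ps -a | grep -e eed1db98 -e db5db23f
    crictl inspect eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1 | head -n 40
    crictl logs --tail 100 db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b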
Dec 16 13:18:55.105320 kubelet[3051]: I1216 13:18:55.104675 3051 scope.go:117] "RemoveContainer" containerID="db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b"
Dec 16 13:18:55.107287 kubelet[3051]: I1216 13:18:55.107238 3051 scope.go:117] "RemoveContainer" containerID="eed1db98ae984d4e028be9de39d6aeda01529ec2c28d6831df395540b14bd4f1"
Dec 16 13:18:55.111390 containerd[1770]: time="2025-12-16T13:18:55.111343160Z" level=info msg="CreateContainer within sandbox \"5f16cf2a9fcd8cedf0c024e2785f9e1b9823becc62a6a63604c662e32bf3b507\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Dec 16 13:18:55.119043 containerd[1770]: time="2025-12-16T13:18:55.118977690Z" level=info msg="CreateContainer within sandbox \"0abbeac5d8ea9ba7fad0432d4cea30fae873e48ba590697c773d466084a3cdf2\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Dec 16 13:18:55.129997 containerd[1770]: time="2025-12-16T13:18:55.129944790Z" level=info msg="Container bc9e6ebe71e55109d6c565ab1b73ff58e3526147f986113b4fc51d97a0d2b204: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:18:55.140333 containerd[1770]: time="2025-12-16T13:18:55.140248267Z" level=info msg="Container 0ed5a408f944a12b1713df86f70680bf5626bc89c312da9f332c609e3abfb90e: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:18:55.147461 containerd[1770]: time="2025-12-16T13:18:55.147385772Z" level=info msg="CreateContainer within sandbox \"5f16cf2a9fcd8cedf0c024e2785f9e1b9823becc62a6a63604c662e32bf3b507\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"bc9e6ebe71e55109d6c565ab1b73ff58e3526147f986113b4fc51d97a0d2b204\""
Dec 16 13:18:55.148143 containerd[1770]: time="2025-12-16T13:18:55.148091544Z" level=info msg="StartContainer for \"bc9e6ebe71e55109d6c565ab1b73ff58e3526147f986113b4fc51d97a0d2b204\""
Dec 16 13:18:55.149570 containerd[1770]: time="2025-12-16T13:18:55.149538358Z" level=info msg="connecting to shim bc9e6ebe71e55109d6c565ab1b73ff58e3526147f986113b4fc51d97a0d2b204" address="unix:///run/containerd/s/24684877e733198d03bd57b7bda439f9b5469fea17d3d101e46e0b33d394a64c" protocol=ttrpc version=3
Dec 16 13:18:55.150638 containerd[1770]: time="2025-12-16T13:18:55.150595883Z" level=info msg="CreateContainer within sandbox \"0abbeac5d8ea9ba7fad0432d4cea30fae873e48ba590697c773d466084a3cdf2\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"0ed5a408f944a12b1713df86f70680bf5626bc89c312da9f332c609e3abfb90e\""
Dec 16 13:18:55.151490 containerd[1770]: time="2025-12-16T13:18:55.151279494Z" level=info msg="StartContainer for \"0ed5a408f944a12b1713df86f70680bf5626bc89c312da9f332c609e3abfb90e\""
Dec 16 13:18:55.152164 containerd[1770]: time="2025-12-16T13:18:55.152124376Z" level=info msg="connecting to shim 0ed5a408f944a12b1713df86f70680bf5626bc89c312da9f332c609e3abfb90e" address="unix:///run/containerd/s/a7e13e2b9dbdf51efd3ce75133f9e28080cacb647b43e15ac8ddda6b3773a002" protocol=ttrpc version=3
Dec 16 13:18:55.180613 systemd[1]: Started cri-containerd-bc9e6ebe71e55109d6c565ab1b73ff58e3526147f986113b4fc51d97a0d2b204.scope - libcontainer container bc9e6ebe71e55109d6c565ab1b73ff58e3526147f986113b4fc51d97a0d2b204.
Dec 16 13:18:55.185407 systemd[1]: Started cri-containerd-0ed5a408f944a12b1713df86f70680bf5626bc89c312da9f332c609e3abfb90e.scope - libcontainer container 0ed5a408f944a12b1713df86f70680bf5626bc89c312da9f332c609e3abfb90e.
Dec 16 13:18:55.221450 containerd[1770]: time="2025-12-16T13:18:55.221404501Z" level=info msg="StartContainer for \"0ed5a408f944a12b1713df86f70680bf5626bc89c312da9f332c609e3abfb90e\" returns successfully"
Dec 16 13:18:55.233988 containerd[1770]: time="2025-12-16T13:18:55.233854645Z" level=info msg="StartContainer for \"bc9e6ebe71e55109d6c565ab1b73ff58e3526147f986113b4fc51d97a0d2b204\" returns successfully"
Dec 16 13:18:55.274581 kubelet[3051]: E1216 13:18:55.274380 3051 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.25.207:41938->10.0.25.247:2379: read: connection timed out"
Dec 16 13:18:57.798074 kubelet[3051]: E1216 13:18:57.797959 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a"
Dec 16 13:18:57.798978 kubelet[3051]: E1216 13:18:57.798827 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f"
Dec 16 13:18:59.305896 kubelet[3051]: E1216 13:18:59.305709 3051 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.25.207:41612->10.0.25.247:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-2-9-79ca1ea2c9.1881b4a026002499 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-2-9-79ca1ea2c9,UID:db1da35d80e3303675c287cd2133e522,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-9-79ca1ea2c9,},FirstTimestamp:2025-12-16 13:18:48.831255705 +0000 UTC m=+472.143878312,LastTimestamp:2025-12-16 13:18:48.831255705 +0000 UTC m=+472.143878312,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-9-79ca1ea2c9,}"
Dec 16 13:19:00.789390 systemd[1]: cri-containerd-3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c.scope: Deactivated successfully.
Dec 16 13:19:00.790026 systemd[1]: cri-containerd-3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c.scope: Consumed 4.808s CPU time, 27.3M memory peak.
Dec 16 13:19:00.794236 containerd[1770]: time="2025-12-16T13:19:00.794152462Z" level=info msg="received container exit event container_id:\"3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c\" id:\"3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c\" pid:2877 exit_status:1 exited_at:{seconds:1765891140 nanos:793244563}"
Dec 16 13:19:00.798020 kubelet[3051]: E1216 13:19:00.797928 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74"
Dec 16 13:19:00.835648 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c-rootfs.mount: Deactivated successfully.
Dec 16 13:19:01.138079 kubelet[3051]: I1216 13:19:01.137885 3051 scope.go:117] "RemoveContainer" containerID="3e0a92f601c7d19b9b47c7ae1b3e1b6811475177f422394e2b729909176dd01c"
Dec 16 13:19:01.139658 containerd[1770]: time="2025-12-16T13:19:01.139603828Z" level=info msg="CreateContainer within sandbox \"1e78eedeb6eb60d03631c19dbfe0fb7157739aef37f050f123b381f3c23f1be1\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Dec 16 13:19:01.153112 containerd[1770]: time="2025-12-16T13:19:01.152843372Z" level=info msg="Container 65f85695ecd6c9cad372435c50fe1076a7dc539925dc9f83d8ccde7968cec1ab: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:19:01.165613 containerd[1770]: time="2025-12-16T13:19:01.165566756Z" level=info msg="CreateContainer within sandbox \"1e78eedeb6eb60d03631c19dbfe0fb7157739aef37f050f123b381f3c23f1be1\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"65f85695ecd6c9cad372435c50fe1076a7dc539925dc9f83d8ccde7968cec1ab\""
Dec 16 13:19:01.166114 containerd[1770]: time="2025-12-16T13:19:01.166065822Z" level=info msg="StartContainer for \"65f85695ecd6c9cad372435c50fe1076a7dc539925dc9f83d8ccde7968cec1ab\""
Dec 16 13:19:01.168250 containerd[1770]: time="2025-12-16T13:19:01.168186506Z" level=info msg="connecting to shim 65f85695ecd6c9cad372435c50fe1076a7dc539925dc9f83d8ccde7968cec1ab" address="unix:///run/containerd/s/75a2863400045880dc21771ac9eb3c30cbd9ac7221f10a32942ad300875a63c1" protocol=ttrpc version=3
Dec 16 13:19:01.193598 systemd[1]: Started cri-containerd-65f85695ecd6c9cad372435c50fe1076a7dc539925dc9f83d8ccde7968cec1ab.scope - libcontainer container 65f85695ecd6c9cad372435c50fe1076a7dc539925dc9f83d8ccde7968cec1ab.
Dec 16 13:19:01.261010 containerd[1770]: time="2025-12-16T13:19:01.260962500Z" level=info msg="StartContainer for \"65f85695ecd6c9cad372435c50fe1076a7dc539925dc9f83d8ccde7968cec1ab\" returns successfully"
Dec 16 13:19:01.797427 kubelet[3051]: E1216 13:19:01.797350 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb"
Dec 16 13:19:01.798327 kubelet[3051]: E1216 13:19:01.797996 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461"
Dec 16 13:19:01.798893 kubelet[3051]: E1216 13:19:01.798815 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9"
Dec 16 13:19:05.275350 kubelet[3051]: E1216 13:19:05.275186 3051 controller.go:195] "Failed to update lease" err="the server was unable to return a response in the time allotted, but may still be processing the request (put leases.coordination.k8s.io ci-4459-2-2-9-79ca1ea2c9)"
Dec 16 13:19:06.476709 systemd[1]: cri-containerd-0ed5a408f944a12b1713df86f70680bf5626bc89c312da9f332c609e3abfb90e.scope: Deactivated successfully.
Dec 16 13:19:06.477675 containerd[1770]: time="2025-12-16T13:19:06.477614035Z" level=info msg="received container exit event container_id:\"0ed5a408f944a12b1713df86f70680bf5626bc89c312da9f332c609e3abfb90e\" id:\"0ed5a408f944a12b1713df86f70680bf5626bc89c312da9f332c609e3abfb90e\" pid:6329 exit_status:1 exited_at:{seconds:1765891146 nanos:477158162}"
Dec 16 13:19:06.510229 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0ed5a408f944a12b1713df86f70680bf5626bc89c312da9f332c609e3abfb90e-rootfs.mount: Deactivated successfully.
Dec 16 13:19:07.166651 kubelet[3051]: I1216 13:19:07.166609 3051 scope.go:117] "RemoveContainer" containerID="db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b"
Dec 16 13:19:07.167237 kubelet[3051]: I1216 13:19:07.167132 3051 scope.go:117] "RemoveContainer" containerID="0ed5a408f944a12b1713df86f70680bf5626bc89c312da9f332c609e3abfb90e"
Dec 16 13:19:07.167531 kubelet[3051]: E1216 13:19:07.167471 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-65cdcdfd6d-w7l2h_tigera-operator(94829de0-01a7-487b-b724-32eef91529c4)\"" pod="tigera-operator/tigera-operator-65cdcdfd6d-w7l2h" podUID="94829de0-01a7-487b-b724-32eef91529c4"
Dec 16 13:19:07.170137 containerd[1770]: time="2025-12-16T13:19:07.169938570Z" level=info msg="RemoveContainer for \"db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b\""
Dec 16 13:19:07.178734 containerd[1770]: time="2025-12-16T13:19:07.178659658Z" level=info msg="RemoveContainer for \"db5db23fac200a0160651391da1cc52e5337cc57b951ee74a4f664d3b0d7242b\" returns successfully"
Dec 16 13:19:09.799727 kubelet[3051]: E1216 13:19:09.799643 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f47d785c5-qmfjz" podUID="be9b0dbc-c111-4079-904c-bc275a2adf0f"
Dec 16 13:19:11.796859 kubelet[3051]: E1216 13:19:11.796793 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-9xxz2" podUID="7fc3c8ab-a97b-45a2-9167-06f605649e74"
Dec 16 13:19:12.797441 kubelet[3051]: E1216 13:19:12.797319 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8969c677-gf2lj" podUID="f1946944-0457-47c9-82f5-a0302117750a"
Dec 16 13:19:12.798558 kubelet[3051]: E1216 13:19:12.798168 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-mvzk2" podUID="49b26efe-3f65-4cf1-9ebd-006f8e8a22f9"
Dec 16 13:19:14.798399 kubelet[3051]: E1216 13:19:14.798320 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-sfvmc" podUID="db81499e-70ee-42e1-9fb9-a69e20146fbb"
Dec 16 13:19:15.276086 kubelet[3051]: E1216 13:19:15.276046 3051 controller.go:195] "Failed to update lease" err="the server was unable to return a response in the time allotted, but may still be processing the request (put leases.coordination.k8s.io ci-4459-2-2-9-79ca1ea2c9)"
Dec 16 13:19:15.797614 kubelet[3051]: E1216 13:19:15.797531 3051 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-685f7c88c8-zm2j4" podUID="cfbcb130-7336-4c20-a673-9988ffd0b461"
Dec 16 13:19:19.796564 kubelet[3051]: I1216 13:19:19.796514 3051 scope.go:117] "RemoveContainer" containerID="0ed5a408f944a12b1713df86f70680bf5626bc89c312da9f332c609e3abfb90e"
Dec 16 13:19:19.799232 containerd[1770]: time="2025-12-16T13:19:19.798679555Z" level=info msg="CreateContainer within sandbox \"0abbeac5d8ea9ba7fad0432d4cea30fae873e48ba590697c773d466084a3cdf2\" for container &ContainerMetadata{Name:tigera-operator,Attempt:2,}"
Dec 16 13:19:19.807939 containerd[1770]: time="2025-12-16T13:19:19.807537931Z" level=info msg="Container c9235430ee1a773215e5991d89972b9137be97adaba4929f693007748839ab7a: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:19:19.819636 containerd[1770]: time="2025-12-16T13:19:19.819573864Z" level=info msg="CreateContainer within sandbox \"0abbeac5d8ea9ba7fad0432d4cea30fae873e48ba590697c773d466084a3cdf2\" for &ContainerMetadata{Name:tigera-operator,Attempt:2,} returns container id \"c9235430ee1a773215e5991d89972b9137be97adaba4929f693007748839ab7a\""
Dec 16 13:19:19.820151 containerd[1770]: time="2025-12-16T13:19:19.820112437Z" level=info msg="StartContainer for \"c9235430ee1a773215e5991d89972b9137be97adaba4929f693007748839ab7a\""
Dec 16 13:19:19.821184 containerd[1770]: time="2025-12-16T13:19:19.821144736Z" level=info msg="connecting to shim c9235430ee1a773215e5991d89972b9137be97adaba4929f693007748839ab7a" address="unix:///run/containerd/s/a7e13e2b9dbdf51efd3ce75133f9e28080cacb647b43e15ac8ddda6b3773a002" protocol=ttrpc version=3
Dec 16 13:19:19.842594 systemd[1]: Started cri-containerd-c9235430ee1a773215e5991d89972b9137be97adaba4929f693007748839ab7a.scope - libcontainer container c9235430ee1a773215e5991d89972b9137be97adaba4929f693007748839ab7a.
Dec 16 13:19:19.871807 containerd[1770]: time="2025-12-16T13:19:19.871733713Z" level=info msg="StartContainer for \"c9235430ee1a773215e5991d89972b9137be97adaba4929f693007748839ab7a\" returns successfully"