Dec 12 18:42:34.795970 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Dec 12 15:21:28 -00 2025
Dec 12 18:42:34.795997 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 12 18:42:34.796009 kernel: BIOS-provided physical RAM map:
Dec 12 18:42:34.796016 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Dec 12 18:42:34.796025 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
Dec 12 18:42:34.796031 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Dec 12 18:42:34.796038 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
Dec 12 18:42:34.796044 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Dec 12 18:42:34.796051 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
Dec 12 18:42:34.796058 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Dec 12 18:42:34.796064 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e73efff] usable
Dec 12 18:42:34.796074 kernel: BIOS-e820: [mem 0x000000007e73f000-0x000000007e7fffff] reserved
Dec 12 18:42:34.796080 kernel: BIOS-e820: [mem 0x000000007e800000-0x000000007ea70fff] usable
Dec 12 18:42:34.796086 kernel: BIOS-e820: [mem 0x000000007ea71000-0x000000007eb84fff] reserved
Dec 12 18:42:34.796093 kernel: BIOS-e820: [mem 0x000000007eb85000-0x000000007f6ecfff] usable
Dec 12 18:42:34.796102 kernel: BIOS-e820: [mem 0x000000007f6ed000-0x000000007f96cfff] reserved
Dec 12 18:42:34.796111 kernel: BIOS-e820: [mem 0x000000007f96d000-0x000000007f97efff] ACPI data
Dec 12 18:42:34.796118 kernel: BIOS-e820: [mem 0x000000007f97f000-0x000000007f9fefff] ACPI NVS
Dec 12 18:42:34.796125 kernel: BIOS-e820: [mem 0x000000007f9ff000-0x000000007fe4efff] usable
Dec 12 18:42:34.796131 kernel: BIOS-e820: [mem 0x000000007fe4f000-0x000000007fe52fff] reserved
Dec 12 18:42:34.796137 kernel: BIOS-e820: [mem 0x000000007fe53000-0x000000007fe54fff] ACPI NVS
Dec 12 18:42:34.796144 kernel: BIOS-e820: [mem 0x000000007fe55000-0x000000007febbfff] usable
Dec 12 18:42:34.796150 kernel: BIOS-e820: [mem 0x000000007febc000-0x000000007ff3ffff] reserved
Dec 12 18:42:34.796156 kernel: BIOS-e820: [mem 0x000000007ff40000-0x000000007fffffff] ACPI NVS
Dec 12 18:42:34.796162 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Dec 12 18:42:34.796170 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 12 18:42:34.796176 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Dec 12 18:42:34.796186 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000047fffffff] usable
Dec 12 18:42:34.796193 kernel: NX (Execute Disable) protection: active
Dec 12 18:42:34.796199 kernel: APIC: Static calls initialized
Dec 12 18:42:34.796205 kernel: e820: update [mem 0x7dd4e018-0x7dd57a57] usable ==> usable
Dec 12 18:42:34.796212 kernel: e820: update [mem 0x7dd26018-0x7dd4d457] usable ==> usable
Dec 12 18:42:34.796218 kernel: extended physical RAM map:
Dec 12 18:42:34.796224 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Dec 12 18:42:34.796230 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
Dec 12 18:42:34.796240 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Dec 12 18:42:34.796253 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
Dec 12 18:42:34.796259 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Dec 12 18:42:34.796269 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
Dec 12 18:42:34.796279 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Dec 12 18:42:34.796289 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007dd26017] usable
Dec 12 18:42:34.796295 kernel: reserve setup_data: [mem 0x000000007dd26018-0x000000007dd4d457] usable
Dec 12 18:42:34.796302 kernel: reserve setup_data: [mem 0x000000007dd4d458-0x000000007dd4e017] usable
Dec 12 18:42:34.796310 kernel: reserve setup_data: [mem 0x000000007dd4e018-0x000000007dd57a57] usable
Dec 12 18:42:34.796317 kernel: reserve setup_data: [mem 0x000000007dd57a58-0x000000007e73efff] usable
Dec 12 18:42:34.796323 kernel: reserve setup_data: [mem 0x000000007e73f000-0x000000007e7fffff] reserved
Dec 12 18:42:34.796330 kernel: reserve setup_data: [mem 0x000000007e800000-0x000000007ea70fff] usable
Dec 12 18:42:34.796336 kernel: reserve setup_data: [mem 0x000000007ea71000-0x000000007eb84fff] reserved
Dec 12 18:42:34.796343 kernel: reserve setup_data: [mem 0x000000007eb85000-0x000000007f6ecfff] usable
Dec 12 18:42:34.796350 kernel: reserve setup_data: [mem 0x000000007f6ed000-0x000000007f96cfff] reserved
Dec 12 18:42:34.796356 kernel: reserve setup_data: [mem 0x000000007f96d000-0x000000007f97efff] ACPI data
Dec 12 18:42:34.796363 kernel: reserve setup_data: [mem 0x000000007f97f000-0x000000007f9fefff] ACPI NVS
Dec 12 18:42:34.796371 kernel: reserve setup_data: [mem 0x000000007f9ff000-0x000000007fe4efff] usable
Dec 12 18:42:34.796399 kernel: reserve setup_data: [mem 0x000000007fe4f000-0x000000007fe52fff] reserved
Dec 12 18:42:34.796406 kernel: reserve setup_data: [mem 0x000000007fe53000-0x000000007fe54fff] ACPI NVS
Dec 12 18:42:34.796413 kernel: reserve setup_data: [mem 0x000000007fe55000-0x000000007febbfff] usable
Dec 12 18:42:34.796419 kernel: reserve setup_data: [mem 0x000000007febc000-0x000000007ff3ffff] reserved
Dec 12 18:42:34.796426 kernel: reserve setup_data: [mem 0x000000007ff40000-0x000000007fffffff] ACPI NVS
Dec 12 18:42:34.796432 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Dec 12 18:42:34.796439 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 12 18:42:34.796445 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Dec 12 18:42:34.796452 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000047fffffff] usable
Dec 12 18:42:34.796458 kernel: efi: EFI v2.7 by EDK II
Dec 12 18:42:34.796467 kernel: efi: SMBIOS=0x7f772000 ACPI=0x7f97e000 ACPI 2.0=0x7f97e014 MEMATTR=0x7e282018 RNG=0x7f972018
Dec 12 18:42:34.796473 kernel: random: crng init done
Dec 12 18:42:34.796480 kernel: efi: Remove mem152: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Dec 12 18:42:34.796487 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Dec 12 18:42:34.796493 kernel: secureboot: Secure boot disabled
Dec 12 18:42:34.796500 kernel: SMBIOS 2.8 present.
Dec 12 18:42:34.796506 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
Dec 12 18:42:34.796513 kernel: DMI: Memory slots populated: 1/1
Dec 12 18:42:34.796519 kernel: Hypervisor detected: KVM
Dec 12 18:42:34.796526 kernel: last_pfn = 0x7febc max_arch_pfn = 0x10000000000
Dec 12 18:42:34.796532 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 12 18:42:34.796539 kernel: kvm-clock: using sched offset of 6719874043 cycles
Dec 12 18:42:34.796548 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 12 18:42:34.796555 kernel: tsc: Detected 2294.578 MHz processor
Dec 12 18:42:34.796562 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 12 18:42:34.796571 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 12 18:42:34.796578 kernel: last_pfn = 0x480000 max_arch_pfn = 0x10000000000
Dec 12 18:42:34.796585 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Dec 12 18:42:34.796592 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Dec 12 18:42:34.796598 kernel: last_pfn = 0x7febc max_arch_pfn = 0x10000000000
Dec 12 18:42:34.796605 kernel: Using GB pages for direct mapping
Dec 12 18:42:34.796614 kernel: ACPI: Early table checksum verification disabled
Dec 12 18:42:34.796620 kernel: ACPI: RSDP 0x000000007F97E014 000024 (v02 BOCHS )
Dec 12 18:42:34.796627 kernel: ACPI: XSDT 0x000000007F97D0E8 00004C (v01 BOCHS BXPC 00000001 01000013)
Dec 12 18:42:34.796634 kernel: ACPI: FACP 0x000000007F977000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 18:42:34.796641 kernel: ACPI: DSDT 0x000000007F978000 004441 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 18:42:34.796648 kernel: ACPI: FACS 0x000000007F9DD000 000040
Dec 12 18:42:34.796655 kernel: ACPI: APIC 0x000000007F976000 0000B0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 18:42:34.796662 kernel: ACPI: MCFG 0x000000007F975000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 18:42:34.796668 kernel: ACPI: WAET 0x000000007F974000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 18:42:34.796677 kernel: ACPI: BGRT 0x000000007F973000 000038 (v01 INTEL EDK2 00000002 01000013)
Dec 12 18:42:34.796684 kernel: ACPI: Reserving FACP table memory at [mem 0x7f977000-0x7f9770f3]
Dec 12 18:42:34.796691 kernel: ACPI: Reserving DSDT table memory at [mem 0x7f978000-0x7f97c440]
Dec 12 18:42:34.796698 kernel: ACPI: Reserving FACS table memory at [mem 0x7f9dd000-0x7f9dd03f]
Dec 12 18:42:34.796704 kernel: ACPI: Reserving APIC table memory at [mem 0x7f976000-0x7f9760af]
Dec 12 18:42:34.796711 kernel: ACPI: Reserving MCFG table memory at [mem 0x7f975000-0x7f97503b]
Dec 12 18:42:34.796718 kernel: ACPI: Reserving WAET table memory at [mem 0x7f974000-0x7f974027]
Dec 12 18:42:34.796724 kernel: ACPI: Reserving BGRT table memory at [mem 0x7f973000-0x7f973037]
Dec 12 18:42:34.796731 kernel: No NUMA configuration found
Dec 12 18:42:34.796740 kernel: Faking a node at [mem 0x0000000000000000-0x000000047fffffff]
Dec 12 18:42:34.796747 kernel: NODE_DATA(0) allocated [mem 0x47fff8dc0-0x47fffffff]
Dec 12 18:42:34.796754 kernel: Zone ranges:
Dec 12 18:42:34.796760 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Dec 12 18:42:34.796767 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Dec 12 18:42:34.796774 kernel: Normal [mem 0x0000000100000000-0x000000047fffffff]
Dec 12 18:42:34.796780 kernel: Device empty
Dec 12 18:42:34.796787 kernel: Movable zone start for each node
Dec 12 18:42:34.796794 kernel: Early memory node ranges
Dec 12 18:42:34.796802 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Dec 12 18:42:34.796809 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
Dec 12 18:42:34.796816 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
Dec 12 18:42:34.796823 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
Dec 12 18:42:34.796829 kernel: node 0: [mem 0x0000000000900000-0x000000007e73efff]
Dec 12 18:42:34.796836 kernel: node 0: [mem 0x000000007e800000-0x000000007ea70fff]
Dec 12 18:42:34.796843 kernel: node 0: [mem 0x000000007eb85000-0x000000007f6ecfff]
Dec 12 18:42:34.796857 kernel: node 0: [mem 0x000000007f9ff000-0x000000007fe4efff]
Dec 12 18:42:34.796865 kernel: node 0: [mem 0x000000007fe55000-0x000000007febbfff]
Dec 12 18:42:34.796872 kernel: node 0: [mem 0x0000000100000000-0x000000047fffffff]
Dec 12 18:42:34.796879 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000047fffffff]
Dec 12 18:42:34.796887 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 12 18:42:34.796894 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Dec 12 18:42:34.796907 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
Dec 12 18:42:34.796915 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 12 18:42:34.796922 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
Dec 12 18:42:34.796933 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Dec 12 18:42:34.796945 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges
Dec 12 18:42:34.796954 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Dec 12 18:42:34.796965 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
Dec 12 18:42:34.796977 kernel: On node 0, zone Normal: 324 pages in unavailable ranges
Dec 12 18:42:34.796991 kernel: ACPI: PM-Timer IO Port: 0x608
Dec 12 18:42:34.797003 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 12 18:42:34.797011 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 12 18:42:34.797018 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 12 18:42:34.797025 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 12 18:42:34.797033 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 12 18:42:34.797042 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 12 18:42:34.797050 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 12 18:42:34.797057 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 12 18:42:34.797064 kernel: TSC deadline timer available
Dec 12 18:42:34.797072 kernel: CPU topo: Max. logical packages: 8
Dec 12 18:42:34.797080 kernel: CPU topo: Max. logical dies: 8
Dec 12 18:42:34.797087 kernel: CPU topo: Max. dies per package: 1
Dec 12 18:42:34.797094 kernel: CPU topo: Max. threads per core: 1
Dec 12 18:42:34.797102 kernel: CPU topo: Num. cores per package: 1
Dec 12 18:42:34.797111 kernel: CPU topo: Num. threads per package: 1
Dec 12 18:42:34.797118 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 12 18:42:34.797126 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 12 18:42:34.797133 kernel: kvm-guest: KVM setup pv remote TLB flush
Dec 12 18:42:34.797141 kernel: kvm-guest: setup PV sched yield
Dec 12 18:42:34.797148 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Dec 12 18:42:34.797155 kernel: Booting paravirtualized kernel on KVM
Dec 12 18:42:34.797163 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 12 18:42:34.797170 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 12 18:42:34.797180 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Dec 12 18:42:34.797187 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Dec 12 18:42:34.797194 kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7
Dec 12 18:42:34.797202 kernel: kvm-guest: PV spinlocks enabled
Dec 12 18:42:34.797209 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Dec 12 18:42:34.797218 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 12 18:42:34.797225 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Dec 12 18:42:34.797233 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 12 18:42:34.797242 kernel: Fallback order for Node 0: 0
Dec 12 18:42:34.797250 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4192374
Dec 12 18:42:34.797258 kernel: Policy zone: Normal
Dec 12 18:42:34.797265 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 12 18:42:34.797272 kernel: software IO TLB: area num 8.
Dec 12 18:42:34.797280 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 12 18:42:34.797287 kernel: ftrace: allocating 40103 entries in 157 pages
Dec 12 18:42:34.797295 kernel: ftrace: allocated 157 pages with 5 groups
Dec 12 18:42:34.797302 kernel: Dynamic Preempt: voluntary
Dec 12 18:42:34.797312 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 12 18:42:34.797320 kernel: rcu: RCU event tracing is enabled.
Dec 12 18:42:34.797328 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=8.
Dec 12 18:42:34.797335 kernel: Trampoline variant of Tasks RCU enabled.
Dec 12 18:42:34.797343 kernel: Rude variant of Tasks RCU enabled.
Dec 12 18:42:34.797350 kernel: Tracing variant of Tasks RCU enabled.
Dec 12 18:42:34.797358 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 12 18:42:34.797365 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 12 18:42:34.797373 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 12 18:42:34.797389 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 12 18:42:34.797396 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 12 18:42:34.797404 kernel: NR_IRQS: 33024, nr_irqs: 488, preallocated irqs: 16
Dec 12 18:42:34.797411 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 12 18:42:34.797419 kernel: Console: colour dummy device 80x25
Dec 12 18:42:34.797426 kernel: printk: legacy console [tty0] enabled
Dec 12 18:42:34.797434 kernel: printk: legacy console [ttyS0] enabled
Dec 12 18:42:34.797441 kernel: ACPI: Core revision 20240827
Dec 12 18:42:34.797449 kernel: APIC: Switch to symmetric I/O mode setup
Dec 12 18:42:34.797458 kernel: x2apic enabled
Dec 12 18:42:34.797465 kernel: APIC: Switched APIC routing to: physical x2apic
Dec 12 18:42:34.797473 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Dec 12 18:42:34.797481 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Dec 12 18:42:34.797488 kernel: kvm-guest: setup PV IPIs
Dec 12 18:42:34.797496 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2113334dc36, max_idle_ns: 440795272915 ns
Dec 12 18:42:34.797503 kernel: Calibrating delay loop (skipped) preset value.. 4589.15 BogoMIPS (lpj=2294578)
Dec 12 18:42:34.797511 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 12 18:42:34.797518 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Dec 12 18:42:34.797527 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Dec 12 18:42:34.797534 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 12 18:42:34.797541 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Dec 12 18:42:34.797548 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Dec 12 18:42:34.797556 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Dec 12 18:42:34.797563 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 12 18:42:34.797570 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 12 18:42:34.797577 kernel: TAA: Mitigation: Clear CPU buffers
Dec 12 18:42:34.797584 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Dec 12 18:42:34.797591 kernel: active return thunk: its_return_thunk
Dec 12 18:42:34.797598 kernel: ITS: Mitigation: Aligned branch/return thunks
Dec 12 18:42:34.797607 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 12 18:42:34.797614 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 12 18:42:34.797622 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 12 18:42:34.797629 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Dec 12 18:42:34.797636 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Dec 12 18:42:34.797643 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Dec 12 18:42:34.797651 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Dec 12 18:42:34.797658 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Dec 12 18:42:34.797665 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Dec 12 18:42:34.797672 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Dec 12 18:42:34.797679 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Dec 12 18:42:34.797695 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Dec 12 18:42:34.797702 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Dec 12 18:42:34.797709 kernel: Freeing SMP alternatives memory: 32K
Dec 12 18:42:34.797717 kernel: pid_max: default: 32768 minimum: 301
Dec 12 18:42:34.797724 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 12 18:42:34.797735 kernel: landlock: Up and running.
Dec 12 18:42:34.797743 kernel: SELinux: Initializing.
Dec 12 18:42:34.797754 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 12 18:42:34.797765 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 12 18:42:34.797772 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6)
Dec 12 18:42:34.797779 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver.
Dec 12 18:42:34.797789 kernel: ... version: 2
Dec 12 18:42:34.797797 kernel: ... bit width: 48
Dec 12 18:42:34.797805 kernel: ... generic registers: 8
Dec 12 18:42:34.797812 kernel: ... value mask: 0000ffffffffffff
Dec 12 18:42:34.797819 kernel: ... max period: 00007fffffffffff
Dec 12 18:42:34.797826 kernel: ... fixed-purpose events: 3
Dec 12 18:42:34.797833 kernel: ... event mask: 00000007000000ff
Dec 12 18:42:34.797840 kernel: signal: max sigframe size: 3632
Dec 12 18:42:34.797848 kernel: rcu: Hierarchical SRCU implementation.
Dec 12 18:42:34.797855 kernel: rcu: Max phase no-delay instances is 400.
Dec 12 18:42:34.797864 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 12 18:42:34.797871 kernel: smp: Bringing up secondary CPUs ...
Dec 12 18:42:34.797878 kernel: smpboot: x86: Booting SMP configuration:
Dec 12 18:42:34.797885 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7
Dec 12 18:42:34.797893 kernel: smp: Brought up 1 node, 8 CPUs
Dec 12 18:42:34.797900 kernel: smpboot: Total of 8 processors activated (36713.24 BogoMIPS)
Dec 12 18:42:34.797908 kernel: Memory: 16308692K/16769496K available (14336K kernel code, 2444K rwdata, 26064K rodata, 46188K init, 2572K bss, 453244K reserved, 0K cma-reserved)
Dec 12 18:42:34.797915 kernel: devtmpfs: initialized
Dec 12 18:42:34.797922 kernel: x86/mm: Memory block size: 128MB
Dec 12 18:42:34.797934 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
Dec 12 18:42:34.797942 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
Dec 12 18:42:34.797950 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
Dec 12 18:42:34.797961 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7f97f000-0x7f9fefff] (524288 bytes)
Dec 12 18:42:34.797968 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fe53000-0x7fe54fff] (8192 bytes)
Dec 12 18:42:34.797975 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff40000-0x7fffffff] (786432 bytes)
Dec 12 18:42:34.797983 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 12 18:42:34.797990 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Dec 12 18:42:34.798001 kernel: pinctrl core: initialized pinctrl subsystem
Dec 12 18:42:34.798008 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 12 18:42:34.798015 kernel: audit: initializing netlink subsys (disabled)
Dec 12 18:42:34.798023 kernel: audit: type=2000 audit(1765564952.530:1): state=initialized audit_enabled=0 res=1
Dec 12 18:42:34.798031 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 12 18:42:34.798038 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 12 18:42:34.798045 kernel: cpuidle: using governor menu
Dec 12 18:42:34.798053 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 12 18:42:34.798060 kernel: dca service started, version 1.12.1
Dec 12 18:42:34.798069 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Dec 12 18:42:34.798077 kernel: PCI: Using configuration type 1 for base access
Dec 12 18:42:34.798084 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 12 18:42:34.798092 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 12 18:42:34.798100 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 12 18:42:34.798107 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 12 18:42:34.798115 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 12 18:42:34.798122 kernel: ACPI: Added _OSI(Module Device)
Dec 12 18:42:34.798129 kernel: ACPI: Added _OSI(Processor Device)
Dec 12 18:42:34.798138 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 12 18:42:34.798146 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 12 18:42:34.798153 kernel: ACPI: Interpreter enabled
Dec 12 18:42:34.798161 kernel: ACPI: PM: (supports S0 S3 S5)
Dec 12 18:42:34.798168 kernel: ACPI: Using IOAPIC for interrupt routing
Dec 12 18:42:34.798175 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 12 18:42:34.798183 kernel: PCI: Using E820 reservations for host bridge windows
Dec 12 18:42:34.798190 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Dec 12 18:42:34.798198 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 12 18:42:34.798330 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 12 18:42:34.798412 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Dec 12 18:42:34.798482 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Dec 12 18:42:34.798491 kernel: PCI host bridge to bus 0000:00
Dec 12 18:42:34.798565 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Dec 12 18:42:34.798628 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Dec 12 18:42:34.798689 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 12 18:42:34.798753 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Dec 12 18:42:34.798813 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Dec 12 18:42:34.798873 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window]
Dec 12 18:42:34.798934 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 12 18:42:34.799017 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Dec 12 18:42:34.799114 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 12 18:42:34.799200 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref]
Dec 12 18:42:34.799283 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref]
Dec 12 18:42:34.799363 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff]
Dec 12 18:42:34.799457 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Dec 12 18:42:34.799526 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 12 18:42:34.799611 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.799682 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff]
Dec 12 18:42:34.799755 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 12 18:42:34.799823 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff]
Dec 12 18:42:34.799894 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff]
Dec 12 18:42:34.799963 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Dec 12 18:42:34.800039 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.800109 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff]
Dec 12 18:42:34.800188 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 12 18:42:34.800259 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff]
Dec 12 18:42:34.800326 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref]
Dec 12 18:42:34.800426 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.800497 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff]
Dec 12 18:42:34.800564 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 12 18:42:34.800631 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff]
Dec 12 18:42:34.800699 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref]
Dec 12 18:42:34.800775 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.800844 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff]
Dec 12 18:42:34.800911 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Dec 12 18:42:34.800978 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff]
Dec 12 18:42:34.801053 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref]
Dec 12 18:42:34.801138 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.801206 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff]
Dec 12 18:42:34.801274 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Dec 12 18:42:34.801340 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff]
Dec 12 18:42:34.801415 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref]
Dec 12 18:42:34.801491 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.801560 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff]
Dec 12 18:42:34.801626 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Dec 12 18:42:34.801692 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff]
Dec 12 18:42:34.801760 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref]
Dec 12 18:42:34.801833 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.801901 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff]
Dec 12 18:42:34.801968 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Dec 12 18:42:34.802035 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff]
Dec 12 18:42:34.802103 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref]
Dec 12 18:42:34.802179 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.802248 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff]
Dec 12 18:42:34.802315 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Dec 12 18:42:34.802446 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff]
Dec 12 18:42:34.802516 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref]
Dec 12 18:42:34.802590 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.802661 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff]
Dec 12 18:42:34.802731 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Dec 12 18:42:34.802798 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff]
Dec 12 18:42:34.802866 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref]
Dec 12 18:42:34.802943 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.803027 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff]
Dec 12 18:42:34.803098 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Dec 12 18:42:34.803165 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff]
Dec 12 18:42:34.803234 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref]
Dec 12 18:42:34.803308 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.803377 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff]
Dec 12 18:42:34.803455 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Dec 12 18:42:34.803522 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff]
Dec 12 18:42:34.803590 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref]
Dec 12 18:42:34.803668 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.803740 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff]
Dec 12 18:42:34.803811 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Dec 12 18:42:34.803879 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff]
Dec 12 18:42:34.803946 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref]
Dec 12 18:42:34.804020 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.804091 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff]
Dec 12 18:42:34.804160 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Dec 12 18:42:34.804228 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff]
Dec 12 18:42:34.804296 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref]
Dec 12 18:42:34.804370 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.804458 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff]
Dec 12 18:42:34.804525 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Dec 12 18:42:34.804595 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff]
Dec 12 18:42:34.804662 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref]
Dec 12 18:42:34.804736 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.804806 kernel: pci 0000:00:03.6: BAR 0 [mem 0x8438f000-0x8438ffff]
Dec 12 18:42:34.804872 kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Dec 12 18:42:34.804939 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff]
Dec 12 18:42:34.805006 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref]
Dec 12 18:42:34.805080 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.805149 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff]
Dec 12 18:42:34.805216 kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Dec 12 18:42:34.805282 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff]
Dec 12 18:42:34.805347 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref]
Dec 12 18:42:34.805426 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.805494 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff]
Dec 12 18:42:34.805565 kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Dec 12 18:42:34.805634 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff]
Dec 12 18:42:34.805700 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref]
Dec 12 18:42:34.805772 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.805840 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff]
Dec 12 18:42:34.805909 kernel: pci 0000:00:04.1: PCI bridge to [bus 13]
Dec 12 18:42:34.805977 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff]
Dec 12 18:42:34.806047 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref]
Dec 12 18:42:34.806123 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.806192 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff]
Dec 12 18:42:34.806260 kernel: pci 0000:00:04.2: PCI bridge to [bus 14]
Dec 12 18:42:34.806329 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff]
Dec 12 18:42:34.806405 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref]
Dec 12 18:42:34.806481 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.806554 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff]
Dec 12 18:42:34.806622 kernel: pci 0000:00:04.3: PCI bridge to [bus 15]
Dec 12 18:42:34.806689 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff]
Dec 12 18:42:34.806756 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref]
Dec 12 18:42:34.806829 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.806898 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff]
Dec 12 18:42:34.806967 kernel: pci 0000:00:04.4: PCI bridge to [bus 16]
Dec 12 18:42:34.807035 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff]
Dec 12 18:42:34.807106 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref]
Dec 12 18:42:34.807186 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.807256 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff]
Dec 12 18:42:34.807328 kernel: pci 0000:00:04.5: PCI bridge to [bus 17]
Dec 12 18:42:34.807403 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff]
Dec 12 18:42:34.807479 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref]
Dec 12 18:42:34.807555 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.807628 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff]
Dec 12 18:42:34.807701 kernel: pci 0000:00:04.6: PCI bridge to [bus 18]
Dec 12 18:42:34.807775 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff]
Dec 12 18:42:34.807843 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref]
Dec 12 18:42:34.807921 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.807990 kernel: pci 0000:00:04.7: BAR 0 [mem 0x84386000-0x84386fff]
Dec 12 18:42:34.808058 kernel: pci 0000:00:04.7: PCI bridge to [bus 19]
Dec 12 18:42:34.808126 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff]
Dec 12 18:42:34.808194 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref]
Dec 12 18:42:34.808275 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.808345 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff]
Dec 12 18:42:34.808452 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a]
Dec 12 18:42:34.808522 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff]
Dec 12 18:42:34.808595 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref]
Dec 12 18:42:34.808671 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.808741 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff]
Dec 12 18:42:34.808810 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b]
Dec 12 18:42:34.808879 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff]
Dec 12 18:42:34.808950 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref]
Dec 12 18:42:34.809029 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.809100 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff]
Dec 12 18:42:34.809169 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c]
Dec 12 18:42:34.809237 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff]
Dec 12 18:42:34.809304 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref]
Dec 12 18:42:34.809390 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.809468 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff]
Dec 12 18:42:34.809537 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d]
Dec 12 18:42:34.809605 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff]
Dec 12 18:42:34.809672 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref]
Dec 12 18:42:34.809747 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 18:42:34.809815 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff]
Dec 12 18:42:34.809884 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e]
Dec 12 18:42:34.809954 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff]
Dec 12 18:42:34.810023 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref]
Dec 12 18:42:34.810096 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Dec 12 18:42:34.810166 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Dec 12 18:42:34.810240 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Dec 12 18:42:34.810309 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f]
Dec 12 18:42:34.810376 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff]
Dec 12 18:42:34.810460 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Dec 12 18:42:34.810530 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f]
Dec 12 18:42:34.810608 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge
Dec 12 18:42:34.810679 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit]
Dec 12 18:42:34.810755 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 12 18:42:34.810825 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff]
Dec 12 18:42:34.810896 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff]
Dec 12 18:42:34.810969 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Dec 12 18:42:34.811039 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 12 18:42:34.811120 kernel: pci_bus 0000:02: extended config space not accessible
Dec 12 18:42:34.811131 kernel: acpiphp: Slot [1] registered
Dec 12 18:42:34.811139 kernel: acpiphp: Slot [0] registered
Dec 12 18:42:34.811149 kernel: acpiphp: Slot [2] registered
Dec 12 18:42:34.811157 kernel: acpiphp: Slot [3] registered
Dec 12 18:42:34.811164 kernel: acpiphp: Slot [4] registered
Dec 12 18:42:34.811173 kernel: acpiphp: Slot [5] registered
Dec 12 18:42:34.811181 kernel: acpiphp: Slot [6] registered
Dec 12 18:42:34.811188 kernel: acpiphp: Slot [7] registered
Dec 12 18:42:34.811196 kernel: acpiphp: Slot [8] registered
Dec 12 18:42:34.811203 kernel: acpiphp: Slot [9] registered
Dec 12 18:42:34.811211 kernel: acpiphp: Slot [10] registered
Dec 12 18:42:34.811218 kernel: acpiphp: Slot [11] registered
Dec 12 18:42:34.811226 kernel: acpiphp: Slot [12] registered
Dec 12 18:42:34.811234 kernel: acpiphp: Slot [13] registered
Dec 12 18:42:34.811241 kernel: acpiphp: Slot [14] registered
Dec 12 18:42:34.811258 kernel: acpiphp: Slot [15] registered
Dec 12 18:42:34.811266 kernel: acpiphp: Slot [16] registered
Dec 12 18:42:34.811273 kernel: acpiphp: Slot [17] registered
Dec 12 18:42:34.811281 kernel: acpiphp: Slot [18] registered
Dec 12 18:42:34.811288 kernel: acpiphp: Slot [19] registered
Dec 12 18:42:34.811296 kernel: acpiphp: Slot [20] registered
Dec 12 18:42:34.811303 kernel: acpiphp: Slot [21] registered
Dec 12 18:42:34.811311 kernel: acpiphp: Slot [22] registered
Dec 12 18:42:34.811318 kernel: acpiphp: Slot [23] registered
Dec 12 18:42:34.811328 kernel: acpiphp: Slot [24] registered
Dec 12 18:42:34.811335 kernel: acpiphp: Slot [25] registered
Dec 12 18:42:34.811343 kernel: acpiphp: Slot [26] registered
Dec 12 18:42:34.811351 kernel: acpiphp: Slot [27] registered
Dec 12 18:42:34.811358 kernel: acpiphp: Slot [28] registered
Dec 12 18:42:34.811366 kernel: acpiphp: Slot [29] registered
Dec 12 18:42:34.811374 kernel: acpiphp: Slot [30] registered
Dec 12 18:42:34.811388 kernel: acpiphp: Slot [31] registered
Dec 12 18:42:34.811466 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 12 18:42:34.811546 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f]
Dec 12 18:42:34.811617 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 12 18:42:34.811634 kernel: acpiphp: Slot [0-2] registered
Dec 12 18:42:34.811718 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 12 18:42:34.811791 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff]
Dec 12 18:42:34.811863 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref]
Dec 12 18:42:34.811933 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Dec 12 18:42:34.812006 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 12 18:42:34.812019 kernel: acpiphp: Slot [0-3] registered
Dec 12 18:42:34.812094 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Dec 12 18:42:34.812166 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff]
Dec 12 18:42:34.812237 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref]
Dec 12 18:42:34.812308 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 12 18:42:34.812318 kernel: acpiphp: Slot [0-4] registered
Dec 12 18:42:34.812412 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 12 18:42:34.812489 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref]
Dec 12 18:42:34.812558 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Dec 12 18:42:34.812568 kernel: acpiphp: Slot [0-5] registered
Dec 12 18:42:34.812641 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 12 18:42:34.812710 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff]
Dec 12 18:42:34.812779 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref]
Dec 12 18:42:34.812847 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Dec 12 18:42:34.812859 kernel: acpiphp: Slot [0-6] registered
Dec 12 18:42:34.812927 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Dec 12 18:42:34.812937 kernel: acpiphp: Slot [0-7] registered
Dec 12 18:42:34.813005 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Dec 12 18:42:34.813015 kernel: acpiphp: Slot [0-8] registered
Dec 12 18:42:34.813083 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Dec 12 18:42:34.813093 kernel: acpiphp: Slot [0-9] registered
Dec 12 18:42:34.813161 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Dec 12 18:42:34.813174 kernel: acpiphp: Slot [0-10] registered
Dec 12 18:42:34.813242 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Dec 12 18:42:34.813252 kernel: acpiphp: Slot [0-11] registered
Dec 12 18:42:34.813319 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Dec 12 18:42:34.813329 kernel: acpiphp: Slot [0-12] registered
Dec 12 18:42:34.813404 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Dec 12 18:42:34.813414 kernel: acpiphp: Slot [0-13] registered
Dec 12 18:42:34.813486 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Dec 12 18:42:34.813496 kernel: acpiphp: Slot [0-14] registered
Dec 12 18:42:34.813563 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Dec 12 18:42:34.813573 kernel: acpiphp: Slot [0-15] registered
Dec 12 18:42:34.813640 kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Dec 12 18:42:34.813650 kernel: acpiphp: Slot [0-16] registered
Dec 12 18:42:34.813717 kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Dec 12 18:42:34.813728 kernel: acpiphp: Slot [0-17] registered
Dec 12 18:42:34.813798 kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Dec 12 18:42:34.813808 kernel: acpiphp: Slot [0-18] registered
Dec 12 18:42:34.813876 kernel: pci 0000:00:04.1: PCI bridge to [bus 13]
Dec 12 18:42:34.813886 kernel: acpiphp: Slot [0-19] registered
Dec 12 18:42:34.813954 kernel: pci 0000:00:04.2: PCI bridge to [bus 14]
Dec 12 18:42:34.813963 kernel: acpiphp: Slot [0-20] registered
Dec 12 18:42:34.814031 kernel: pci 0000:00:04.3: PCI bridge to [bus 15]
Dec 12 18:42:34.814040 kernel: acpiphp: Slot [0-21] registered
Dec 12 18:42:34.814110 kernel: pci 0000:00:04.4: PCI bridge to [bus 16]
Dec 12 18:42:34.814120 kernel: acpiphp: Slot [0-22] registered
Dec 12 18:42:34.814187 kernel: pci 0000:00:04.5: PCI bridge to [bus 17]
Dec 12 18:42:34.814197 kernel: acpiphp: Slot [0-23] registered
Dec 12 18:42:34.814263 kernel: pci 0000:00:04.6: PCI bridge to [bus 18]
Dec 12 18:42:34.814273 kernel: acpiphp: Slot [0-24] registered
Dec 12 18:42:34.814339 kernel: pci 0000:00:04.7: PCI bridge to [bus 19]
Dec 12 18:42:34.814349 kernel: acpiphp: Slot [0-25] registered
Dec 12 18:42:34.814425 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a]
Dec 12 18:42:34.814435 kernel: acpiphp: Slot [0-26] registered
Dec 12 18:42:34.814503 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b]
Dec 12 18:42:34.814513 kernel: acpiphp: Slot [0-27] registered
Dec 12 18:42:34.814579 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c]
Dec 12 18:42:34.814589 kernel: acpiphp: Slot [0-28] registered
Dec 12 18:42:34.814656 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d]
Dec 12 18:42:34.814665 kernel: acpiphp: Slot [0-29] registered
Dec 12 18:42:34.814735 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e]
Dec 12 18:42:34.814745 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 12 18:42:34.814753 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 12 18:42:34.814760 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 12 18:42:34.814768 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 12 18:42:34.814776 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Dec 12 18:42:34.814783 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Dec 12 18:42:34.814791 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Dec 12 18:42:34.814801 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Dec 12 18:42:34.814809 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Dec 12 18:42:34.814817 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Dec 12 18:42:34.814824 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Dec 12 18:42:34.814832 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Dec 12 18:42:34.814840 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Dec 12 18:42:34.814847 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Dec 12 18:42:34.814855 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Dec 12 18:42:34.814863 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Dec 12 18:42:34.814872 kernel: iommu: Default domain type: Translated
Dec 12 18:42:34.814880 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 12 18:42:34.814888 kernel: efivars: Registered efivars operations
Dec 12 18:42:34.814895 kernel: PCI: Using ACPI for IRQ routing
Dec 12 18:42:34.814903 kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 12 18:42:34.814911 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff]
Dec 12 18:42:34.814919 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff]
Dec 12 18:42:34.814926 kernel: e820: reserve RAM buffer [mem 0x7dd26018-0x7fffffff]
Dec 12 18:42:34.814934 kernel: e820: reserve RAM buffer [mem 0x7dd4e018-0x7fffffff]
Dec 12 18:42:34.814941 kernel: e820: reserve RAM buffer [mem 0x7e73f000-0x7fffffff]
Dec 12 18:42:34.814951 kernel: e820: reserve RAM buffer [mem 0x7ea71000-0x7fffffff]
Dec 12 18:42:34.814958 kernel: e820: reserve RAM buffer [mem 0x7f6ed000-0x7fffffff]
Dec 12 18:42:34.814966 kernel: e820: reserve RAM buffer [mem 0x7fe4f000-0x7fffffff]
Dec 12 18:42:34.814974 kernel: e820: reserve RAM buffer [mem 0x7febc000-0x7fffffff]
Dec 12 18:42:34.815044 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Dec 12 18:42:34.815112 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Dec 12 18:42:34.815181 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 12 18:42:34.815191 kernel: vgaarb: loaded
Dec 12 18:42:34.815201 kernel: clocksource: Switched to clocksource kvm-clock
Dec 12 18:42:34.815208 kernel: VFS: Disk quotas dquot_6.6.0
Dec 12 18:42:34.815216 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 12 18:42:34.815224 kernel: pnp: PnP ACPI init
Dec 12 18:42:34.815299 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved
Dec 12 18:42:34.815310 kernel: pnp: PnP ACPI: found 5 devices
Dec 12 18:42:34.815317 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 12 18:42:34.815326 kernel: NET: Registered PF_INET protocol family
Dec 12 18:42:34.815336 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 12 18:42:34.815344 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Dec 12 18:42:34.815351 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 12 18:42:34.815359 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 12 18:42:34.815367 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
Dec 12 18:42:34.815375 kernel: TCP: Hash tables configured (established 131072 bind 65536)
Dec 12 18:42:34.815389 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 12 18:42:34.815397 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 12 18:42:34.815405 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 12 18:42:34.815414 kernel: NET: Registered PF_XDP protocol family
Dec 12 18:42:34.815486 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window
Dec 12 18:42:34.815556 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 12 18:42:34.815627 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 12 18:42:34.815697 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 12 18:42:34.815766 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 12 18:42:34.815835 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 12 18:42:34.815904 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 12 18:42:34.815974 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 12 18:42:34.816042 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Dec 12 18:42:34.816112 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Dec 12 18:42:34.816181 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Dec 12 18:42:34.816249 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Dec 12 18:42:34.816317 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Dec 12 18:42:34.816431 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Dec 12 18:42:34.816501 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Dec 12 18:42:34.816574 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Dec 12 18:42:34.816648 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Dec 12 18:42:34.816725 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000
Dec 12 18:42:34.816794 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000
Dec 12 18:42:34.816863 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000
Dec 12 18:42:34.816931 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Dec 12 18:42:34.816999 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Dec 12 18:42:34.817071 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Dec 12 18:42:34.817138 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Dec 12 18:42:34.817207 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Dec 12 18:42:34.817276 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000
Dec 12 18:42:34.817346 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000
Dec 12 18:42:34.817422 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Dec 12 18:42:34.817491 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Dec 12 18:42:34.817558 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned
Dec 12 18:42:34.817631 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned
Dec 12 18:42:34.817711 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned
Dec 12 18:42:34.817806 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned
Dec 12 18:42:34.817906 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned
Dec 12 18:42:34.817977 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned
Dec 12 18:42:34.818046 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned
Dec 12 18:42:34.818114 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned
Dec 12 18:42:34.818182 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned
Dec 12 18:42:34.818253 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned
Dec 12 18:42:34.818321 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned
Dec 12 18:42:34.818406 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned
Dec 12 18:42:34.818476 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned
Dec 12 18:42:34.818543 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:42:34.818611 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign
Dec 12 18:42:34.818679 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:42:34.818748 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign
Dec 12 18:42:34.818820 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:42:34.818888 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign
Dec 12 18:42:34.818957 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:42:34.819024 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign
Dec 12 18:42:34.819093 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:42:34.819169 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign
Dec 12 18:42:34.819251 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:42:34.819321 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign
Dec 12 18:42:34.819402 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:42:34.819472 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign
Dec 12 18:42:34.819539 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:42:34.819608 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign
18:42:34.819608 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.819680 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.819750 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.819818 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.819891 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.819963 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.820036 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.820105 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.820173 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.820241 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.820309 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.820392 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.820466 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.820534 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.820602 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.820670 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned Dec 12 18:42:34.820737 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned Dec 12 18:42:34.820805 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned Dec 12 18:42:34.820872 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned Dec 12 18:42:34.820940 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned Dec 12 18:42:34.821010 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned Dec 12 18:42:34.821077 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff]: assigned Dec 12 18:42:34.821145 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned Dec 12 18:42:34.821213 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned Dec 12 18:42:34.821281 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned Dec 12 18:42:34.821349 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned Dec 12 18:42:34.821424 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned Dec 12 18:42:34.821492 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned Dec 12 18:42:34.821563 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.821633 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.821700 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.821768 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.821836 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.821904 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.821972 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.822040 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.822109 kernel: pci 
0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.822191 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.822289 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.822390 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.822485 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.822581 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.822678 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.822774 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.822868 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.822939 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.823051 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.823121 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.823191 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.823260 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.823330 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.823405 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.823476 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.823546 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.823615 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.823684 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.823752 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:42:34.823820 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Dec 12 18:42:34.823895 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Dec 12 18:42:34.823965 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Dec 12 18:42:34.824036 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Dec 12 18:42:34.824112 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 12 18:42:34.824183 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Dec 12 18:42:34.824251 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Dec 12 18:42:34.824319 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Dec 12 18:42:34.824402 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 12 18:42:34.824477 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned Dec 12 18:42:34.824546 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Dec 12 18:42:34.824613 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Dec 12 18:42:34.824682 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Dec 12 18:42:34.824754 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Dec 12 18:42:34.824824 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Dec 12 18:42:34.824893 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Dec 12 
18:42:34.824960 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Dec 12 18:42:34.825027 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Dec 12 18:42:34.825092 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Dec 12 18:42:34.825157 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Dec 12 18:42:34.825221 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Dec 12 18:42:34.825293 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Dec 12 18:42:34.825365 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Dec 12 18:42:34.825445 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Dec 12 18:42:34.825514 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Dec 12 18:42:34.825588 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Dec 12 18:42:34.825656 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Dec 12 18:42:34.825725 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Dec 12 18:42:34.825793 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Dec 12 18:42:34.825863 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Dec 12 18:42:34.825931 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Dec 12 18:42:34.825999 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Dec 12 18:42:34.826066 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Dec 12 18:42:34.826136 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Dec 12 18:42:34.826205 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Dec 12 18:42:34.826273 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Dec 12 18:42:34.826338 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Dec 12 18:42:34.826417 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Dec 12 18:42:34.826486 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Dec 12 18:42:34.826554 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Dec 12 18:42:34.826622 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Dec 12 18:42:34.826691 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Dec 12 18:42:34.826759 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Dec 12 18:42:34.826828 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Dec 12 18:42:34.826895 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Dec 12 18:42:34.826964 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Dec 12 18:42:34.827037 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Dec 12 18:42:34.827105 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Dec 12 18:42:34.827172 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Dec 12 18:42:34.827251 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Dec 12 18:42:34.827342 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Dec 12 18:42:34.827425 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Dec 12 18:42:34.827514 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Dec 12 18:42:34.827607 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Dec 12 18:42:34.827686 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Dec 12 
18:42:34.827757 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Dec 12 18:42:34.827826 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Dec 12 18:42:34.827899 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Dec 12 18:42:34.827967 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Dec 12 18:42:34.828035 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Dec 12 18:42:34.828103 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Dec 12 18:42:34.828171 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Dec 12 18:42:34.828239 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Dec 12 18:42:34.828308 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Dec 12 18:42:34.828396 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Dec 12 18:42:34.828467 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Dec 12 18:42:34.828533 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Dec 12 18:42:34.828600 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Dec 12 18:42:34.828666 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Dec 12 18:42:34.828733 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Dec 12 18:42:34.828803 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Dec 12 18:42:34.828875 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Dec 12 18:42:34.828942 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Dec 12 18:42:34.829011 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Dec 12 18:42:34.829079 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Dec 12 18:42:34.829148 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Dec 12 18:42:34.829214 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff] Dec 12 18:42:34.829282 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Dec 12 18:42:34.829349 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Dec 12 18:42:34.829428 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Dec 12 18:42:34.829496 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Dec 12 18:42:34.829565 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Dec 12 18:42:34.829633 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Dec 12 18:42:34.829702 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Dec 12 18:42:34.829770 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Dec 12 18:42:34.829838 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Dec 12 18:42:34.829906 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Dec 12 18:42:34.829979 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Dec 12 18:42:34.830049 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Dec 12 18:42:34.830116 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Dec 12 18:42:34.830184 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Dec 12 18:42:34.830255 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Dec 12 18:42:34.830324 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Dec 12 18:42:34.830407 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Dec 12 18:42:34.830479 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Dec 12 
18:42:34.830550 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Dec 12 18:42:34.830617 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Dec 12 18:42:34.830686 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Dec 12 18:42:34.830756 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Dec 12 18:42:34.830831 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Dec 12 18:42:34.830898 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Dec 12 18:42:34.830969 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Dec 12 18:42:34.831037 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Dec 12 18:42:34.831106 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Dec 12 18:42:34.831175 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Dec 12 18:42:34.831243 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Dec 12 18:42:34.831311 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Dec 12 18:42:34.831386 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Dec 12 18:42:34.831454 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Dec 12 18:42:34.831516 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Dec 12 18:42:34.831576 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Dec 12 18:42:34.831636 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Dec 12 18:42:34.831697 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Dec 12 18:42:34.831768 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Dec 12 18:42:34.831833 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Dec 12 18:42:34.831898 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 12 18:42:34.831969 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Dec 12 18:42:34.832036 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Dec 12 18:42:34.832102 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 12 18:42:34.832171 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Dec 12 18:42:34.832236 kernel: pci_bus 0000:03: resource 2 [mem 0x380800000000-0x380fffffffff 64bit pref] Dec 12 18:42:34.832309 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Dec 12 18:42:34.832390 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Dec 12 18:42:34.832460 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Dec 12 18:42:34.832526 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Dec 12 18:42:34.832595 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Dec 12 18:42:34.832659 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Dec 12 18:42:34.832729 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Dec 12 18:42:34.832796 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Dec 12 18:42:34.832865 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Dec 12 18:42:34.832929 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Dec 12 18:42:34.832998 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Dec 12 18:42:34.833062 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Dec 12 18:42:34.833131 kernel: pci_bus 0000:0a: 
resource 1 [mem 0x83000000-0x831fffff] Dec 12 18:42:34.833194 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Dec 12 18:42:34.833272 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Dec 12 18:42:34.833337 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit pref] Dec 12 18:42:34.833415 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Dec 12 18:42:34.833480 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Dec 12 18:42:34.833554 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Dec 12 18:42:34.833622 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Dec 12 18:42:34.833693 kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Dec 12 18:42:34.833758 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Dec 12 18:42:34.833827 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Dec 12 18:42:34.833891 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Dec 12 18:42:34.833959 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Dec 12 18:42:34.834026 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Dec 12 18:42:34.834096 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Dec 12 18:42:34.834164 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Dec 12 18:42:34.834233 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Dec 12 18:42:34.834298 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Dec 12 18:42:34.834361 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Dec 12 18:42:34.834439 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Dec 12 18:42:34.834505 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Dec 12 18:42:34.834568 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Dec 12 18:42:34.834636 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Dec 12 18:42:34.834700 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Dec 12 18:42:34.834763 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Dec 12 18:42:34.834830 kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Dec 12 18:42:34.834897 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Dec 12 18:42:34.834961 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Dec 12 18:42:34.835033 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Dec 12 18:42:34.835097 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] Dec 12 18:42:34.835160 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Dec 12 18:42:34.835227 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Dec 12 18:42:34.835291 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Dec 12 18:42:34.835358 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Dec 12 18:42:34.835442 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Dec 12 18:42:34.835508 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Dec 12 18:42:34.835572 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Dec 12 18:42:34.835640 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Dec 12 18:42:34.835705 kernel: pci_bus 0000:19: resource 1 [mem 
0x81200000-0x813fffff] Dec 12 18:42:34.835769 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Dec 12 18:42:34.835840 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Dec 12 18:42:34.835904 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Dec 12 18:42:34.835967 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Dec 12 18:42:34.836035 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Dec 12 18:42:34.836098 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Dec 12 18:42:34.836161 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Dec 12 18:42:34.836231 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Dec 12 18:42:34.836295 kernel: pci_bus 0000:1c: resource 1 [mem 0x80c00000-0x80dfffff] Dec 12 18:42:34.836358 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Dec 12 18:42:34.836445 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Dec 12 18:42:34.836510 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Dec 12 18:42:34.836575 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit pref] Dec 12 18:42:34.836649 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Dec 12 18:42:34.836716 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Dec 12 18:42:34.836780 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Dec 12 18:42:34.836790 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Dec 12 18:42:34.836799 kernel: PCI: CLS 0 bytes, default 64 Dec 12 18:42:34.836807 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Dec 12 18:42:34.836815 kernel: software IO TLB: mapped [mem 0x0000000077e7e000-0x000000007be7e000] (64MB) Dec 12 18:42:34.836822 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Dec 12 18:42:34.836830 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2113334dc36, max_idle_ns: 440795272915 ns Dec 12 18:42:34.836840 kernel: Initialise system trusted keyrings Dec 12 18:42:34.836849 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Dec 12 18:42:34.836857 kernel: Key type asymmetric registered Dec 12 18:42:34.836864 kernel: Asymmetric key parser 'x509' registered Dec 12 18:42:34.836872 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Dec 12 18:42:34.836880 kernel: io scheduler mq-deadline registered Dec 12 18:42:34.836888 kernel: io scheduler kyber registered Dec 12 18:42:34.836895 kernel: io scheduler bfq registered Dec 12 18:42:34.836968 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Dec 12 18:42:34.837041 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Dec 12 18:42:34.837112 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Dec 12 18:42:34.837181 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Dec 12 18:42:34.837252 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Dec 12 18:42:34.837321 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Dec 12 18:42:34.837418 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Dec 12 18:42:34.837504 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Dec 12 18:42:34.837576 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Dec 12 18:42:34.837645 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Dec 12 18:42:34.837715 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Dec 12 18:42:34.837783 kernel: pcieport 
0000:00:02.5: AER: enabled with IRQ 29 Dec 12 18:42:34.837853 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Dec 12 18:42:34.837925 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Dec 12 18:42:34.837995 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Dec 12 18:42:34.838063 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Dec 12 18:42:34.838073 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Dec 12 18:42:34.838141 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Dec 12 18:42:34.838210 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Dec 12 18:42:34.838281 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Dec 12 18:42:34.838351 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Dec 12 18:42:34.838433 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Dec 12 18:42:34.838502 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Dec 12 18:42:34.838572 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Dec 12 18:42:34.838643 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Dec 12 18:42:34.838713 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Dec 12 18:42:34.838781 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Dec 12 18:42:34.838851 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Dec 12 18:42:34.838920 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Dec 12 18:42:34.838990 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Dec 12 18:42:34.839062 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Dec 12 18:42:34.839131 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Dec 12 18:42:34.839199 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39 Dec 12 18:42:34.839209 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Dec 12 18:42:34.839275 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Dec 12 18:42:34.839344 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Dec 12 18:42:34.839420 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Dec 12 18:42:34.839489 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Dec 12 18:42:34.839562 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Dec 12 18:42:34.839630 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Dec 12 18:42:34.839698 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Dec 12 18:42:34.839766 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Dec 12 18:42:34.839836 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Dec 12 18:42:34.839905 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Dec 12 18:42:34.839973 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Dec 12 18:42:34.840042 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Dec 12 18:42:34.840110 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Dec 12 18:42:34.840181 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Dec 12 18:42:34.840251 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Dec 12 18:42:34.840319 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Dec 12 18:42:34.840329 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Dec 12 18:42:34.840413 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Dec 12 18:42:34.840482 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 48 Dec 12 18:42:34.840551 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Dec 12 18:42:34.840619 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Dec 12 18:42:34.840691 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Dec 12 18:42:34.840760 kernel: 
pcieport 0000:00:05.2: AER: enabled with IRQ 50 Dec 12 18:42:34.840828 kernel: pcieport 0000:00:05.3: PME: Signaling with IRQ 51 Dec 12 18:42:34.840897 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Dec 12 18:42:34.840966 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Dec 12 18:42:34.841036 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Dec 12 18:42:34.841046 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Dec 12 18:42:34.841054 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 12 18:42:34.841065 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Dec 12 18:42:34.841073 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Dec 12 18:42:34.841080 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Dec 12 18:42:34.841088 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Dec 12 18:42:34.841164 kernel: rtc_cmos 00:03: RTC can wake from S4 Dec 12 18:42:34.841175 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Dec 12 18:42:34.841238 kernel: rtc_cmos 00:03: registered as rtc0 Dec 12 18:42:34.841302 kernel: rtc_cmos 00:03: setting system clock to 2025-12-12T18:42:34 UTC (1765564954) Dec 12 18:42:34.841367 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Dec 12 18:42:34.841393 kernel: intel_pstate: CPU model not supported Dec 12 18:42:34.841401 kernel: efifb: probing for efifb Dec 12 18:42:34.841409 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Dec 12 18:42:34.841416 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Dec 12 18:42:34.841424 kernel: efifb: scrolling: redraw Dec 12 18:42:34.841432 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Dec 12 18:42:34.841440 kernel: Console: switching to colour frame buffer device 160x50 Dec 12 18:42:34.841447 kernel: fb0: EFI VGA frame buffer device Dec 12 18:42:34.841457 kernel: pstore: Using crash dump compression: deflate Dec 12 18:42:34.841465 kernel: pstore: Registered efi_pstore as persistent store backend Dec 12 18:42:34.841473 kernel: NET: Registered PF_INET6 protocol family Dec 12 18:42:34.841481 kernel: Segment Routing with IPv6 Dec 12 18:42:34.841488 kernel: In-situ OAM (IOAM) with IPv6 Dec 12 18:42:34.841496 kernel: NET: Registered PF_PACKET protocol family Dec 12 18:42:34.841504 kernel: Key type dns_resolver registered Dec 12 18:42:34.841512 kernel: IPI shorthand broadcast: enabled Dec 12 18:42:34.841519 kernel: sched_clock: Marking stable (3977002810, 164256464)->(4374206320, -232947046) Dec 12 18:42:34.841529 kernel: registered taskstats version 1 Dec 12 18:42:34.841537 kernel: Loading compiled-in X.509 certificates Dec 12 18:42:34.841545 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 0d0c78e6590cb40d27f1cef749ef9f2f3425f38d' Dec 12 18:42:34.841552 kernel: Demotion targets for Node 0: null Dec 12 18:42:34.841560 kernel: Key type .fscrypt registered Dec 12 18:42:34.841568 kernel: Key type fscrypt-provisioning registered Dec 12 18:42:34.841575 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 12 18:42:34.841583 kernel: ima: Allocated hash algorithm: sha1 Dec 12 18:42:34.841591 kernel: ima: No architecture policies found Dec 12 18:42:34.841600 kernel: clk: Disabling unused clocks Dec 12 18:42:34.841608 kernel: Warning: unable to open an initial console. 
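The rtc_cmos entry above reports the same instant two ways, as an ISO-8601 timestamp and as a Unix epoch value (1765564954). A quick Python sanity check, editorial and not part of the boot itself, confirms the two agree:

```python
# Sanity check for the rtc_cmos line above: the kernel prints both an
# ISO-8601 timestamp and its Unix epoch value; they should agree.
from datetime import datetime, timezone

epoch = 1765564954  # value logged by rtc_cmos 00:03
stamp = datetime.fromtimestamp(epoch, tz=timezone.utc)
assert stamp == datetime(2025, 12, 12, 18, 42, 34, tzinfo=timezone.utc)
print(stamp.isoformat())  # 2025-12-12T18:42:34+00:00
```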
Dec 12 18:42:34.841616 kernel: Freeing unused kernel image (initmem) memory: 46188K Dec 12 18:42:34.841624 kernel: Write protecting the kernel read-only data: 40960k Dec 12 18:42:34.841632 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Dec 12 18:42:34.841639 kernel: Run /init as init process Dec 12 18:42:34.841647 kernel: with arguments: Dec 12 18:42:34.841655 kernel: /init Dec 12 18:42:34.841662 kernel: with environment: Dec 12 18:42:34.841670 kernel: HOME=/ Dec 12 18:42:34.841679 kernel: TERM=linux Dec 12 18:42:34.841688 systemd[1]: Successfully made /usr/ read-only. Dec 12 18:42:34.841700 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 12 18:42:34.841709 systemd[1]: Detected virtualization kvm. Dec 12 18:42:34.841716 systemd[1]: Detected architecture x86-64. Dec 12 18:42:34.841725 systemd[1]: Running in initrd. Dec 12 18:42:34.841733 systemd[1]: No hostname configured, using default hostname. Dec 12 18:42:34.841743 systemd[1]: Hostname set to <localhost>. Dec 12 18:42:34.841751 systemd[1]: Initializing machine ID from VM UUID. Dec 12 18:42:34.841770 systemd[1]: Queued start job for default target initrd.target. Dec 12 18:42:34.841780 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 18:42:34.841788 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 18:42:34.841797 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 12 18:42:34.841806 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 12 18:42:34.841814 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 12 18:42:34.841823 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 12 18:42:34.841834 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Dec 12 18:42:34.841842 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Dec 12 18:42:34.841851 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 18:42:34.841859 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 12 18:42:34.841867 systemd[1]: Reached target paths.target - Path Units. Dec 12 18:42:34.841875 systemd[1]: Reached target slices.target - Slice Units. Dec 12 18:42:34.841883 systemd[1]: Reached target swap.target - Swaps. Dec 12 18:42:34.841892 systemd[1]: Reached target timers.target - Timer Units. Dec 12 18:42:34.841902 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Dec 12 18:42:34.841910 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 12 18:42:34.841919 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 12 18:42:34.841927 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Dec 12 18:42:34.841935 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
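The "\x2d" sequences in the device unit names above come from systemd's path-to-unit-name escaping (the same transform systemd-escape --path performs): path separators become dashes, and characters outside the safe set are hex-escaped. A simplified sketch of the rule, covering only the common case seen in this log:

```python
# Minimal sketch of systemd's path-to-unit-name escaping, which explains
# the "\x2d" sequences in the device unit names above. Simplified: the
# canonical rules live in systemd itself (see systemd-escape --path).
def escape_path(path: str) -> str:
    parts = [p for p in path.strip("/").split("/") if p]
    out = []
    for part in parts:
        escaped = "".join(
            c if c.isalnum() or c in ":_." else "\\x%02x" % ord(c)
            for c in part
        )
        out.append(escaped)
    return "-".join(out)

# /dev/disk/by-label/EFI-SYSTEM -> dev-disk-by\x2dlabel-EFI\x2dSYSTEM
print(escape_path("/dev/disk/by-label/EFI-SYSTEM"))
```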
Dec 12 18:42:34.841944 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 12 18:42:34.841952 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 18:42:34.841960 systemd[1]: Reached target sockets.target - Socket Units. Dec 12 18:42:34.841968 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 12 18:42:34.841978 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 12 18:42:34.841986 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 12 18:42:34.841995 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Dec 12 18:42:34.842003 systemd[1]: Starting systemd-fsck-usr.service... Dec 12 18:42:34.842011 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 12 18:42:34.842020 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 12 18:42:34.842028 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 18:42:34.842036 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Dec 12 18:42:34.842047 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 18:42:34.842055 systemd[1]: Finished systemd-fsck-usr.service. Dec 12 18:42:34.842064 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 12 18:42:34.842094 systemd-journald[275]: Collecting audit messages is disabled. Dec 12 18:42:34.842118 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 18:42:34.842129 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 12 18:42:34.842138 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 12 18:42:34.842146 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 12 18:42:34.842159 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 18:42:34.842168 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 12 18:42:34.842176 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 12 18:42:34.842184 kernel: Bridge firewalling registered Dec 12 18:42:34.842192 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 12 18:42:34.842201 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Dec 12 18:42:34.842209 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 12 18:42:34.842221 systemd-journald[275]: Journal started Dec 12 18:42:34.842240 systemd-journald[275]: Runtime Journal (/run/log/journal/ee2fc1ccfaf44d7a807f696de9cb9d95) is 8M, max 319.5M, 311.5M free. Dec 12 18:42:34.797991 systemd-modules-load[279]: Inserted module 'overlay' Dec 12 18:42:34.851886 systemd[1]: Started systemd-journald.service - Journal Service. Dec 12 18:42:34.827663 systemd-modules-load[279]: Inserted module 'br_netfilter' Dec 12 18:42:34.853310 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 12 18:42:34.857700 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
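Every entry in this transcript follows one shape: a timestamp, a tag (kernel, systemd[1], a daemon name with PID), then the message. A small hedged parser for lines in that shape, useful when post-processing a capture like this one; the field names are this sketch's own choice:

```python
# A small parser for entries in the format used throughout this log:
# "Dec 12 18:42:34.842240 systemd-journald[275]: message ...".
# Illustrative only; nested colons inside messages are left in "msg".
import re

LINE = re.compile(
    r"(?P<month>\w{3}) (?P<day>\d+) (?P<time>[\d:.]+) "
    r"(?P<tag>[\w@.\-\[\]]+): (?P<msg>.*)"
)

sample = ("Dec 12 18:42:34.842240 systemd-journald[275]: Runtime Journal "
          "(/run/log/journal/ee2fc1ccfaf44d7a807f696de9cb9d95) is 8M")
m = LINE.match(sample)
assert m and m.group("tag") == "systemd-journald[275]"
print(m.group("time"), m.group("msg"))
```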
Dec 12 18:42:34.862147 systemd-tmpfiles[311]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Dec 12 18:42:34.862929 dracut-cmdline[304]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022 Dec 12 18:42:34.865578 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 18:42:34.867630 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 12 18:42:34.898813 systemd-resolved[340]: Positive Trust Anchors: Dec 12 18:42:34.898831 systemd-resolved[340]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 12 18:42:34.898862 systemd-resolved[340]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 12 18:42:34.901226 systemd-resolved[340]: Defaulting to hostname 'linux'. Dec 12 18:42:34.902171 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 12 18:42:34.903041 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 12 18:42:34.951440 kernel: SCSI subsystem initialized Dec 12 18:42:34.964445 kernel: Loading iSCSI transport class v2.0-870. Dec 12 18:42:34.976413 kernel: iscsi: registered transport (tcp) Dec 12 18:42:35.000774 kernel: iscsi: registered transport (qla4xxx) Dec 12 18:42:35.000848 kernel: QLogic iSCSI HBA Driver Dec 12 18:42:35.020332 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 12 18:42:35.046767 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 18:42:35.048717 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 12 18:42:35.095155 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Dec 12 18:42:35.097278 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Dec 12 18:42:35.156448 kernel: raid6: avx512x4 gen() 43303 MB/s Dec 12 18:42:35.174420 kernel: raid6: avx512x2 gen() 46086 MB/s Dec 12 18:42:35.191424 kernel: raid6: avx512x1 gen() 44472 MB/s Dec 12 18:42:35.209426 kernel: raid6: avx2x4 gen() 34104 MB/s Dec 12 18:42:35.226419 kernel: raid6: avx2x2 gen() 33152 MB/s Dec 12 18:42:35.243886 kernel: raid6: avx2x1 gen() 26658 MB/s Dec 12 18:42:35.243955 kernel: raid6: using algorithm avx512x2 gen() 46086 MB/s Dec 12 18:42:35.263490 kernel: raid6: .... 
xor() 27079 MB/s, rmw enabled Dec 12 18:42:35.263556 kernel: raid6: using avx512x2 recovery algorithm Dec 12 18:42:35.283416 kernel: xor: automatically using best checksumming function avx Dec 12 18:42:35.417433 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 12 18:42:35.424535 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 12 18:42:35.426326 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 18:42:35.455205 systemd-udevd[533]: Using default interface naming scheme 'v255'. Dec 12 18:42:35.459637 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 18:42:35.461037 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Dec 12 18:42:35.493230 dracut-pre-trigger[539]: rd.md=0: removing MD RAID activation Dec 12 18:42:35.515997 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 12 18:42:35.517616 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 12 18:42:35.617251 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 12 18:42:35.622034 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 12 18:42:35.647411 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues Dec 12 18:42:35.677574 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Dec 12 18:42:35.686410 kernel: cryptd: max_cpu_qlen set to 1000 Dec 12 18:42:35.696403 kernel: ACPI: bus type USB registered Dec 12 18:42:35.703469 kernel: libata version 3.00 loaded. Dec 12 18:42:35.707405 kernel: usbcore: registered new interface driver usbfs Dec 12 18:42:35.707470 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Dec 12 18:42:35.709537 kernel: usbcore: registered new interface driver hub Dec 12 18:42:35.709583 kernel: GPT:17805311 != 104857599 Dec 12 18:42:35.709595 kernel: GPT:Alternate GPT header not at the end of the disk. Dec 12 18:42:35.709604 kernel: GPT:17805311 != 104857599 Dec 12 18:42:35.709613 kernel: GPT: Use GNU Parted to correct GPT errors. Dec 12 18:42:35.709623 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 12 18:42:35.709633 kernel: usbcore: registered new device driver usb Dec 12 18:42:35.712623 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Dec 12 18:42:35.718399 kernel: AES CTR mode by8 optimization enabled Dec 12 18:42:35.723662 kernel: ahci 0000:00:1f.2: version 3.0 Dec 12 18:42:35.723850 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Dec 12 18:42:35.728108 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Dec 12 18:42:35.728287 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Dec 12 18:42:35.728413 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Dec 12 18:42:35.733816 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller Dec 12 18:42:35.734026 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1 Dec 12 18:42:35.734147 kernel: scsi host0: ahci Dec 12 18:42:35.735517 kernel: uhci_hcd 0000:02:01.0: detected 2 ports Dec 12 18:42:35.735689 kernel: scsi host1: ahci Dec 12 18:42:35.737455 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000 Dec 12 18:42:35.739405 kernel: scsi host2: ahci Dec 12 18:42:35.739444 kernel: hub 1-0:1.0: USB hub found Dec 12 18:42:35.739092 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
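The virtio_blk entry above states the same capacity two ways, decimal gigabytes and binary gibibytes, and the arithmetic checks out:

```python
# The virtio_blk line above reports 104857600 512-byte logical blocks,
# rendered as both decimal GB and binary GiB. Checking the math:
blocks, block_size = 104857600, 512
size = blocks * block_size          # 53_687_091_200 bytes
print(size / 10**9)                 # 53.6870912 -> "53.7 GB"
print(size / 2**30)                 # 50.0       -> "50.0 GiB"
```

The GPT warnings in the same span ("GPT:17805311 != 104857599") follow from this geometry: the primary header points at a backup header located where the end of the original, smaller disk image was, not at the last sector of the 50 GiB virtual disk it was deployed to.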
Dec 12 18:42:35.753343 kernel: hub 1-0:1.0: 2 ports detected Dec 12 18:42:35.753539 kernel: scsi host3: ahci Dec 12 18:42:35.753639 kernel: scsi host4: ahci Dec 12 18:42:35.753742 kernel: scsi host5: ahci Dec 12 18:42:35.753826 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 67 lpm-pol 1 Dec 12 18:42:35.753837 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 67 lpm-pol 1 Dec 12 18:42:35.753847 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 67 lpm-pol 1 Dec 12 18:42:35.753861 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 67 lpm-pol 1 Dec 12 18:42:35.753870 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 67 lpm-pol 1 Dec 12 18:42:35.753880 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 67 lpm-pol 1 Dec 12 18:42:35.739216 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 18:42:35.753821 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 18:42:35.754988 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 18:42:35.783605 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Dec 12 18:42:35.784881 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 18:42:35.798483 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Dec 12 18:42:35.809396 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Dec 12 18:42:35.810005 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Dec 12 18:42:35.817773 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 12 18:42:35.819247 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 12 18:42:35.852665 disk-uuid[740]: Primary Header is updated. Dec 12 18:42:35.852665 disk-uuid[740]: Secondary Entries is updated. Dec 12 18:42:35.852665 disk-uuid[740]: Secondary Header is updated. Dec 12 18:42:35.860421 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 12 18:42:35.962426 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Dec 12 18:42:36.055436 kernel: ata2: SATA link down (SStatus 0 SControl 300) Dec 12 18:42:36.055514 kernel: ata6: SATA link down (SStatus 0 SControl 300) Dec 12 18:42:36.056425 kernel: ata1: SATA link down (SStatus 0 SControl 300) Dec 12 18:42:36.057404 kernel: ata4: SATA link down (SStatus 0 SControl 300) Dec 12 18:42:36.059411 kernel: ata3: SATA link down (SStatus 0 SControl 300) Dec 12 18:42:36.060433 kernel: ata5: SATA link down (SStatus 0 SControl 300) Dec 12 18:42:36.069035 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 12 18:42:36.070291 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 12 18:42:36.070731 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 18:42:36.071421 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 12 18:42:36.072823 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 12 18:42:36.113595 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
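The disk-uuid step above repairs that mismatch: GPT keeps its primary header at LBA 1 and its backup header at the last LBA of the disk, so rewriting the headers for the real disk size moves the backup from the stale location to the true last sector. A hedged sketch of the expected offsets, using the geometry from the virtio_blk line:

```python
# Where the GPT headers belong on this disk, given 104857600 sectors of
# 512 bytes. disk-uuid relocates the backup header from LBA 17805311
# (the end of the original image) to the disk's real last sector.
total_sectors = 104857600
primary_lba = 1                     # fixed by the GPT layout
backup_lba = total_sectors - 1      # 104857599, matching the kernel log
stale_backup_lba = 17805311         # where the image's header pointed
print(primary_lba, backup_lba, stale_backup_lba)
```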
Dec 12 18:42:36.145422 kernel: hid: raw HID events driver (C) Jiri Kosina Dec 12 18:42:36.153484 kernel: usbcore: registered new interface driver usbhid Dec 12 18:42:36.153551 kernel: usbhid: USB HID core driver Dec 12 18:42:36.158873 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Dec 12 18:42:36.158923 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0 Dec 12 18:42:36.873432 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 12 18:42:36.873784 disk-uuid[741]: The operation has completed successfully. Dec 12 18:42:36.924155 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 12 18:42:36.924286 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 12 18:42:36.952638 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Dec 12 18:42:36.986460 sh[775]: Success Dec 12 18:42:37.005953 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 12 18:42:37.006017 kernel: device-mapper: uevent: version 1.0.3 Dec 12 18:42:37.006476 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 12 18:42:37.017421 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Dec 12 18:42:37.096616 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Dec 12 18:42:37.099337 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Dec 12 18:42:37.115495 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Dec 12 18:42:37.138424 kernel: BTRFS: device fsid a6ae7f96-a076-4d3c-81ed-46dd341492f8 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (788) Dec 12 18:42:37.141513 kernel: BTRFS info (device dm-0): first mount of filesystem a6ae7f96-a076-4d3c-81ed-46dd341492f8 Dec 12 18:42:37.141583 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Dec 12 18:42:37.163862 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 12 18:42:37.163942 kernel: BTRFS info (device dm-0): enabling free space tree Dec 12 18:42:37.167057 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Dec 12 18:42:37.168192 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 12 18:42:37.168875 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 12 18:42:37.169715 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 12 18:42:37.171269 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Dec 12 18:42:37.220419 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (819) Dec 12 18:42:37.223586 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 18:42:37.223628 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 12 18:42:37.232977 kernel: BTRFS info (device vda6): turning on async discard Dec 12 18:42:37.233031 kernel: BTRFS info (device vda6): enabling free space tree Dec 12 18:42:37.237400 kernel: BTRFS info (device vda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 18:42:37.238673 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
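verity-setup activates /dev/mapper/usr against the sha256 root hash passed on the kernel command line (verity.usrhash=...). Conceptually, dm-verity hashes fixed-size data blocks and folds the digests into a Merkle tree whose root must equal that hash. A toy illustration of the principle only; the real dm-verity on-disk superblock and tree layout differ, and the 4096-byte block size here is an assumption:

```python
# Toy illustration of the dm-verity idea behind /dev/mapper/usr: hash
# each data block, then hash concatenated digests upward until a single
# root digest remains. Not the real on-disk format.
import hashlib

def verity_root(data: bytes, block_size: int = 4096) -> str:
    level = [hashlib.sha256(data[i:i + block_size]).digest()
             for i in range(0, len(data), block_size)]
    while len(level) > 1:
        level = [hashlib.sha256(b"".join(level[i:i + 2])).digest()
                 for i in range(0, len(level), 2)]
    return level[0].hex()

print(verity_root(b"\0" * 16384))  # root over four zero blocks
```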
Dec 12 18:42:37.240045 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Dec 12 18:42:37.280099 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 12 18:42:37.282777 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 12 18:42:37.326119 systemd-networkd[963]: lo: Link UP Dec 12 18:42:37.326128 systemd-networkd[963]: lo: Gained carrier Dec 12 18:42:37.327095 systemd-networkd[963]: Enumeration completed Dec 12 18:42:37.327354 systemd-networkd[963]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 18:42:37.327357 systemd-networkd[963]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 18:42:37.327482 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 12 18:42:37.327802 systemd-networkd[963]: eth0: Link UP Dec 12 18:42:37.328190 systemd-networkd[963]: eth0: Gained carrier Dec 12 18:42:37.328200 systemd-networkd[963]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 18:42:37.328776 systemd[1]: Reached target network.target - Network. Dec 12 18:42:37.347449 systemd-networkd[963]: eth0: DHCPv4 address 10.0.8.97/25, gateway 10.0.8.1 acquired from 10.0.8.1 Dec 12 18:42:37.385134 ignition[898]: Ignition 2.22.0 Dec 12 18:42:37.385147 ignition[898]: Stage: fetch-offline Dec 12 18:42:37.385182 ignition[898]: no configs at "/usr/lib/ignition/base.d" Dec 12 18:42:37.386784 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 12 18:42:37.385189 ignition[898]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 18:42:37.385274 ignition[898]: parsed url from cmdline: "" Dec 12 18:42:37.385277 ignition[898]: no config URL provided Dec 12 18:42:37.385282 ignition[898]: reading system config file "/usr/lib/ignition/user.ign" Dec 12 18:42:37.385288 ignition[898]: no config at "/usr/lib/ignition/user.ign" Dec 12 18:42:37.388864 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Dec 12 18:42:37.385292 ignition[898]: failed to fetch config: resource requires networking Dec 12 18:42:37.385449 ignition[898]: Ignition finished successfully Dec 12 18:42:37.428166 ignition[980]: Ignition 2.22.0 Dec 12 18:42:37.428180 ignition[980]: Stage: fetch Dec 12 18:42:37.428317 ignition[980]: no configs at "/usr/lib/ignition/base.d" Dec 12 18:42:37.428325 ignition[980]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 18:42:37.428429 ignition[980]: parsed url from cmdline: "" Dec 12 18:42:37.428436 ignition[980]: no config URL provided Dec 12 18:42:37.428442 ignition[980]: reading system config file "/usr/lib/ignition/user.ign" Dec 12 18:42:37.428448 ignition[980]: no config at "/usr/lib/ignition/user.ign" Dec 12 18:42:37.428573 ignition[980]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Dec 12 18:42:37.428644 ignition[980]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Dec 12 18:42:37.428679 ignition[980]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Dec 12 18:42:37.741515 ignition[980]: GET result: OK Dec 12 18:42:37.741636 ignition[980]: parsing config with SHA512: c25f84499d332ffca55ed027cd5d6307f49803725582769d5c30a6c8048886a0ca05bd7af7b0307483d201c92f47d1dd0fa4c58965285c61d60381c88d40c0a2 Dec 12 18:42:37.745369 unknown[980]: fetched base config from "system" Dec 12 18:42:37.745392 unknown[980]: fetched base config from "system" Dec 12 18:42:37.745671 ignition[980]: fetch: fetch complete Dec 12 18:42:37.745404 unknown[980]: fetched user config from "openstack" Dec 12 18:42:37.745675 ignition[980]: fetch: fetch passed Dec 12 18:42:37.747743 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 12 18:42:37.745711 ignition[980]: Ignition finished successfully Dec 12 18:42:37.749697 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 12 18:42:37.786909 ignition[992]: Ignition 2.22.0 Dec 12 18:42:37.786922 ignition[992]: Stage: kargs Dec 12 18:42:37.787086 ignition[992]: no configs at "/usr/lib/ignition/base.d" Dec 12 18:42:37.787095 ignition[992]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 18:42:37.787855 ignition[992]: kargs: kargs passed Dec 12 18:42:37.789309 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Dec 12 18:42:37.787901 ignition[992]: Ignition finished successfully Dec 12 18:42:37.791412 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 12 18:42:37.829726 ignition[1003]: Ignition 2.22.0 Dec 12 18:42:37.829739 ignition[1003]: Stage: disks Dec 12 18:42:37.829887 ignition[1003]: no configs at "/usr/lib/ignition/base.d" Dec 12 18:42:37.829896 ignition[1003]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 18:42:37.830652 ignition[1003]: disks: disks passed Dec 12 18:42:37.830691 ignition[1003]: Ignition finished successfully Dec 12 18:42:37.832296 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 12 18:42:37.833082 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 12 18:42:37.833714 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 12 18:42:37.834527 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 12 18:42:37.835326 systemd[1]: Reached target sysinit.target - System Initialization. Dec 12 18:42:37.836157 systemd[1]: Reached target basic.target - Basic System. Dec 12 18:42:37.837813 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Dec 12 18:42:37.886904 systemd-fsck[1017]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Dec 12 18:42:37.889204 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 12 18:42:37.890843 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 12 18:42:38.083402 kernel: EXT4-fs (vda9): mounted filesystem e48ca59c-1206-4abd-b121-5e9b35e49852 r/w with ordered data mode. Quota mode: none. Dec 12 18:42:38.084105 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 12 18:42:38.085096 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 12 18:42:38.087560 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 12 18:42:38.089269 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 12 18:42:38.089948 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Dec 12 18:42:38.090680 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... 
Dec 12 18:42:38.091136 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 12 18:42:38.091167 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 12 18:42:38.112462 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 12 18:42:38.115003 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Dec 12 18:42:38.129412 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1025) Dec 12 18:42:38.134390 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 18:42:38.134458 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 12 18:42:38.143997 kernel: BTRFS info (device vda6): turning on async discard Dec 12 18:42:38.144099 kernel: BTRFS info (device vda6): enabling free space tree Dec 12 18:42:38.145509 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 12 18:42:38.181407 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 18:42:38.202025 initrd-setup-root[1054]: cut: /sysroot/etc/passwd: No such file or directory Dec 12 18:42:38.209840 initrd-setup-root[1061]: cut: /sysroot/etc/group: No such file or directory Dec 12 18:42:38.213414 initrd-setup-root[1068]: cut: /sysroot/etc/shadow: No such file or directory Dec 12 18:42:38.216264 initrd-setup-root[1075]: cut: /sysroot/etc/gshadow: No such file or directory Dec 12 18:42:38.328916 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 12 18:42:38.331198 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 12 18:42:38.332679 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 12 18:42:38.350661 systemd[1]: sysroot-oem.mount: Deactivated successfully. Dec 12 18:42:38.352412 kernel: BTRFS info (device vda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 18:42:38.378450 ignition[1142]: INFO : Ignition 2.22.0 Dec 12 18:42:38.378450 ignition[1142]: INFO : Stage: mount Dec 12 18:42:38.380731 ignition[1142]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 18:42:38.380731 ignition[1142]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 18:42:38.380731 ignition[1142]: INFO : mount: mount passed Dec 12 18:42:38.380731 ignition[1142]: INFO : Ignition finished successfully Dec 12 18:42:38.382275 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 12 18:42:38.386665 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 12 18:42:39.095622 systemd-networkd[963]: eth0: Gained IPv6LL Dec 12 18:42:39.235410 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 18:42:41.244423 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 18:42:45.255730 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 18:42:45.260784 coreos-metadata[1027]: Dec 12 18:42:45.260 WARN failed to locate config-drive, using the metadata service API instead Dec 12 18:42:45.272077 coreos-metadata[1027]: Dec 12 18:42:45.272 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 12 18:42:45.427774 coreos-metadata[1027]: Dec 12 18:42:45.427 INFO Fetch successful Dec 12 18:42:45.428579 coreos-metadata[1027]: Dec 12 18:42:45.427 INFO wrote hostname ci-4459-2-2-4-78a5f49b53 to /sysroot/etc/hostname Dec 12 18:42:45.429697 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. 
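After several "config drive ... not found" probes, coreos-metadata gives up on the drive ("failed to locate config-drive, using the metadata service API instead") and asks the EC2-style metadata API for the hostname, writing it into the sysroot. A minimal sketch of that fallback (write_hostname is hypothetical):

    import urllib.request

    HOSTNAME_URL = "http://169.254.169.254/latest/meta-data/hostname"

    def write_hostname(sysroot="/sysroot"):
        # No config-2 drive appeared, so fall back to the metadata API,
        # then persist the answer where the log shows it landing.
        with urllib.request.urlopen(HOSTNAME_URL, timeout=10) as resp:
            hostname = resp.read().decode().strip()
        with open(sysroot + "/etc/hostname", "w") as fh:
            fh.write(hostname + "\n")
        return hostname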
Dec 12 18:42:45.429802 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Dec 12 18:42:45.431152 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 12 18:42:45.458514 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 12 18:42:45.491413 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1167) Dec 12 18:42:45.494745 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 18:42:45.494793 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 12 18:42:45.505956 kernel: BTRFS info (device vda6): turning on async discard Dec 12 18:42:45.506031 kernel: BTRFS info (device vda6): enabling free space tree Dec 12 18:42:45.507533 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 12 18:42:45.556244 ignition[1185]: INFO : Ignition 2.22.0 Dec 12 18:42:45.556244 ignition[1185]: INFO : Stage: files Dec 12 18:42:45.557361 ignition[1185]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 18:42:45.557361 ignition[1185]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 18:42:45.557361 ignition[1185]: DEBUG : files: compiled without relabeling support, skipping Dec 12 18:42:45.559684 ignition[1185]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 12 18:42:45.559684 ignition[1185]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 12 18:42:45.564582 ignition[1185]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 12 18:42:45.564972 ignition[1185]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 12 18:42:45.565401 unknown[1185]: wrote ssh authorized keys file for user: core Dec 12 18:42:45.565846 ignition[1185]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 12 18:42:45.568430 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Dec 12 18:42:45.569111 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Dec 12 18:42:45.638142 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 12 18:42:45.927647 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Dec 12 18:42:45.927647 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 12 18:42:45.929305 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 12 18:42:45.929305 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 12 18:42:45.929305 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 12 18:42:45.929305 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 12 18:42:45.929305 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 12 18:42:45.929305 ignition[1185]: INFO : files: 
createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 12 18:42:45.929305 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 12 18:42:45.932827 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 12 18:42:45.933284 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 12 18:42:45.933284 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Dec 12 18:42:45.935470 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Dec 12 18:42:45.935470 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Dec 12 18:42:45.935470 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Dec 12 18:42:46.046338 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 12 18:42:47.015822 ignition[1185]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Dec 12 18:42:47.015822 ignition[1185]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 12 18:42:47.018254 ignition[1185]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 12 18:42:47.022286 ignition[1185]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 12 18:42:47.022286 ignition[1185]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 12 18:42:47.022286 ignition[1185]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Dec 12 18:42:47.024036 ignition[1185]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Dec 12 18:42:47.024036 ignition[1185]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 12 18:42:47.024036 ignition[1185]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 12 18:42:47.024036 ignition[1185]: INFO : files: files passed Dec 12 18:42:47.024036 ignition[1185]: INFO : Ignition finished successfully Dec 12 18:42:47.024080 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 12 18:42:47.026323 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 12 18:42:47.027801 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 12 18:42:47.044858 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 12 18:42:47.044967 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
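The files-stage operations above (the core user and its SSH keys, the helm tarball, the kubernetes.raw extension link, prepare-helm.service and its preset) are the kind of output a user-supplied Ignition config produces. A hedged, illustrative skeleton of such a config, not the node's actual user_data, expressed as JSON parsed in Python:

    import json

    # Illustrative only: the real user_data for this node is not shown in the log.
    ignition_config = json.loads("""
    {
      "ignition": {"version": "3.4.0"},
      "passwd": {"users": [{"name": "core",
                            "sshAuthorizedKeys": ["ssh-ed25519 AAAA(placeholder)"]}]},
      "storage": {
        "files": [{"path": "/opt/helm-v3.17.0-linux-amd64.tar.gz",
                   "contents": {"source": "https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz"}}],
        "links": [{"path": "/etc/extensions/kubernetes.raw",
                   "target": "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"}]
      },
      "systemd": {"units": [{"name": "prepare-helm.service", "enabled": true}]}
    }
    """)
    print(sum(len(v) for v in ignition_config["storage"].values()))  # 2 storage ops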
Dec 12 18:42:47.050376 initrd-setup-root-after-ignition[1220]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 12 18:42:47.050376 initrd-setup-root-after-ignition[1220]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 12 18:42:47.051717 initrd-setup-root-after-ignition[1224]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 12 18:42:47.052576 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 12 18:42:47.053232 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 12 18:42:47.054671 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 12 18:42:47.108952 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 12 18:42:47.109060 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Dec 12 18:42:47.110615 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 12 18:42:47.111438 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 12 18:42:47.113172 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 12 18:42:47.114097 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 12 18:42:47.143374 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 12 18:42:47.145495 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 12 18:42:47.169795 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 12 18:42:47.170712 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 18:42:47.171954 systemd[1]: Stopped target timers.target - Timer Units. Dec 12 18:42:47.172850 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 12 18:42:47.172987 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 12 18:42:47.174111 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 12 18:42:47.175031 systemd[1]: Stopped target basic.target - Basic System. Dec 12 18:42:47.175790 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 12 18:42:47.176636 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 12 18:42:47.177375 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 12 18:42:47.178121 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 12 18:42:47.178905 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 12 18:42:47.179759 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 12 18:42:47.180671 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 12 18:42:47.181557 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 12 18:42:47.182325 systemd[1]: Stopped target swap.target - Swaps. Dec 12 18:42:47.183085 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 12 18:42:47.183222 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 12 18:42:47.184314 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 12 18:42:47.185168 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 18:42:47.185949 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. 
Dec 12 18:42:47.186066 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 18:42:47.186731 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 12 18:42:47.186849 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 12 18:42:47.187954 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 12 18:42:47.188048 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 12 18:42:47.188827 systemd[1]: ignition-files.service: Deactivated successfully. Dec 12 18:42:47.188909 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 12 18:42:47.190621 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 12 18:42:47.191357 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Dec 12 18:42:47.191466 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 18:42:47.192950 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 12 18:42:47.193667 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Dec 12 18:42:47.193761 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 12 18:42:47.194466 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 12 18:42:47.194544 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 12 18:42:47.198264 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 12 18:42:47.216643 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 12 18:42:47.234010 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 12 18:42:47.234828 ignition[1244]: INFO : Ignition 2.22.0 Dec 12 18:42:47.234828 ignition[1244]: INFO : Stage: umount Dec 12 18:42:47.234828 ignition[1244]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 18:42:47.234828 ignition[1244]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 18:42:47.236843 ignition[1244]: INFO : umount: umount passed Dec 12 18:42:47.236843 ignition[1244]: INFO : Ignition finished successfully Dec 12 18:42:47.237004 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 12 18:42:47.237108 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 12 18:42:47.237823 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 12 18:42:47.237862 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 12 18:42:47.238360 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 12 18:42:47.238406 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 12 18:42:47.239063 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 12 18:42:47.239095 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 12 18:42:47.239808 systemd[1]: Stopped target network.target - Network. Dec 12 18:42:47.240529 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 12 18:42:47.240568 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 12 18:42:47.241314 systemd[1]: Stopped target paths.target - Path Units. Dec 12 18:42:47.242039 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 12 18:42:47.246466 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 18:42:47.246868 systemd[1]: Stopped target slices.target - Slice Units. 
Dec 12 18:42:47.247609 systemd[1]: Stopped target sockets.target - Socket Units. Dec 12 18:42:47.248375 systemd[1]: iscsid.socket: Deactivated successfully. Dec 12 18:42:47.248433 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 12 18:42:47.249069 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 12 18:42:47.249101 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 12 18:42:47.249812 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 12 18:42:47.249868 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 12 18:42:47.250542 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 12 18:42:47.250574 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 12 18:42:47.251330 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Dec 12 18:42:47.252185 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 12 18:42:47.254451 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 12 18:42:47.254571 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 12 18:42:47.257896 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Dec 12 18:42:47.258170 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 12 18:42:47.258206 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 18:42:47.259660 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Dec 12 18:42:47.266668 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 12 18:42:47.266771 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 12 18:42:47.269480 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Dec 12 18:42:47.269722 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 12 18:42:47.270497 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 12 18:42:47.270539 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 12 18:42:47.271991 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 12 18:42:47.273194 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 12 18:42:47.273246 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 12 18:42:47.273944 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 12 18:42:47.273979 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 12 18:42:47.274745 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 12 18:42:47.274775 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 12 18:42:47.275358 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 18:42:47.276909 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 12 18:42:47.299334 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 12 18:42:47.299506 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 18:42:47.302165 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 12 18:42:47.302227 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 12 18:42:47.303053 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. 
Dec 12 18:42:47.303082 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 18:42:47.303711 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 12 18:42:47.303750 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 12 18:42:47.304787 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 12 18:42:47.304824 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 12 18:42:47.305834 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 12 18:42:47.305871 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 12 18:42:47.307561 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Dec 12 18:42:47.308024 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 12 18:42:47.308065 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 18:42:47.308854 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 12 18:42:47.308885 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 18:42:47.309484 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 18:42:47.309514 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 18:42:47.310846 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 12 18:42:47.310916 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 12 18:42:47.313314 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 12 18:42:47.313415 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 12 18:42:47.314159 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 12 18:42:47.314188 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 12 18:42:47.316461 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 12 18:42:47.316548 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 12 18:42:47.317352 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 12 18:42:47.318691 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 12 18:42:47.328523 systemd[1]: Switching root. Dec 12 18:42:47.382821 systemd-journald[275]: Journal stopped Dec 12 18:42:48.442755 systemd-journald[275]: Received SIGTERM from PID 1 (systemd). Dec 12 18:42:48.442837 kernel: SELinux: policy capability network_peer_controls=1 Dec 12 18:42:48.442865 kernel: SELinux: policy capability open_perms=1 Dec 12 18:42:48.442875 kernel: SELinux: policy capability extended_socket_class=1 Dec 12 18:42:48.442889 kernel: SELinux: policy capability always_check_network=0 Dec 12 18:42:48.442902 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 12 18:42:48.442912 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 12 18:42:48.442924 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 12 18:42:48.442934 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 12 18:42:48.442944 kernel: SELinux: policy capability userspace_initial_context=0 Dec 12 18:42:48.442954 kernel: audit: type=1403 audit(1765564967.549:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Dec 12 18:42:48.442968 systemd[1]: Successfully loaded SELinux policy in 71.985ms. Dec 12 18:42:48.442991 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.342ms. 
Dec 12 18:42:48.443004 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 12 18:42:48.443015 systemd[1]: Detected virtualization kvm. Dec 12 18:42:48.443026 systemd[1]: Detected architecture x86-64. Dec 12 18:42:48.443038 systemd[1]: Detected first boot. Dec 12 18:42:48.443048 systemd[1]: Hostname set to <ci-4459-2-2-4-78a5f49b53>. Dec 12 18:42:48.443059 systemd[1]: Initializing machine ID from VM UUID. Dec 12 18:42:48.443069 zram_generator::config[1297]: No configuration found. Dec 12 18:42:48.443081 kernel: Guest personality initialized and is inactive Dec 12 18:42:48.443093 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Dec 12 18:42:48.443104 kernel: Initialized host personality Dec 12 18:42:48.443113 kernel: NET: Registered PF_VSOCK protocol family Dec 12 18:42:48.443123 systemd[1]: Populated /etc with preset unit settings. Dec 12 18:42:48.443134 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Dec 12 18:42:48.443145 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 12 18:42:48.443155 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 12 18:42:48.443165 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 12 18:42:48.443181 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 12 18:42:48.443192 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 12 18:42:48.443203 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 12 18:42:48.443213 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 12 18:42:48.443224 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 12 18:42:48.443235 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Dec 12 18:42:48.443245 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 12 18:42:48.443255 systemd[1]: Created slice user.slice - User and Session Slice. Dec 12 18:42:48.443269 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 18:42:48.443283 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 18:42:48.443293 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 12 18:42:48.443303 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 12 18:42:48.443314 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 12 18:42:48.443324 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 12 18:42:48.443335 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Dec 12 18:42:48.443347 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 18:42:48.443358 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 12 18:42:48.443368 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
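"Initializing machine ID from VM UUID" means that on this first boot in a KVM guest, systemd seeds the machine ID from the hypervisor-provided DMI product UUID. Approximately, as a sketch rather than systemd's exact code path:

    import pathlib

    def machine_id_from_vm_uuid():
        # Sketch of the derivation: read the DMI product UUID exposed by the
        # hypervisor and normalize it into the 32-hex-char machine-id form.
        uuid = pathlib.Path("/sys/class/dmi/id/product_uuid").read_text().strip()
        machine_id = uuid.replace("-", "").lower()
        assert len(machine_id) == 32
        return machine_id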
Dec 12 18:42:48.443386 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 12 18:42:48.443396 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 12 18:42:48.443406 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 12 18:42:48.443417 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 18:42:48.443430 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 12 18:42:48.443441 systemd[1]: Reached target slices.target - Slice Units. Dec 12 18:42:48.443452 systemd[1]: Reached target swap.target - Swaps. Dec 12 18:42:48.443462 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Dec 12 18:42:48.443473 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 12 18:42:48.443483 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 12 18:42:48.443493 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 12 18:42:48.443506 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 12 18:42:48.443517 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 18:42:48.443527 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 12 18:42:48.443537 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 12 18:42:48.443550 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 12 18:42:48.443561 systemd[1]: Mounting media.mount - External Media Directory... Dec 12 18:42:48.443572 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 12 18:42:48.443586 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 12 18:42:48.443596 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 12 18:42:48.443609 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 12 18:42:48.443620 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 12 18:42:48.443631 systemd[1]: Reached target machines.target - Containers. Dec 12 18:42:48.443641 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 12 18:42:48.443653 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 18:42:48.443667 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 12 18:42:48.443683 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 12 18:42:48.443695 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 18:42:48.443708 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 12 18:42:48.443718 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 18:42:48.443731 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 12 18:42:48.443742 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 18:42:48.443754 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). 
Dec 12 18:42:48.443764 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 12 18:42:48.443774 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 12 18:42:48.443784 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 12 18:42:48.443795 systemd[1]: Stopped systemd-fsck-usr.service. Dec 12 18:42:48.443808 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 18:42:48.443819 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 12 18:42:48.443830 kernel: loop: module loaded Dec 12 18:42:48.443839 kernel: fuse: init (API version 7.41) Dec 12 18:42:48.443849 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 12 18:42:48.443860 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 12 18:42:48.443871 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Dec 12 18:42:48.443881 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 12 18:42:48.443892 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 12 18:42:48.443905 systemd[1]: verity-setup.service: Deactivated successfully. Dec 12 18:42:48.443918 systemd[1]: Stopped verity-setup.service. Dec 12 18:42:48.443929 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 12 18:42:48.443961 systemd-journald[1381]: Collecting audit messages is disabled. Dec 12 18:42:48.443990 kernel: ACPI: bus type drm_connector registered Dec 12 18:42:48.444000 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 12 18:42:48.444011 systemd-journald[1381]: Journal started Dec 12 18:42:48.444032 systemd-journald[1381]: Runtime Journal (/run/log/journal/ee2fc1ccfaf44d7a807f696de9cb9d95) is 8M, max 319.5M, 311.5M free. Dec 12 18:42:48.241425 systemd[1]: Queued start job for default target multi-user.target. Dec 12 18:42:48.260486 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Dec 12 18:42:48.261076 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 12 18:42:48.446464 systemd[1]: Started systemd-journald.service - Journal Service. Dec 12 18:42:48.446961 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 12 18:42:48.447506 systemd[1]: Mounted media.mount - External Media Directory. Dec 12 18:42:48.448008 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 12 18:42:48.448537 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 12 18:42:48.449048 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 12 18:42:48.449710 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 12 18:42:48.450346 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 18:42:48.450948 systemd[1]: modprobe@configfs.service: Deactivated successfully. Dec 12 18:42:48.451130 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 12 18:42:48.451754 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 18:42:48.451907 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
Dec 12 18:42:48.452552 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 12 18:42:48.452683 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 12 18:42:48.453251 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 18:42:48.453401 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 18:42:48.453963 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 12 18:42:48.454100 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 12 18:42:48.454726 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 18:42:48.454879 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 18:42:48.455506 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 12 18:42:48.456126 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 18:42:48.456761 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 12 18:42:48.457336 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 12 18:42:48.467356 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 12 18:42:48.469462 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 12 18:42:48.470846 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Dec 12 18:42:48.471354 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 12 18:42:48.471397 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 12 18:42:48.472696 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 12 18:42:48.483345 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 12 18:42:48.484104 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 18:42:48.485225 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 12 18:42:48.486610 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 12 18:42:48.487162 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 12 18:42:48.487952 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 12 18:42:48.488509 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 18:42:48.489296 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 12 18:42:48.491337 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 12 18:42:48.492617 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 12 18:42:48.494597 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 12 18:42:48.495167 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 12 18:42:48.497362 systemd-journald[1381]: Time spent on flushing to /var/log/journal/ee2fc1ccfaf44d7a807f696de9cb9d95 is 17.513ms for 1704 entries. Dec 12 18:42:48.497362 systemd-journald[1381]: System Journal (/var/log/journal/ee2fc1ccfaf44d7a807f696de9cb9d95) is 8M, max 584.8M, 576.8M free. 
Dec 12 18:42:48.524415 systemd-journald[1381]: Received client request to flush runtime journal. Dec 12 18:42:48.524461 kernel: loop0: detected capacity change from 0 to 110984 Dec 12 18:42:48.503775 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 12 18:42:48.504596 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 12 18:42:48.506126 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Dec 12 18:42:48.521895 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 12 18:42:48.525854 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 12 18:42:48.539791 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 12 18:42:48.540818 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Dec 12 18:42:48.548413 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 12 18:42:48.556340 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 12 18:42:48.558910 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 12 18:42:48.581443 kernel: loop1: detected capacity change from 0 to 128560 Dec 12 18:42:48.591863 systemd-tmpfiles[1441]: ACLs are not supported, ignoring. Dec 12 18:42:48.591878 systemd-tmpfiles[1441]: ACLs are not supported, ignoring. Dec 12 18:42:48.595493 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 18:42:48.651423 kernel: loop2: detected capacity change from 0 to 1640 Dec 12 18:42:48.685406 kernel: loop3: detected capacity change from 0 to 224512 Dec 12 18:42:48.733406 kernel: loop4: detected capacity change from 0 to 110984 Dec 12 18:42:48.758407 kernel: loop5: detected capacity change from 0 to 128560 Dec 12 18:42:48.782406 kernel: loop6: detected capacity change from 0 to 1640 Dec 12 18:42:48.791404 kernel: loop7: detected capacity change from 0 to 224512 Dec 12 18:42:48.819904 (sd-merge)[1447]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'. Dec 12 18:42:48.820420 (sd-merge)[1447]: Merged extensions into '/usr'. Dec 12 18:42:48.825267 systemd[1]: Reload requested from client PID 1423 ('systemd-sysext') (unit systemd-sysext.service)... Dec 12 18:42:48.825284 systemd[1]: Reloading... Dec 12 18:42:48.857417 zram_generator::config[1469]: No configuration found. Dec 12 18:42:49.037425 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 12 18:42:49.037582 systemd[1]: Reloading finished in 211 ms. Dec 12 18:42:49.081499 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 12 18:42:49.082264 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 12 18:42:49.096578 systemd[1]: Starting ensure-sysext.service... Dec 12 18:42:49.098234 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 12 18:42:49.099796 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 18:42:49.108265 systemd[1]: Reload requested from client PID 1517 ('systemctl') (unit ensure-sysext.service)... Dec 12 18:42:49.108280 systemd[1]: Reloading... Dec 12 18:42:49.115088 systemd-tmpfiles[1518]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. 
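The (sd-merge) lines show systemd-sysext merging the four extension images (containerd-flatcar, docker-flatcar, kubernetes, oem-stackit) into /usr, which is why a reload follows. A sketch of where such images are discovered, assuming the standard search path (list_sysext_images is hypothetical):

    import glob
    import os

    def list_sysext_images():
        # systemd-sysext scans these hierarchies for *.raw images or
        # plain directory trees; here it found four extensions.
        dirs = ("/etc/extensions", "/run/extensions", "/var/lib/extensions")
        images = []
        for d in dirs:
            images.extend(sorted(glob.glob(os.path.join(d, "*"))))
        return images

    # The merge itself is roughly an overlayfs stacked onto /usr, e.g.
    #   mount -t overlay overlay -o ro,lowerdir=<ext>/usr:...:/usr /usr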
Dec 12 18:42:49.115898 systemd-tmpfiles[1518]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 12 18:42:49.116160 systemd-tmpfiles[1518]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 12 18:42:49.116387 systemd-tmpfiles[1518]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Dec 12 18:42:49.117012 systemd-tmpfiles[1518]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Dec 12 18:42:49.117213 systemd-tmpfiles[1518]: ACLs are not supported, ignoring. Dec 12 18:42:49.117260 systemd-tmpfiles[1518]: ACLs are not supported, ignoring. Dec 12 18:42:49.122019 systemd-tmpfiles[1518]: Detected autofs mount point /boot during canonicalization of boot. Dec 12 18:42:49.122031 systemd-tmpfiles[1518]: Skipping /boot Dec 12 18:42:49.124824 systemd-udevd[1519]: Using default interface naming scheme 'v255'. Dec 12 18:42:49.128266 systemd-tmpfiles[1518]: Detected autofs mount point /boot during canonicalization of boot. Dec 12 18:42:49.128278 systemd-tmpfiles[1518]: Skipping /boot Dec 12 18:42:49.144701 zram_generator::config[1547]: No configuration found. Dec 12 18:42:49.198395 ldconfig[1418]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 12 18:42:49.222436 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Dec 12 18:42:49.228396 kernel: ACPI: button: Power Button [PWRF] Dec 12 18:42:49.236417 kernel: mousedev: PS/2 mouse device common for all mice Dec 12 18:42:49.274433 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Dec 12 18:42:49.274719 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Dec 12 18:42:49.276643 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Dec 12 18:42:49.311399 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Dec 12 18:42:49.311487 kernel: Console: switching to colour dummy device 80x25 Dec 12 18:42:49.312654 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Dec 12 18:42:49.313549 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 12 18:42:49.313579 kernel: [drm] features: -context_init Dec 12 18:42:49.322404 kernel: [drm] number of scanouts: 1 Dec 12 18:42:49.322455 kernel: [drm] number of cap sets: 0 Dec 12 18:42:49.324410 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Dec 12 18:42:49.325389 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Dec 12 18:42:49.327391 kernel: Console: switching to colour frame buffer device 160x50 Dec 12 18:42:49.333399 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 12 18:42:49.352987 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Dec 12 18:42:49.353343 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 12 18:42:49.353743 systemd[1]: Reloading finished in 245 ms. Dec 12 18:42:49.404361 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 18:42:49.404761 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 12 18:42:49.409431 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 18:42:49.457020 systemd[1]: Finished ensure-sysext.service. 
Dec 12 18:42:49.462698 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 12 18:42:49.463863 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 18:42:49.466925 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 12 18:42:49.468077 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 18:42:49.484462 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 18:42:49.486775 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 12 18:42:49.488900 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 18:42:49.491652 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 18:42:49.493078 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Dec 12 18:42:49.493985 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 18:42:49.494868 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 12 18:42:49.495784 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 18:42:49.496722 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 12 18:42:49.499062 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 12 18:42:49.501700 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 12 18:42:49.502324 systemd[1]: Reached target time-set.target - System Time Set. Dec 12 18:42:49.503356 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 12 18:42:49.505562 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 18:42:49.507897 kernel: pps_core: LinuxPPS API ver. 1 registered Dec 12 18:42:49.507965 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Dec 12 18:42:49.508091 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 12 18:42:49.508972 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 18:42:49.512401 kernel: PTP clock support registered Dec 12 18:42:49.519683 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 18:42:49.522469 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 12 18:42:49.522650 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 12 18:42:49.525988 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 18:42:49.526171 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 18:42:49.528065 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 18:42:49.528221 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 18:42:49.528866 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Dec 12 18:42:49.529021 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. 
Dec 12 18:42:49.529611 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 12 18:42:49.536223 augenrules[1701]: No rules Dec 12 18:42:49.536990 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 18:42:49.537210 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 18:42:49.538891 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 12 18:42:49.540729 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 12 18:42:49.540844 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 18:42:49.542136 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 12 18:42:49.562454 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 12 18:42:49.564386 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Dec 12 18:42:49.569161 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 12 18:42:49.596624 systemd[1]: Started systemd-userdbd.service - User Database Manager. Dec 12 18:42:49.643905 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 18:42:49.644917 systemd-networkd[1674]: lo: Link UP Dec 12 18:42:49.644925 systemd-networkd[1674]: lo: Gained carrier Dec 12 18:42:49.646024 systemd-networkd[1674]: Enumeration completed Dec 12 18:42:49.646128 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 12 18:42:49.646323 systemd-networkd[1674]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 18:42:49.646331 systemd-networkd[1674]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 18:42:49.648072 systemd-networkd[1674]: eth0: Link UP Dec 12 18:42:49.648171 systemd-networkd[1674]: eth0: Gained carrier Dec 12 18:42:49.648191 systemd-networkd[1674]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 18:42:49.649110 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 12 18:42:49.649712 systemd-resolved[1675]: Positive Trust Anchors: Dec 12 18:42:49.649720 systemd-resolved[1675]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 12 18:42:49.649759 systemd-resolved[1675]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 12 18:42:49.651814 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 12 18:42:49.656144 systemd-resolved[1675]: Using system hostname 'ci-4459-2-2-4-78a5f49b53'. Dec 12 18:42:49.657482 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
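The positive trust anchor logged by systemd-resolved is the DNSSEC DS record of the root zone. Parsing its fields confirms key tag 20326 (the KSK-2017 root key), algorithm 8 (RSASHA256), and digest type 2 (SHA-256):

    ds = ". IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d"
    owner, _cls, _rtype, key_tag, alg, digest_type, digest = ds.split()
    # digest type 2 = SHA-256, so the digest must be 64 hex characters
    assert owner == "." and key_tag == "20326" and alg == "8"
    assert digest_type == "2" and len(digest) == 64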
Dec 12 18:42:49.658562 systemd[1]: Reached target network.target - Network. Dec 12 18:42:49.659043 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 12 18:42:49.678519 systemd-networkd[1674]: eth0: DHCPv4 address 10.0.8.97/25, gateway 10.0.8.1 acquired from 10.0.8.1 Dec 12 18:42:49.680328 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 12 18:42:49.687825 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 12 18:42:49.689661 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Dec 12 18:42:49.689713 systemd[1]: Reached target sysinit.target - System Initialization. Dec 12 18:42:49.690320 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 12 18:42:49.690886 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Dec 12 18:42:49.691425 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Dec 12 18:42:49.692101 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 12 18:42:49.692693 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Dec 12 18:42:49.693192 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Dec 12 18:42:49.693714 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 12 18:42:49.693739 systemd[1]: Reached target paths.target - Path Units. Dec 12 18:42:49.694230 systemd[1]: Reached target timers.target - Timer Units. Dec 12 18:42:49.696603 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 12 18:42:49.698477 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 12 18:42:49.701056 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 12 18:42:49.707260 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 12 18:42:49.707760 systemd[1]: Reached target ssh-access.target - SSH Access Available. Dec 12 18:42:49.715084 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 12 18:42:49.719125 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 12 18:42:49.720299 systemd[1]: Listening on docker.socket - Docker Socket for the API. Dec 12 18:42:49.721516 systemd[1]: Reached target sockets.target - Socket Units. Dec 12 18:42:49.727275 systemd[1]: Reached target basic.target - Basic System. Dec 12 18:42:49.727757 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Dec 12 18:42:49.727787 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 12 18:42:49.731003 systemd[1]: Starting chronyd.service - NTP client/server... Dec 12 18:42:49.732919 systemd[1]: Starting containerd.service - containerd container runtime... Dec 12 18:42:49.750910 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Dec 12 18:42:49.753509 systemd[1]: Starting dbus.service - D-Bus System Message Bus... 
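The DHCPv4 lease (10.0.8.97/25 via 10.0.8.1, same as in the initramfs) can be sanity-checked with the stdlib ipaddress module:

    import ipaddress

    iface = ipaddress.ip_interface("10.0.8.97/25")            # lease from the log
    assert iface.network == ipaddress.ip_network("10.0.8.0/25")
    assert iface.network.num_addresses == 128                 # /25 = 128 addresses
    assert ipaddress.ip_address("10.0.8.1") in iface.network  # gateway is on-link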
Dec 12 18:42:49.756064 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 12 18:42:49.757553 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 12 18:42:49.759407 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 18:42:49.765969 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Dec 12 18:42:49.767204 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 12 18:42:49.768401 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Dec 12 18:42:49.769827 jq[1735]: false Dec 12 18:42:49.770198 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 12 18:42:49.771464 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Dec 12 18:42:49.773874 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Dec 12 18:42:49.776486 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 12 18:42:49.778774 oslogin_cache_refresh[1739]: Refreshing passwd entry cache Dec 12 18:42:49.779692 google_oslogin_nss_cache[1739]: oslogin_cache_refresh[1739]: Refreshing passwd entry cache Dec 12 18:42:49.779803 extend-filesystems[1738]: Found /dev/vda6 Dec 12 18:42:49.781118 systemd[1]: Starting systemd-logind.service - User Login Management... Dec 12 18:42:49.783134 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 12 18:42:49.783707 extend-filesystems[1738]: Found /dev/vda9 Dec 12 18:42:49.787978 extend-filesystems[1738]: Checking size of /dev/vda9 Dec 12 18:42:49.783707 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 12 18:42:49.788640 oslogin_cache_refresh[1739]: Failure getting users, quitting Dec 12 18:42:49.790676 google_oslogin_nss_cache[1739]: oslogin_cache_refresh[1739]: Failure getting users, quitting Dec 12 18:42:49.790676 google_oslogin_nss_cache[1739]: oslogin_cache_refresh[1739]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 12 18:42:49.790676 google_oslogin_nss_cache[1739]: oslogin_cache_refresh[1739]: Refreshing group entry cache Dec 12 18:42:49.784235 systemd[1]: Starting update-engine.service - Update Engine... Dec 12 18:42:49.788660 oslogin_cache_refresh[1739]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 12 18:42:49.788289 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 12 18:42:49.788702 oslogin_cache_refresh[1739]: Refreshing group entry cache Dec 12 18:42:49.791912 chronyd[1730]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Dec 12 18:42:49.792438 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 12 18:42:49.792795 chronyd[1730]: Loaded seccomp filter (level 2) Dec 12 18:42:49.794735 jq[1760]: true Dec 12 18:42:49.794870 systemd[1]: Started chronyd.service - NTP client/server. Dec 12 18:42:49.795424 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 12 18:42:49.795601 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. 
Dec 12 18:42:49.795779 extend-filesystems[1738]: Resized partition /dev/vda9 Dec 12 18:42:49.795821 systemd[1]: motdgen.service: Deactivated successfully. Dec 12 18:42:49.797059 oslogin_cache_refresh[1739]: Failure getting groups, quitting Dec 12 18:42:49.798398 google_oslogin_nss_cache[1739]: oslogin_cache_refresh[1739]: Failure getting groups, quitting Dec 12 18:42:49.798398 google_oslogin_nss_cache[1739]: oslogin_cache_refresh[1739]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 12 18:42:49.795966 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Dec 12 18:42:49.797071 oslogin_cache_refresh[1739]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 12 18:42:49.798630 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Dec 12 18:42:49.798796 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Dec 12 18:42:49.800073 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 12 18:42:49.800228 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Dec 12 18:42:49.800946 extend-filesystems[1766]: resize2fs 1.47.3 (8-Jul-2025) Dec 12 18:42:49.803456 update_engine[1754]: I20251212 18:42:49.802712 1754 main.cc:92] Flatcar Update Engine starting Dec 12 18:42:49.807393 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks Dec 12 18:42:49.811797 jq[1768]: true Dec 12 18:42:49.813156 (ntainerd)[1769]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Dec 12 18:42:49.829906 tar[1767]: linux-amd64/LICENSE Dec 12 18:42:49.830173 tar[1767]: linux-amd64/helm Dec 12 18:42:49.831767 systemd-logind[1749]: New seat seat0. Dec 12 18:42:49.835008 systemd-logind[1749]: Watching system buttons on /dev/input/event3 (Power Button) Dec 12 18:42:49.835028 systemd-logind[1749]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Dec 12 18:42:49.835211 systemd[1]: Started systemd-logind.service - User Login Management. Dec 12 18:42:49.860148 dbus-daemon[1733]: [system] SELinux support is enabled Dec 12 18:42:49.863840 dbus-daemon[1733]: [system] Successfully activated service 'org.freedesktop.systemd1' Dec 12 18:42:49.866029 update_engine[1754]: I20251212 18:42:49.862527 1754 update_check_scheduler.cc:74] Next update check in 2m58s Dec 12 18:42:49.860376 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 12 18:42:49.863188 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 12 18:42:49.863212 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 12 18:42:49.864319 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 12 18:42:49.864335 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 12 18:42:49.864857 systemd[1]: Started update-engine.service - Update Engine. Dec 12 18:42:49.866845 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Dec 12 18:42:49.925037 locksmithd[1800]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 12 18:42:50.033469 containerd[1769]: time="2025-12-12T18:42:50Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 12 18:42:50.038290 containerd[1769]: time="2025-12-12T18:42:50.038225628Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Dec 12 18:42:50.045482 containerd[1769]: time="2025-12-12T18:42:50.045414775Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.714µs" Dec 12 18:42:50.045482 containerd[1769]: time="2025-12-12T18:42:50.045450019Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 12 18:42:50.045482 containerd[1769]: time="2025-12-12T18:42:50.045466926Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 12 18:42:50.055569 containerd[1769]: time="2025-12-12T18:42:50.055513525Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 12 18:42:50.055569 containerd[1769]: time="2025-12-12T18:42:50.055562659Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 12 18:42:50.055631 containerd[1769]: time="2025-12-12T18:42:50.055589538Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 12 18:42:50.055656 containerd[1769]: time="2025-12-12T18:42:50.055640184Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 12 18:42:50.055681 containerd[1769]: time="2025-12-12T18:42:50.055655103Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 12 18:42:50.055947 containerd[1769]: time="2025-12-12T18:42:50.055905100Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 12 18:42:50.055947 containerd[1769]: time="2025-12-12T18:42:50.055921095Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 12 18:42:50.055947 containerd[1769]: time="2025-12-12T18:42:50.055930209Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 12 18:42:50.055947 containerd[1769]: time="2025-12-12T18:42:50.055937655Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 12 18:42:50.056069 containerd[1769]: time="2025-12-12T18:42:50.056002096Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 12 18:42:50.056204 containerd[1769]: time="2025-12-12T18:42:50.056177748Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 12 18:42:50.056226 containerd[1769]: time="2025-12-12T18:42:50.056206665Z" level=info msg="skip loading plugin" error="lstat 
/var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 12 18:42:50.056226 containerd[1769]: time="2025-12-12T18:42:50.056216800Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 12 18:42:50.056259 containerd[1769]: time="2025-12-12T18:42:50.056247223Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 12 18:42:50.057675 containerd[1769]: time="2025-12-12T18:42:50.057478002Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 12 18:42:50.060561 bash[1799]: Updated "/home/core/.ssh/authorized_keys" Dec 12 18:42:50.062293 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 12 18:42:50.062433 containerd[1769]: time="2025-12-12T18:42:50.062399977Z" level=info msg="metadata content store policy set" policy=shared Dec 12 18:42:50.066007 systemd[1]: Starting sshkeys.service... Dec 12 18:42:50.094609 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Dec 12 18:42:50.111959 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 18:42:50.096474 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Dec 12 18:42:50.113724 sshd_keygen[1763]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 12 18:42:50.135832 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 12 18:42:50.140266 containerd[1769]: time="2025-12-12T18:42:50.140202174Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 12 18:42:50.140372 containerd[1769]: time="2025-12-12T18:42:50.140275421Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 12 18:42:50.140372 containerd[1769]: time="2025-12-12T18:42:50.140293741Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 12 18:42:50.140372 containerd[1769]: time="2025-12-12T18:42:50.140306081Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 12 18:42:50.140372 containerd[1769]: time="2025-12-12T18:42:50.140318363Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 12 18:42:50.140372 containerd[1769]: time="2025-12-12T18:42:50.140328071Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 12 18:42:50.140372 containerd[1769]: time="2025-12-12T18:42:50.140348823Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 12 18:42:50.140372 containerd[1769]: time="2025-12-12T18:42:50.140359640Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 12 18:42:50.140372 containerd[1769]: time="2025-12-12T18:42:50.140371142Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 12 18:42:50.140372 containerd[1769]: time="2025-12-12T18:42:50.140402102Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 12 18:42:50.140720 containerd[1769]: time="2025-12-12T18:42:50.140415175Z" level=info 
msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 12 18:42:50.140720 containerd[1769]: time="2025-12-12T18:42:50.140440130Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 12 18:42:50.140720 containerd[1769]: time="2025-12-12T18:42:50.140586006Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 12 18:42:50.140720 containerd[1769]: time="2025-12-12T18:42:50.140602470Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 12 18:42:50.140720 containerd[1769]: time="2025-12-12T18:42:50.140615314Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 12 18:42:50.140720 containerd[1769]: time="2025-12-12T18:42:50.140626477Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 12 18:42:50.140720 containerd[1769]: time="2025-12-12T18:42:50.140643440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 12 18:42:50.140720 containerd[1769]: time="2025-12-12T18:42:50.140653927Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 12 18:42:50.140720 containerd[1769]: time="2025-12-12T18:42:50.140669367Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 12 18:42:50.140720 containerd[1769]: time="2025-12-12T18:42:50.140680087Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 12 18:42:50.140720 containerd[1769]: time="2025-12-12T18:42:50.140694432Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 12 18:42:50.140720 containerd[1769]: time="2025-12-12T18:42:50.140705548Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 12 18:42:50.140720 containerd[1769]: time="2025-12-12T18:42:50.140717478Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 12 18:42:50.140458 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Dec 12 18:42:50.141316 containerd[1769]: time="2025-12-12T18:42:50.140756058Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 12 18:42:50.141316 containerd[1769]: time="2025-12-12T18:42:50.140769042Z" level=info msg="Start snapshots syncer" Dec 12 18:42:50.141316 containerd[1769]: time="2025-12-12T18:42:50.140795343Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 12 18:42:50.141391 containerd[1769]: time="2025-12-12T18:42:50.141049683Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 12 18:42:50.141391 containerd[1769]: time="2025-12-12T18:42:50.141101144Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 12 18:42:50.142407 containerd[1769]: time="2025-12-12T18:42:50.142360233Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 12 18:42:50.142548 containerd[1769]: time="2025-12-12T18:42:50.142523211Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 12 18:42:50.142576 containerd[1769]: time="2025-12-12T18:42:50.142555444Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 12 18:42:50.142576 containerd[1769]: time="2025-12-12T18:42:50.142568041Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 12 18:42:50.142610 containerd[1769]: time="2025-12-12T18:42:50.142578096Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 12 18:42:50.142610 containerd[1769]: time="2025-12-12T18:42:50.142590556Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 12 18:42:50.142610 containerd[1769]: time="2025-12-12T18:42:50.142604181Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 12 18:42:50.142665 containerd[1769]: time="2025-12-12T18:42:50.142622401Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 12 18:42:50.142665 containerd[1769]: time="2025-12-12T18:42:50.142652711Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 12 18:42:50.142665 containerd[1769]: time="2025-12-12T18:42:50.142663237Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 12 18:42:50.142720 containerd[1769]: time="2025-12-12T18:42:50.142673879Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 12 18:42:50.142720 containerd[1769]: time="2025-12-12T18:42:50.142712253Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 12 18:42:50.142754 containerd[1769]: time="2025-12-12T18:42:50.142729852Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 12 18:42:50.142754 containerd[1769]: time="2025-12-12T18:42:50.142738826Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 12 18:42:50.142754 containerd[1769]: time="2025-12-12T18:42:50.142750777Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 12 18:42:50.142810 containerd[1769]: time="2025-12-12T18:42:50.142759096Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 12 18:42:50.142810 containerd[1769]: time="2025-12-12T18:42:50.142774850Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 12 18:42:50.142810 containerd[1769]: time="2025-12-12T18:42:50.142793838Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 12 18:42:50.142863 containerd[1769]: time="2025-12-12T18:42:50.142809590Z" level=info msg="runtime interface created" Dec 12 18:42:50.142863 containerd[1769]: time="2025-12-12T18:42:50.142815941Z" level=info msg="created NRI interface" Dec 12 18:42:50.142863 containerd[1769]: time="2025-12-12T18:42:50.142823961Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 12 18:42:50.142863 containerd[1769]: time="2025-12-12T18:42:50.142850762Z" level=info msg="Connect containerd service" Dec 12 18:42:50.142934 containerd[1769]: time="2025-12-12T18:42:50.142876501Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 12 18:42:50.143795 containerd[1769]: time="2025-12-12T18:42:50.143718007Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 12 18:42:50.157419 systemd[1]: issuegen.service: Deactivated successfully. Dec 12 18:42:50.157657 systemd[1]: Finished issuegen.service - Generate /run/issue. 
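The "failed to load cni during init" error that containerd logs just above is expected on a freshly provisioned node: the CRI plugin comes up before any CNI network configuration has been installed under /etc/cni/net.d, so pod networking simply stays unconfigured until something (typically a network add-on) drops a conflist there. As a rough sketch only — the network name, bridge device, and subnet below are placeholders, not values taken from this host — a minimal bridge conflist has this shape:

    {
      "cniVersion": "1.0.0",
      "name": "examplenet",
      "plugins": [
        {
          "type": "bridge",
          "bridge": "cni0",
          "isGateway": true,
          "ipMasq": true,
          "ipam": { "type": "host-local", "subnet": "10.22.0.0/16" }
        }
      ]
    }

Saved as, say, /etc/cni/net.d/10-examplenet.conflist, a file like this would be picked up by the "cni network conf syncer" that containerd starts a moment later in this log, with no daemon restart required.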
Dec 12 18:42:50.161739 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 12 18:42:50.184358 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 12 18:42:50.188885 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 12 18:42:50.192976 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Dec 12 18:42:50.193680 systemd[1]: Reached target getty.target - Login Prompts. Dec 12 18:42:50.233550 containerd[1769]: time="2025-12-12T18:42:50.233504733Z" level=info msg="Start subscribing containerd event" Dec 12 18:42:50.233649 containerd[1769]: time="2025-12-12T18:42:50.233560431Z" level=info msg="Start recovering state" Dec 12 18:42:50.233739 containerd[1769]: time="2025-12-12T18:42:50.233596358Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 12 18:42:50.233739 containerd[1769]: time="2025-12-12T18:42:50.233735434Z" level=info msg="Start event monitor" Dec 12 18:42:50.233775 containerd[1769]: time="2025-12-12T18:42:50.233749459Z" level=info msg="Start cni network conf syncer for default" Dec 12 18:42:50.233842 containerd[1769]: time="2025-12-12T18:42:50.233794763Z" level=info msg="Start streaming server" Dec 12 18:42:50.233842 containerd[1769]: time="2025-12-12T18:42:50.233809052Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 12 18:42:50.233842 containerd[1769]: time="2025-12-12T18:42:50.233815800Z" level=info msg="runtime interface starting up..." Dec 12 18:42:50.233842 containerd[1769]: time="2025-12-12T18:42:50.233821404Z" level=info msg="starting plugins..." Dec 12 18:42:50.233928 containerd[1769]: time="2025-12-12T18:42:50.233798411Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 12 18:42:50.233928 containerd[1769]: time="2025-12-12T18:42:50.233837725Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 12 18:42:50.234040 containerd[1769]: time="2025-12-12T18:42:50.234021946Z" level=info msg="containerd successfully booted in 0.227648s" Dec 12 18:42:50.234134 systemd[1]: Started containerd.service - containerd container runtime. Dec 12 18:42:50.362044 tar[1767]: linux-amd64/README.md Dec 12 18:42:50.385908 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 12 18:42:50.472422 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Dec 12 18:42:50.506591 extend-filesystems[1766]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Dec 12 18:42:50.506591 extend-filesystems[1766]: old_desc_blocks = 1, new_desc_blocks = 6 Dec 12 18:42:50.506591 extend-filesystems[1766]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Dec 12 18:42:50.508911 extend-filesystems[1738]: Resized filesystem in /dev/vda9 Dec 12 18:42:50.507356 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 12 18:42:50.507616 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 12 18:42:50.770442 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 18:42:51.117452 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 18:42:51.511592 systemd-networkd[1674]: eth0: Gained IPv6LL Dec 12 18:42:51.514475 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 12 18:42:51.515490 systemd[1]: Reached target network-online.target - Network is Online. Dec 12 18:42:51.517328 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Dec 12 18:42:51.526994 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 12 18:42:51.569144 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 12 18:42:52.732942 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:42:52.737547 (kubelet)[1875]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 18:42:52.782433 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 18:42:53.125442 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 18:42:53.518617 kubelet[1875]: E1212 18:42:53.517710 1875 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 18:42:53.522563 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 18:42:53.522697 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 18:42:53.523024 systemd[1]: kubelet.service: Consumed 1.044s CPU time, 266.7M memory peak. Dec 12 18:42:54.379964 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 12 18:42:54.381799 systemd[1]: Started sshd@0-10.0.8.97:22-147.75.109.163:43694.service - OpenSSH per-connection server daemon (147.75.109.163:43694). Dec 12 18:42:55.387333 sshd[1889]: Accepted publickey for core from 147.75.109.163 port 43694 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:42:55.390198 sshd-session[1889]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:42:55.400964 systemd-logind[1749]: New session 1 of user core. Dec 12 18:42:55.402135 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 12 18:42:55.403168 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 12 18:42:55.432393 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 12 18:42:55.434639 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 12 18:42:55.465606 (systemd)[1898]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 12 18:42:55.468854 systemd-logind[1749]: New session c1 of user core. Dec 12 18:42:55.580776 systemd[1898]: Queued start job for default target default.target. Dec 12 18:42:55.600702 systemd[1898]: Created slice app.slice - User Application Slice. Dec 12 18:42:55.600736 systemd[1898]: Reached target paths.target - Paths. Dec 12 18:42:55.600783 systemd[1898]: Reached target timers.target - Timers. Dec 12 18:42:55.602018 systemd[1898]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 12 18:42:55.612073 systemd[1898]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 12 18:42:55.612144 systemd[1898]: Reached target sockets.target - Sockets. Dec 12 18:42:55.612184 systemd[1898]: Reached target basic.target - Basic System. Dec 12 18:42:55.612217 systemd[1898]: Reached target default.target - Main User Target. Dec 12 18:42:55.612246 systemd[1898]: Startup finished in 137ms. Dec 12 18:42:55.612469 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 12 18:42:55.613921 systemd[1]: Started session-1.scope - Session 1 of User core. 
Dec 12 18:42:56.292876 systemd[1]: Started sshd@1-10.0.8.97:22-147.75.109.163:43710.service - OpenSSH per-connection server daemon (147.75.109.163:43710). Dec 12 18:42:56.794447 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 18:42:56.800642 coreos-metadata[1732]: Dec 12 18:42:56.800 WARN failed to locate config-drive, using the metadata service API instead Dec 12 18:42:56.815145 coreos-metadata[1732]: Dec 12 18:42:56.815 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Dec 12 18:42:57.061600 coreos-metadata[1732]: Dec 12 18:42:57.061 INFO Fetch successful Dec 12 18:42:57.061600 coreos-metadata[1732]: Dec 12 18:42:57.061 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 12 18:42:57.144446 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 18:42:57.154905 coreos-metadata[1818]: Dec 12 18:42:57.154 WARN failed to locate config-drive, using the metadata service API instead Dec 12 18:42:57.166519 coreos-metadata[1818]: Dec 12 18:42:57.166 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Dec 12 18:42:57.244946 coreos-metadata[1732]: Dec 12 18:42:57.244 INFO Fetch successful Dec 12 18:42:57.244946 coreos-metadata[1732]: Dec 12 18:42:57.244 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Dec 12 18:42:57.260134 sshd[1909]: Accepted publickey for core from 147.75.109.163 port 43710 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:42:57.261500 sshd-session[1909]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:42:57.265829 systemd-logind[1749]: New session 2 of user core. Dec 12 18:42:57.275631 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 12 18:42:57.372626 coreos-metadata[1818]: Dec 12 18:42:57.372 INFO Fetch successful Dec 12 18:42:57.372626 coreos-metadata[1818]: Dec 12 18:42:57.372 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Dec 12 18:42:57.478756 coreos-metadata[1732]: Dec 12 18:42:57.478 INFO Fetch successful Dec 12 18:42:57.478756 coreos-metadata[1732]: Dec 12 18:42:57.478 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Dec 12 18:42:57.583532 coreos-metadata[1818]: Dec 12 18:42:57.583 INFO Fetch successful Dec 12 18:42:57.588843 unknown[1818]: wrote ssh authorized keys file for user: core Dec 12 18:42:57.589870 coreos-metadata[1732]: Dec 12 18:42:57.589 INFO Fetch successful Dec 12 18:42:57.589931 coreos-metadata[1732]: Dec 12 18:42:57.589 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Dec 12 18:42:57.623228 update-ssh-keys[1917]: Updated "/home/core/.ssh/authorized_keys" Dec 12 18:42:57.624355 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 12 18:42:57.626069 systemd[1]: Finished sshkeys.service. Dec 12 18:42:57.701399 coreos-metadata[1732]: Dec 12 18:42:57.701 INFO Fetch successful Dec 12 18:42:57.701399 coreos-metadata[1732]: Dec 12 18:42:57.701 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Dec 12 18:42:57.812301 coreos-metadata[1732]: Dec 12 18:42:57.812 INFO Fetch successful Dec 12 18:42:57.842093 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 12 18:42:57.842537 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
Dec 12 18:42:57.842660 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 12 18:42:57.842794 systemd[1]: Startup finished in 4.027s (kernel) + 12.908s (initrd) + 10.364s (userspace) = 27.300s. Dec 12 18:42:57.930355 sshd[1916]: Connection closed by 147.75.109.163 port 43710 Dec 12 18:42:57.930649 sshd-session[1909]: pam_unix(sshd:session): session closed for user core Dec 12 18:42:57.934489 systemd[1]: sshd@1-10.0.8.97:22-147.75.109.163:43710.service: Deactivated successfully. Dec 12 18:42:57.936089 systemd[1]: session-2.scope: Deactivated successfully. Dec 12 18:42:57.936820 systemd-logind[1749]: Session 2 logged out. Waiting for processes to exit. Dec 12 18:42:57.937928 systemd-logind[1749]: Removed session 2. Dec 12 18:42:58.099901 systemd[1]: Started sshd@2-10.0.8.97:22-147.75.109.163:43716.service - OpenSSH per-connection server daemon (147.75.109.163:43716). Dec 12 18:42:59.059235 sshd[1931]: Accepted publickey for core from 147.75.109.163 port 43716 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:42:59.060467 sshd-session[1931]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:42:59.064617 systemd-logind[1749]: New session 3 of user core. Dec 12 18:42:59.079640 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 12 18:42:59.723404 sshd[1934]: Connection closed by 147.75.109.163 port 43716 Dec 12 18:42:59.723781 sshd-session[1931]: pam_unix(sshd:session): session closed for user core Dec 12 18:42:59.727504 systemd[1]: sshd@2-10.0.8.97:22-147.75.109.163:43716.service: Deactivated successfully. Dec 12 18:42:59.729207 systemd[1]: session-3.scope: Deactivated successfully. Dec 12 18:42:59.729842 systemd-logind[1749]: Session 3 logged out. Waiting for processes to exit. Dec 12 18:42:59.730670 systemd-logind[1749]: Removed session 3. Dec 12 18:43:03.773359 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 12 18:43:03.774835 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:43:03.976869 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:43:03.981101 (kubelet)[1947]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 18:43:04.029030 kubelet[1947]: E1212 18:43:04.028860 1947 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 18:43:04.032935 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 18:43:04.033087 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 18:43:04.033398 systemd[1]: kubelet.service: Consumed 216ms CPU time, 111.8M memory peak. Dec 12 18:43:09.896546 systemd[1]: Started sshd@3-10.0.8.97:22-147.75.109.163:39734.service - OpenSSH per-connection server daemon (147.75.109.163:39734). Dec 12 18:43:10.891864 sshd[1960]: Accepted publickey for core from 147.75.109.163 port 39734 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:43:10.893160 sshd-session[1960]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:43:10.897180 systemd-logind[1749]: New session 4 of user core. 
Dec 12 18:43:10.907620 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 12 18:43:11.564980 sshd[1963]: Connection closed by 147.75.109.163 port 39734 Dec 12 18:43:11.565338 sshd-session[1960]: pam_unix(sshd:session): session closed for user core Dec 12 18:43:11.569543 systemd[1]: sshd@3-10.0.8.97:22-147.75.109.163:39734.service: Deactivated successfully. Dec 12 18:43:11.571259 systemd[1]: session-4.scope: Deactivated successfully. Dec 12 18:43:11.571890 systemd-logind[1749]: Session 4 logged out. Waiting for processes to exit. Dec 12 18:43:11.572828 systemd-logind[1749]: Removed session 4. Dec 12 18:43:11.754680 systemd[1]: Started sshd@4-10.0.8.97:22-147.75.109.163:40738.service - OpenSSH per-connection server daemon (147.75.109.163:40738). Dec 12 18:43:12.815244 sshd[1969]: Accepted publickey for core from 147.75.109.163 port 40738 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:43:12.816592 sshd-session[1969]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:43:12.821225 systemd-logind[1749]: New session 5 of user core. Dec 12 18:43:12.835663 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 12 18:43:13.528862 sshd[1972]: Connection closed by 147.75.109.163 port 40738 Dec 12 18:43:13.529237 sshd-session[1969]: pam_unix(sshd:session): session closed for user core Dec 12 18:43:13.532419 systemd[1]: sshd@4-10.0.8.97:22-147.75.109.163:40738.service: Deactivated successfully. Dec 12 18:43:13.534054 systemd[1]: session-5.scope: Deactivated successfully. Dec 12 18:43:13.535505 systemd-logind[1749]: Session 5 logged out. Waiting for processes to exit. Dec 12 18:43:13.536296 systemd-logind[1749]: Removed session 5. Dec 12 18:43:13.577213 chronyd[1730]: Selected source PHC0 Dec 12 18:43:13.577243 chronyd[1730]: System clock wrong by 2.213688 seconds Dec 12 18:43:15.791008 systemd-resolved[1675]: Clock change detected. Flushing caches. Dec 12 18:43:15.790960 chronyd[1730]: System clock was stepped by 2.213688 seconds Dec 12 18:43:15.914476 systemd[1]: Started sshd@5-10.0.8.97:22-147.75.109.163:40746.service - OpenSSH per-connection server daemon (147.75.109.163:40746). Dec 12 18:43:16.378359 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 12 18:43:16.379826 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:43:16.527857 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:43:16.532157 (kubelet)[1989]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 18:43:16.569387 kubelet[1989]: E1212 18:43:16.569323 1989 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 18:43:16.571914 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 18:43:16.572049 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 18:43:16.572405 systemd[1]: kubelet.service: Consumed 162ms CPU time, 112.2M memory peak. 
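The kubelet crash loop that recurs through this log (18:42:53, 18:43:04, and again just above at 18:43:16) has the single cause the error message spells out: /var/lib/kubelet/config.yaml does not exist yet. The KUBELET_KUBEADM_ARGS variable referenced in the unit suggests a kubeadm-managed node, and on such nodes that file is only written when kubeadm init or kubeadm join runs, so the failures and scheduled restarts are the normal holding pattern until then. For orientation only — this is a generic skeleton, not the configuration this node eventually received — the smallest well-formed file is just the KubeletConfiguration header plus any desired overrides:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # optional override shown for illustration; unset fields fall back to kubelet defaults
    cgroupDriver: systemd

Even with that file in place the kubelet still needs valid bootstrap credentials to register, which is consistent with the certificate_manager "connection refused" error against 10.0.8.97:6443 seen later in this log once kubelet does start with a config.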
Dec 12 18:43:16.874044 sshd[1978]: Accepted publickey for core from 147.75.109.163 port 40746 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:43:16.875426 sshd-session[1978]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:43:16.881153 systemd-logind[1749]: New session 6 of user core. Dec 12 18:43:16.895324 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 12 18:43:17.554510 sshd[2001]: Connection closed by 147.75.109.163 port 40746 Dec 12 18:43:17.554995 sshd-session[1978]: pam_unix(sshd:session): session closed for user core Dec 12 18:43:17.558168 systemd[1]: sshd@5-10.0.8.97:22-147.75.109.163:40746.service: Deactivated successfully. Dec 12 18:43:17.559700 systemd[1]: session-6.scope: Deactivated successfully. Dec 12 18:43:17.560864 systemd-logind[1749]: Session 6 logged out. Waiting for processes to exit. Dec 12 18:43:17.561735 systemd-logind[1749]: Removed session 6. Dec 12 18:43:17.724502 systemd[1]: Started sshd@6-10.0.8.97:22-147.75.109.163:40760.service - OpenSSH per-connection server daemon (147.75.109.163:40760). Dec 12 18:43:18.712954 sshd[2007]: Accepted publickey for core from 147.75.109.163 port 40760 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:43:18.714226 sshd-session[2007]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:43:18.719495 systemd-logind[1749]: New session 7 of user core. Dec 12 18:43:18.734292 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 12 18:43:19.243282 sudo[2011]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 12 18:43:19.243524 sudo[2011]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 18:43:19.269429 sudo[2011]: pam_unix(sudo:session): session closed for user root Dec 12 18:43:19.425507 sshd[2010]: Connection closed by 147.75.109.163 port 40760 Dec 12 18:43:19.426019 sshd-session[2007]: pam_unix(sshd:session): session closed for user core Dec 12 18:43:19.430092 systemd[1]: sshd@6-10.0.8.97:22-147.75.109.163:40760.service: Deactivated successfully. Dec 12 18:43:19.431801 systemd[1]: session-7.scope: Deactivated successfully. Dec 12 18:43:19.432455 systemd-logind[1749]: Session 7 logged out. Waiting for processes to exit. Dec 12 18:43:19.433331 systemd-logind[1749]: Removed session 7. Dec 12 18:43:19.602849 systemd[1]: Started sshd@7-10.0.8.97:22-147.75.109.163:40768.service - OpenSSH per-connection server daemon (147.75.109.163:40768). Dec 12 18:43:20.579769 sshd[2017]: Accepted publickey for core from 147.75.109.163 port 40768 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:43:20.581134 sshd-session[2017]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:43:20.585631 systemd-logind[1749]: New session 8 of user core. Dec 12 18:43:20.597375 systemd[1]: Started session-8.scope - Session 8 of User core. 
Dec 12 18:43:21.091017 sudo[2022]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 12 18:43:21.091272 sudo[2022]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 18:43:21.096343 sudo[2022]: pam_unix(sudo:session): session closed for user root Dec 12 18:43:21.101410 sudo[2021]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 12 18:43:21.101641 sudo[2021]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 18:43:21.111551 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 18:43:21.162328 augenrules[2044]: No rules Dec 12 18:43:21.163794 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 18:43:21.164011 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 18:43:21.165418 sudo[2021]: pam_unix(sudo:session): session closed for user root Dec 12 18:43:21.321089 sshd[2020]: Connection closed by 147.75.109.163 port 40768 Dec 12 18:43:21.321471 sshd-session[2017]: pam_unix(sshd:session): session closed for user core Dec 12 18:43:21.325325 systemd[1]: sshd@7-10.0.8.97:22-147.75.109.163:40768.service: Deactivated successfully. Dec 12 18:43:21.326757 systemd[1]: session-8.scope: Deactivated successfully. Dec 12 18:43:21.327511 systemd-logind[1749]: Session 8 logged out. Waiting for processes to exit. Dec 12 18:43:21.328493 systemd-logind[1749]: Removed session 8. Dec 12 18:43:21.500199 systemd[1]: Started sshd@8-10.0.8.97:22-147.75.109.163:40776.service - OpenSSH per-connection server daemon (147.75.109.163:40776). Dec 12 18:43:22.530762 sshd[2053]: Accepted publickey for core from 147.75.109.163 port 40776 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:43:22.532122 sshd-session[2053]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:43:22.536095 systemd-logind[1749]: New session 9 of user core. Dec 12 18:43:22.544283 systemd[1]: Started session-9.scope - Session 9 of User core. Dec 12 18:43:23.075794 sudo[2057]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 12 18:43:23.076031 sudo[2057]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 18:43:23.430806 systemd[1]: Starting docker.service - Docker Application Container Engine... Dec 12 18:43:23.447605 (dockerd)[2083]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 12 18:43:23.708305 dockerd[2083]: time="2025-12-12T18:43:23.708151725Z" level=info msg="Starting up" Dec 12 18:43:23.710454 dockerd[2083]: time="2025-12-12T18:43:23.710401192Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 12 18:43:23.731294 dockerd[2083]: time="2025-12-12T18:43:23.731238850Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 12 18:43:23.786686 dockerd[2083]: time="2025-12-12T18:43:23.786628051Z" level=info msg="Loading containers: start." Dec 12 18:43:23.801105 kernel: Initializing XFRM netlink socket Dec 12 18:43:24.080407 systemd-networkd[1674]: docker0: Link UP Dec 12 18:43:24.089619 dockerd[2083]: time="2025-12-12T18:43:24.089547122Z" level=info msg="Loading containers: done." 
Dec 12 18:43:24.118684 dockerd[2083]: time="2025-12-12T18:43:24.118596891Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 12 18:43:24.118684 dockerd[2083]: time="2025-12-12T18:43:24.118702370Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 12 18:43:24.118913 dockerd[2083]: time="2025-12-12T18:43:24.118794679Z" level=info msg="Initializing buildkit" Dec 12 18:43:24.151374 dockerd[2083]: time="2025-12-12T18:43:24.151313249Z" level=info msg="Completed buildkit initialization" Dec 12 18:43:24.157229 dockerd[2083]: time="2025-12-12T18:43:24.157180016Z" level=info msg="Daemon has completed initialization" Dec 12 18:43:24.157374 dockerd[2083]: time="2025-12-12T18:43:24.157247355Z" level=info msg="API listen on /run/docker.sock" Dec 12 18:43:24.157491 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 12 18:43:25.541712 containerd[1769]: time="2025-12-12T18:43:25.541651039Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\"" Dec 12 18:43:26.239437 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3644313937.mount: Deactivated successfully. Dec 12 18:43:26.630415 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 12 18:43:26.632538 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:43:26.791781 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:43:26.795636 (kubelet)[2372]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 18:43:26.830546 kubelet[2372]: E1212 18:43:26.830494 2372 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 18:43:26.832746 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 18:43:26.832871 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 18:43:26.833170 systemd[1]: kubelet.service: Consumed 167ms CPU time, 109.4M memory peak. 
Dec 12 18:43:27.293817 containerd[1769]: time="2025-12-12T18:43:27.293745088Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:27.294897 containerd[1769]: time="2025-12-12T18:43:27.294862649Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.10: active requests=0, bytes read=29072281" Dec 12 18:43:27.296844 containerd[1769]: time="2025-12-12T18:43:27.296785577Z" level=info msg="ImageCreate event name:\"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:27.300154 containerd[1769]: time="2025-12-12T18:43:27.300115838Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:27.300961 containerd[1769]: time="2025-12-12T18:43:27.300784035Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.10\" with image id \"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\", size \"29068782\" in 1.759080694s" Dec 12 18:43:27.300961 containerd[1769]: time="2025-12-12T18:43:27.300823144Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\" returns image reference \"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\"" Dec 12 18:43:27.301442 containerd[1769]: time="2025-12-12T18:43:27.301420121Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\"" Dec 12 18:43:28.575140 containerd[1769]: time="2025-12-12T18:43:28.575048291Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:28.576422 containerd[1769]: time="2025-12-12T18:43:28.576380015Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.10: active requests=0, bytes read=24992030" Dec 12 18:43:28.578026 containerd[1769]: time="2025-12-12T18:43:28.578001159Z" level=info msg="ImageCreate event name:\"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:28.580996 containerd[1769]: time="2025-12-12T18:43:28.580933193Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:28.582031 containerd[1769]: time="2025-12-12T18:43:28.581985706Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.10\" with image id \"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\", size \"26649046\" in 1.280537484s" Dec 12 18:43:28.582031 containerd[1769]: time="2025-12-12T18:43:28.582020291Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\" returns image reference \"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\"" Dec 12 
18:43:28.586633 containerd[1769]: time="2025-12-12T18:43:28.586574107Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\"" Dec 12 18:43:29.661387 containerd[1769]: time="2025-12-12T18:43:29.661340278Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:29.662788 containerd[1769]: time="2025-12-12T18:43:29.662762945Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.10: active requests=0, bytes read=19404268" Dec 12 18:43:29.664766 containerd[1769]: time="2025-12-12T18:43:29.664545684Z" level=info msg="ImageCreate event name:\"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:29.667578 containerd[1769]: time="2025-12-12T18:43:29.667418461Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:29.668121 containerd[1769]: time="2025-12-12T18:43:29.668081363Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.10\" with image id \"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\", size \"21061302\" in 1.081458223s" Dec 12 18:43:29.668121 containerd[1769]: time="2025-12-12T18:43:29.668110346Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\" returns image reference \"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\"" Dec 12 18:43:29.669404 containerd[1769]: time="2025-12-12T18:43:29.668750344Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\"" Dec 12 18:43:30.620561 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1845678111.mount: Deactivated successfully. 
Dec 12 18:43:30.922120 containerd[1769]: time="2025-12-12T18:43:30.922050053Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:30.923351 containerd[1769]: time="2025-12-12T18:43:30.923322100Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.10: active requests=0, bytes read=31161449" Dec 12 18:43:30.925597 containerd[1769]: time="2025-12-12T18:43:30.925549442Z" level=info msg="ImageCreate event name:\"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:30.928980 containerd[1769]: time="2025-12-12T18:43:30.928910872Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:30.929396 containerd[1769]: time="2025-12-12T18:43:30.929350643Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.10\" with image id \"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\", repo tag \"registry.k8s.io/kube-proxy:v1.32.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\", size \"31160442\" in 1.260568953s" Dec 12 18:43:30.929627 containerd[1769]: time="2025-12-12T18:43:30.929528540Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\" returns image reference \"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\"" Dec 12 18:43:30.930077 containerd[1769]: time="2025-12-12T18:43:30.930038471Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Dec 12 18:43:31.524041 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2422347940.mount: Deactivated successfully. 
Dec 12 18:43:32.149134 containerd[1769]: time="2025-12-12T18:43:32.148984728Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:32.150519 containerd[1769]: time="2025-12-12T18:43:32.150490532Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565333" Dec 12 18:43:32.152264 containerd[1769]: time="2025-12-12T18:43:32.152218502Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:32.155606 containerd[1769]: time="2025-12-12T18:43:32.155528541Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:32.156549 containerd[1769]: time="2025-12-12T18:43:32.156401615Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.226223134s" Dec 12 18:43:32.156549 containerd[1769]: time="2025-12-12T18:43:32.156448319Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Dec 12 18:43:32.157019 containerd[1769]: time="2025-12-12T18:43:32.156995412Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Dec 12 18:43:32.708621 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1747101346.mount: Deactivated successfully. 
Dec 12 18:43:32.717485 containerd[1769]: time="2025-12-12T18:43:32.717427165Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 18:43:32.719091 containerd[1769]: time="2025-12-12T18:43:32.719043892Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321158" Dec 12 18:43:32.720839 containerd[1769]: time="2025-12-12T18:43:32.720794155Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 18:43:32.723214 containerd[1769]: time="2025-12-12T18:43:32.723160991Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 18:43:32.723865 containerd[1769]: time="2025-12-12T18:43:32.723722228Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 566.693851ms" Dec 12 18:43:32.723865 containerd[1769]: time="2025-12-12T18:43:32.723761651Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Dec 12 18:43:32.724303 containerd[1769]: time="2025-12-12T18:43:32.724281197Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Dec 12 18:43:33.336627 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3586739918.mount: Deactivated successfully. 
Dec 12 18:43:34.687549 containerd[1769]: time="2025-12-12T18:43:34.687474979Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:34.689576 containerd[1769]: time="2025-12-12T18:43:34.689534723Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682130" Dec 12 18:43:34.693950 containerd[1769]: time="2025-12-12T18:43:34.693902750Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:34.698049 containerd[1769]: time="2025-12-12T18:43:34.698011793Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:34.699099 containerd[1769]: time="2025-12-12T18:43:34.698885844Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 1.974557315s" Dec 12 18:43:34.699099 containerd[1769]: time="2025-12-12T18:43:34.698914929Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Dec 12 18:43:36.878236 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Dec 12 18:43:36.879774 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:43:37.022948 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:43:37.026824 (kubelet)[2550]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 18:43:37.062395 kubelet[2550]: E1212 18:43:37.062355 2550 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 18:43:37.064725 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 18:43:37.064848 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 18:43:37.065117 systemd[1]: kubelet.service: Consumed 169ms CPU time, 112.1M memory peak. Dec 12 18:43:37.070596 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:43:37.070851 systemd[1]: kubelet.service: Consumed 169ms CPU time, 112.1M memory peak. Dec 12 18:43:37.073264 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:43:37.097671 systemd[1]: Reload requested from client PID 2569 ('systemctl') (unit session-9.scope)... Dec 12 18:43:37.097686 systemd[1]: Reloading... Dec 12 18:43:37.165273 zram_generator::config[2612]: No configuration found. Dec 12 18:43:37.348171 systemd[1]: Reloading finished in 250 ms. Dec 12 18:43:37.415014 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:43:37.417152 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
Dec 12 18:43:37.418880 systemd[1]: kubelet.service: Deactivated successfully. Dec 12 18:43:37.419096 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:43:37.419130 systemd[1]: kubelet.service: Consumed 101ms CPU time, 98.4M memory peak. Dec 12 18:43:37.420513 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:43:37.559160 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:43:37.563349 (kubelet)[2668]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 18:43:37.608456 kubelet[2668]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 18:43:37.608456 kubelet[2668]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 18:43:37.608456 kubelet[2668]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 18:43:37.608456 kubelet[2668]: I1212 18:43:37.608426 2668 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 18:43:37.770218 update_engine[1754]: I20251212 18:43:37.770118 1754 update_attempter.cc:509] Updating boot flags... Dec 12 18:43:37.848380 kubelet[2668]: I1212 18:43:37.848340 2668 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Dec 12 18:43:37.848380 kubelet[2668]: I1212 18:43:37.848368 2668 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 18:43:37.848650 kubelet[2668]: I1212 18:43:37.848634 2668 server.go:954] "Client rotation is on, will bootstrap in background" Dec 12 18:43:37.890165 kubelet[2668]: E1212 18:43:37.890119 2668 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.8.97:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.8.97:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:43:37.893340 kubelet[2668]: I1212 18:43:37.893304 2668 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 18:43:37.901712 kubelet[2668]: I1212 18:43:37.901672 2668 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 18:43:37.908368 kubelet[2668]: I1212 18:43:37.908012 2668 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Dec 12 18:43:37.908368 kubelet[2668]: I1212 18:43:37.908257 2668 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 18:43:37.908522 kubelet[2668]: I1212 18:43:37.908285 2668 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-4-78a5f49b53","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 18:43:37.909735 kubelet[2668]: I1212 18:43:37.909561 2668 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 18:43:37.909735 kubelet[2668]: I1212 18:43:37.909592 2668 container_manager_linux.go:304] "Creating device plugin manager" Dec 12 18:43:37.909735 kubelet[2668]: I1212 18:43:37.909710 2668 state_mem.go:36] "Initialized new in-memory state store" Dec 12 18:43:37.915216 kubelet[2668]: I1212 18:43:37.915130 2668 kubelet.go:446] "Attempting to sync node with API server" Dec 12 18:43:37.915216 kubelet[2668]: I1212 18:43:37.915164 2668 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 18:43:37.915216 kubelet[2668]: I1212 18:43:37.915186 2668 kubelet.go:352] "Adding apiserver pod source" Dec 12 18:43:37.915216 kubelet[2668]: I1212 18:43:37.915198 2668 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 18:43:37.917691 kubelet[2668]: W1212 18:43:37.917629 2668 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.8.97:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.8.97:6443: connect: connection refused Dec 12 18:43:37.917691 kubelet[2668]: W1212 18:43:37.917631 2668 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.8.97:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-4-78a5f49b53&limit=500&resourceVersion=0": dial tcp 10.0.8.97:6443: connect: connection refused Dec 12 18:43:37.917691 kubelet[2668]: E1212 18:43:37.917690 2668 reflector.go:166] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.8.97:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.8.97:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:43:37.917832 kubelet[2668]: E1212 18:43:37.917697 2668 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.8.97:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-4-78a5f49b53&limit=500&resourceVersion=0\": dial tcp 10.0.8.97:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:43:37.919462 kubelet[2668]: I1212 18:43:37.919416 2668 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 18:43:37.919809 kubelet[2668]: I1212 18:43:37.919788 2668 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 12 18:43:37.919852 kubelet[2668]: W1212 18:43:37.919842 2668 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Dec 12 18:43:37.922121 kubelet[2668]: I1212 18:43:37.922082 2668 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 12 18:43:37.922121 kubelet[2668]: I1212 18:43:37.922123 2668 server.go:1287] "Started kubelet" Dec 12 18:43:37.922894 kubelet[2668]: I1212 18:43:37.922826 2668 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Dec 12 18:43:37.923408 kubelet[2668]: I1212 18:43:37.923337 2668 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 18:43:37.924519 kubelet[2668]: I1212 18:43:37.923744 2668 server.go:479] "Adding debug handlers to kubelet server" Dec 12 18:43:37.924519 kubelet[2668]: I1212 18:43:37.924428 2668 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 12 18:43:37.926084 kubelet[2668]: I1212 18:43:37.924995 2668 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 18:43:37.926084 kubelet[2668]: E1212 18:43:37.925187 2668 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-4-78a5f49b53\" not found" Dec 12 18:43:37.926084 kubelet[2668]: I1212 18:43:37.925933 2668 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 18:43:37.926184 kubelet[2668]: W1212 18:43:37.926084 2668 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.8.97:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.8.97:6443: connect: connection refused Dec 12 18:43:37.926184 kubelet[2668]: E1212 18:43:37.926128 2668 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.8.97:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.8.97:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:43:37.926184 kubelet[2668]: I1212 18:43:37.926149 2668 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 12 18:43:37.926184 kubelet[2668]: I1212 18:43:37.926164 2668 reconciler.go:26] "Reconciler: start to sync state" Dec 12 18:43:37.926184 kubelet[2668]: E1212 18:43:37.926150 2668 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.8.97:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-4-78a5f49b53?timeout=10s\": dial tcp 10.0.8.97:6443: connect: connection refused" interval="200ms" Dec 12 18:43:37.926184 kubelet[2668]: I1212 18:43:37.926164 2668 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 18:43:37.926395 kubelet[2668]: I1212 18:43:37.926350 2668 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 18:43:37.928532 kubelet[2668]: I1212 18:43:37.927351 2668 factory.go:221] Registration of the containerd container factory successfully Dec 12 18:43:37.928532 kubelet[2668]: I1212 18:43:37.927364 2668 factory.go:221] Registration of the systemd container factory successfully Dec 12 18:43:37.928532 kubelet[2668]: E1212 18:43:37.927767 2668 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 18:43:37.930180 kubelet[2668]: E1212 18:43:37.928086 2668 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.8.97:6443/api/v1/namespaces/default/events\": dial tcp 10.0.8.97:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-4-78a5f49b53.18808c07891a56a4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-4-78a5f49b53,UID:ci-4459-2-2-4-78a5f49b53,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-4-78a5f49b53,},FirstTimestamp:2025-12-12 18:43:37.922098852 +0000 UTC m=+0.355159440,LastTimestamp:2025-12-12 18:43:37.922098852 +0000 UTC m=+0.355159440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-4-78a5f49b53,}" Dec 12 18:43:37.939786 kubelet[2668]: I1212 18:43:37.939759 2668 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 18:43:37.939786 kubelet[2668]: I1212 18:43:37.939776 2668 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 18:43:37.939786 kubelet[2668]: I1212 18:43:37.939791 2668 state_mem.go:36] "Initialized new in-memory state store" Dec 12 18:43:37.943180 kubelet[2668]: I1212 18:43:37.943158 2668 policy_none.go:49] "None policy: Start" Dec 12 18:43:37.943225 kubelet[2668]: I1212 18:43:37.943188 2668 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 12 18:43:37.943225 kubelet[2668]: I1212 18:43:37.943200 2668 state_mem.go:35] "Initializing new in-memory state store" Dec 12 18:43:37.943414 kubelet[2668]: I1212 18:43:37.943374 2668 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 12 18:43:37.944767 kubelet[2668]: I1212 18:43:37.944749 2668 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 12 18:43:37.944850 kubelet[2668]: I1212 18:43:37.944773 2668 status_manager.go:227] "Starting to sync pod status with apiserver" Dec 12 18:43:37.944850 kubelet[2668]: I1212 18:43:37.944797 2668 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
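The undeliverable event above is informative beyond its error: its timestamps carry Go's monotonic clock suffix (`m=+0.355159440`, seconds relative to process start), so the kubelet's start time can be recovered by subtracting the offset from the wall-clock value. A small sketch with the exact figures from the event (rounded to microseconds, which is all `datetime` holds):

```python
from datetime import datetime, timedelta, timezone

# FirstTimestamp from the event above: 18:43:37.922098852, m=+0.355159440.
# Subtracting the monotonic offset approximates kubelet process start.
wall = datetime(2025, 12, 12, 18, 43, 37, 922099, tzinfo=timezone.utc)
mono_offset = timedelta(seconds=0.355159440)

print(wall - mono_offset)   # ~2025-12-12 18:43:37.566940+00:00
```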
Dec 12 18:43:37.944850 kubelet[2668]: I1212 18:43:37.944806 2668 kubelet.go:2382] "Starting kubelet main sync loop" Dec 12 18:43:37.944910 kubelet[2668]: E1212 18:43:37.944853 2668 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 18:43:37.946932 kubelet[2668]: W1212 18:43:37.946846 2668 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.8.97:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.8.97:6443: connect: connection refused Dec 12 18:43:37.946932 kubelet[2668]: E1212 18:43:37.946882 2668 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.8.97:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.8.97:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:43:37.949650 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 12 18:43:37.967493 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 12 18:43:37.970054 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 12 18:43:37.985363 kubelet[2668]: I1212 18:43:37.985267 2668 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 12 18:43:37.985720 kubelet[2668]: I1212 18:43:37.985505 2668 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 18:43:37.985720 kubelet[2668]: I1212 18:43:37.985525 2668 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 18:43:37.985786 kubelet[2668]: I1212 18:43:37.985778 2668 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 18:43:37.986881 kubelet[2668]: E1212 18:43:37.986849 2668 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 12 18:43:37.986946 kubelet[2668]: E1212 18:43:37.986890 2668 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-4-78a5f49b53\" not found" Dec 12 18:43:38.053263 systemd[1]: Created slice kubepods-burstable-pod776b2f47aec24e658db0dd7317d23921.slice - libcontainer container kubepods-burstable-pod776b2f47aec24e658db0dd7317d23921.slice. Dec 12 18:43:38.086787 kubelet[2668]: E1212 18:43:38.086727 2668 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-78a5f49b53\" not found" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.087577 kubelet[2668]: I1212 18:43:38.087550 2668 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.087982 kubelet[2668]: E1212 18:43:38.087949 2668 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.8.97:6443/api/v1/nodes\": dial tcp 10.0.8.97:6443: connect: connection refused" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.089624 systemd[1]: Created slice kubepods-burstable-pode9bc71845d827c274b37135e2a4491a2.slice - libcontainer container kubepods-burstable-pode9bc71845d827c274b37135e2a4491a2.slice. 
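The Container Manager dump earlier in this boot prints the hard eviction thresholds in force (memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, imagefs.inodesFree < 5%). A sketch of how such signals reduce to an evict/no-evict decision; the thresholds are the ones from the log, while the sample observation at the end is made up for illustration:

```python
# Hard eviction thresholds as printed in the Container Manager config above.
# Quantity thresholds are absolute bytes; the rest are fractions of capacity.
THRESHOLDS = {
    "memory.available":   ("quantity", 100 * 1024 * 1024),  # 100Mi
    "nodefs.available":   ("percent", 0.10),
    "nodefs.inodesFree":  ("percent", 0.05),
    "imagefs.available":  ("percent", 0.15),
    "imagefs.inodesFree": ("percent", 0.05),
}

def should_evict(signal: str, available: float, capacity: float) -> bool:
    """True when the observed signal falls below its hard threshold."""
    kind, value = THRESHOLDS[signal]
    limit = value if kind == "quantity" else value * capacity
    return available < limit

# Illustrative numbers only: 512Mi free of 8Gi memory -> no eviction.
print(should_evict("memory.available", 512 * 1024**2, 8 * 1024**3))  # False
```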
Dec 12 18:43:38.100811 kubelet[2668]: E1212 18:43:38.100723 2668 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-78a5f49b53\" not found" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.103410 systemd[1]: Created slice kubepods-burstable-podb9871c555250693d961009447d336dba.slice - libcontainer container kubepods-burstable-podb9871c555250693d961009447d336dba.slice. Dec 12 18:43:38.104931 kubelet[2668]: E1212 18:43:38.104900 2668 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-78a5f49b53\" not found" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.127172 kubelet[2668]: I1212 18:43:38.127112 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e9bc71845d827c274b37135e2a4491a2-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-4-78a5f49b53\" (UID: \"e9bc71845d827c274b37135e2a4491a2\") " pod="kube-system/kube-apiserver-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.127172 kubelet[2668]: I1212 18:43:38.127151 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b9871c555250693d961009447d336dba-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-4-78a5f49b53\" (UID: \"b9871c555250693d961009447d336dba\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.127172 kubelet[2668]: I1212 18:43:38.127172 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b9871c555250693d961009447d336dba-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-4-78a5f49b53\" (UID: \"b9871c555250693d961009447d336dba\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.127172 kubelet[2668]: I1212 18:43:38.127185 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b9871c555250693d961009447d336dba-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-4-78a5f49b53\" (UID: \"b9871c555250693d961009447d336dba\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.127389 kubelet[2668]: I1212 18:43:38.127202 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/776b2f47aec24e658db0dd7317d23921-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-4-78a5f49b53\" (UID: \"776b2f47aec24e658db0dd7317d23921\") " pod="kube-system/kube-scheduler-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.127389 kubelet[2668]: I1212 18:43:38.127217 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e9bc71845d827c274b37135e2a4491a2-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-4-78a5f49b53\" (UID: \"e9bc71845d827c274b37135e2a4491a2\") " pod="kube-system/kube-apiserver-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.127389 kubelet[2668]: I1212 18:43:38.127231 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b9871c555250693d961009447d336dba-kubeconfig\") pod 
\"kube-controller-manager-ci-4459-2-2-4-78a5f49b53\" (UID: \"b9871c555250693d961009447d336dba\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.127389 kubelet[2668]: I1212 18:43:38.127245 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b9871c555250693d961009447d336dba-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-4-78a5f49b53\" (UID: \"b9871c555250693d961009447d336dba\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.127389 kubelet[2668]: I1212 18:43:38.127261 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e9bc71845d827c274b37135e2a4491a2-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-4-78a5f49b53\" (UID: \"e9bc71845d827c274b37135e2a4491a2\") " pod="kube-system/kube-apiserver-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.127512 kubelet[2668]: E1212 18:43:38.127299 2668 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.8.97:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-4-78a5f49b53?timeout=10s\": dial tcp 10.0.8.97:6443: connect: connection refused" interval="400ms" Dec 12 18:43:38.289650 kubelet[2668]: I1212 18:43:38.289534 2668 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.290171 kubelet[2668]: E1212 18:43:38.290129 2668 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.8.97:6443/api/v1/nodes\": dial tcp 10.0.8.97:6443: connect: connection refused" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.388845 containerd[1769]: time="2025-12-12T18:43:38.388797942Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-4-78a5f49b53,Uid:776b2f47aec24e658db0dd7317d23921,Namespace:kube-system,Attempt:0,}" Dec 12 18:43:38.401913 containerd[1769]: time="2025-12-12T18:43:38.401681780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-4-78a5f49b53,Uid:e9bc71845d827c274b37135e2a4491a2,Namespace:kube-system,Attempt:0,}" Dec 12 18:43:38.406478 containerd[1769]: time="2025-12-12T18:43:38.406430529Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-4-78a5f49b53,Uid:b9871c555250693d961009447d336dba,Namespace:kube-system,Attempt:0,}" Dec 12 18:43:38.432778 containerd[1769]: time="2025-12-12T18:43:38.432731957Z" level=info msg="connecting to shim d92c4afe7398a41dfbcc1b1ea3288e2d206124a5d9cdf673b07ca58429055ccc" address="unix:///run/containerd/s/37491b236897be3031d0b90696f2993637d9994c2df23cdb737d9c0438c95c15" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:43:38.449670 containerd[1769]: time="2025-12-12T18:43:38.449633326Z" level=info msg="connecting to shim 47cd697f9a67fcf2b5ccc2a807412d37bd6a8b42be5db24e7b93e1af080476d1" address="unix:///run/containerd/s/ef1750b6067fb2455b26d21baeaf22ae1d4b13334aeec4411dbb8de576ae18be" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:43:38.454446 containerd[1769]: time="2025-12-12T18:43:38.454043603Z" level=info msg="connecting to shim a3f225da887e336afb06f82907544369396e7bc97bb5f03f1c105da8f17f245e" address="unix:///run/containerd/s/40eba25b9b0555f9812162942ed3d4a42bc08ba6d71443d6cda043ba58052d9c" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:43:38.457270 
systemd[1]: Started cri-containerd-d92c4afe7398a41dfbcc1b1ea3288e2d206124a5d9cdf673b07ca58429055ccc.scope - libcontainer container d92c4afe7398a41dfbcc1b1ea3288e2d206124a5d9cdf673b07ca58429055ccc. Dec 12 18:43:38.481397 systemd[1]: Started cri-containerd-47cd697f9a67fcf2b5ccc2a807412d37bd6a8b42be5db24e7b93e1af080476d1.scope - libcontainer container 47cd697f9a67fcf2b5ccc2a807412d37bd6a8b42be5db24e7b93e1af080476d1. Dec 12 18:43:38.482611 systemd[1]: Started cri-containerd-a3f225da887e336afb06f82907544369396e7bc97bb5f03f1c105da8f17f245e.scope - libcontainer container a3f225da887e336afb06f82907544369396e7bc97bb5f03f1c105da8f17f245e. Dec 12 18:43:38.506797 containerd[1769]: time="2025-12-12T18:43:38.506587901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-4-78a5f49b53,Uid:776b2f47aec24e658db0dd7317d23921,Namespace:kube-system,Attempt:0,} returns sandbox id \"d92c4afe7398a41dfbcc1b1ea3288e2d206124a5d9cdf673b07ca58429055ccc\"" Dec 12 18:43:38.512467 containerd[1769]: time="2025-12-12T18:43:38.512432504Z" level=info msg="CreateContainer within sandbox \"d92c4afe7398a41dfbcc1b1ea3288e2d206124a5d9cdf673b07ca58429055ccc\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 12 18:43:38.527987 kubelet[2668]: E1212 18:43:38.527947 2668 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.8.97:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-4-78a5f49b53?timeout=10s\": dial tcp 10.0.8.97:6443: connect: connection refused" interval="800ms" Dec 12 18:43:38.548423 containerd[1769]: time="2025-12-12T18:43:38.548300031Z" level=info msg="Container d41abe8e3d62df0622cbbb1f097455ca2efdb02722843f0a22cbf0fc5e1b74a0: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:43:38.550200 containerd[1769]: time="2025-12-12T18:43:38.550160029Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-4-78a5f49b53,Uid:e9bc71845d827c274b37135e2a4491a2,Namespace:kube-system,Attempt:0,} returns sandbox id \"a3f225da887e336afb06f82907544369396e7bc97bb5f03f1c105da8f17f245e\"" Dec 12 18:43:38.551665 containerd[1769]: time="2025-12-12T18:43:38.551636818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-4-78a5f49b53,Uid:b9871c555250693d961009447d336dba,Namespace:kube-system,Attempt:0,} returns sandbox id \"47cd697f9a67fcf2b5ccc2a807412d37bd6a8b42be5db24e7b93e1af080476d1\"" Dec 12 18:43:38.553107 containerd[1769]: time="2025-12-12T18:43:38.553085204Z" level=info msg="CreateContainer within sandbox \"a3f225da887e336afb06f82907544369396e7bc97bb5f03f1c105da8f17f245e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 12 18:43:38.553818 containerd[1769]: time="2025-12-12T18:43:38.553544584Z" level=info msg="CreateContainer within sandbox \"47cd697f9a67fcf2b5ccc2a807412d37bd6a8b42be5db24e7b93e1af080476d1\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 12 18:43:38.562510 containerd[1769]: time="2025-12-12T18:43:38.562459031Z" level=info msg="CreateContainer within sandbox \"d92c4afe7398a41dfbcc1b1ea3288e2d206124a5d9cdf673b07ca58429055ccc\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"d41abe8e3d62df0622cbbb1f097455ca2efdb02722843f0a22cbf0fc5e1b74a0\"" Dec 12 18:43:38.563026 containerd[1769]: time="2025-12-12T18:43:38.562994448Z" level=info msg="StartContainer for \"d41abe8e3d62df0622cbbb1f097455ca2efdb02722843f0a22cbf0fc5e1b74a0\"" Dec 12 18:43:38.564285 
containerd[1769]: time="2025-12-12T18:43:38.564239522Z" level=info msg="connecting to shim d41abe8e3d62df0622cbbb1f097455ca2efdb02722843f0a22cbf0fc5e1b74a0" address="unix:///run/containerd/s/37491b236897be3031d0b90696f2993637d9994c2df23cdb737d9c0438c95c15" protocol=ttrpc version=3 Dec 12 18:43:38.566269 containerd[1769]: time="2025-12-12T18:43:38.566218430Z" level=info msg="Container 7deecb886d9666cc93807bc849455a1b5b31420fb22befaae4518e78bfde00de: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:43:38.570451 containerd[1769]: time="2025-12-12T18:43:38.570172462Z" level=info msg="Container 2c4189e9da9cee3fe89ff8680b728d7063486d27f00175ca6babfb134472f80f: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:43:38.584254 containerd[1769]: time="2025-12-12T18:43:38.584190559Z" level=info msg="CreateContainer within sandbox \"a3f225da887e336afb06f82907544369396e7bc97bb5f03f1c105da8f17f245e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"7deecb886d9666cc93807bc849455a1b5b31420fb22befaae4518e78bfde00de\"" Dec 12 18:43:38.584303 systemd[1]: Started cri-containerd-d41abe8e3d62df0622cbbb1f097455ca2efdb02722843f0a22cbf0fc5e1b74a0.scope - libcontainer container d41abe8e3d62df0622cbbb1f097455ca2efdb02722843f0a22cbf0fc5e1b74a0. Dec 12 18:43:38.585080 containerd[1769]: time="2025-12-12T18:43:38.584645211Z" level=info msg="StartContainer for \"7deecb886d9666cc93807bc849455a1b5b31420fb22befaae4518e78bfde00de\"" Dec 12 18:43:38.585915 containerd[1769]: time="2025-12-12T18:43:38.585888950Z" level=info msg="connecting to shim 7deecb886d9666cc93807bc849455a1b5b31420fb22befaae4518e78bfde00de" address="unix:///run/containerd/s/40eba25b9b0555f9812162942ed3d4a42bc08ba6d71443d6cda043ba58052d9c" protocol=ttrpc version=3 Dec 12 18:43:38.586162 containerd[1769]: time="2025-12-12T18:43:38.585939255Z" level=info msg="CreateContainer within sandbox \"47cd697f9a67fcf2b5ccc2a807412d37bd6a8b42be5db24e7b93e1af080476d1\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"2c4189e9da9cee3fe89ff8680b728d7063486d27f00175ca6babfb134472f80f\"" Dec 12 18:43:38.586626 containerd[1769]: time="2025-12-12T18:43:38.586590027Z" level=info msg="StartContainer for \"2c4189e9da9cee3fe89ff8680b728d7063486d27f00175ca6babfb134472f80f\"" Dec 12 18:43:38.588417 containerd[1769]: time="2025-12-12T18:43:38.588150247Z" level=info msg="connecting to shim 2c4189e9da9cee3fe89ff8680b728d7063486d27f00175ca6babfb134472f80f" address="unix:///run/containerd/s/ef1750b6067fb2455b26d21baeaf22ae1d4b13334aeec4411dbb8de576ae18be" protocol=ttrpc version=3 Dec 12 18:43:38.607319 systemd[1]: Started cri-containerd-7deecb886d9666cc93807bc849455a1b5b31420fb22befaae4518e78bfde00de.scope - libcontainer container 7deecb886d9666cc93807bc849455a1b5b31420fb22befaae4518e78bfde00de. Dec 12 18:43:38.611578 systemd[1]: Started cri-containerd-2c4189e9da9cee3fe89ff8680b728d7063486d27f00175ca6babfb134472f80f.scope - libcontainer container 2c4189e9da9cee3fe89ff8680b728d7063486d27f00175ca6babfb134472f80f. 
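Note how the lease controller's retry interval doubles across the failed attempts logged above: 200ms, then 400ms, then 800ms. A sketch of that capped doubling backoff; the base and the doubling are what this log shows, while the 7s cap is an assumption for the sketch, not something the log prints:

```python
import itertools

def lease_retry_intervals(base_ms: int = 200, cap_ms: int = 7000):
    """Doubling retry interval, matching the 200ms -> 400ms -> 800ms
    progression in the "Failed to ensure lease exists" lines above.
    The cap is an assumed bound for illustration."""
    interval = base_ms
    while True:
        yield interval
        interval = min(interval * 2, cap_ms)

print(list(itertools.islice(lease_retry_intervals(), 6)))
# [200, 400, 800, 1600, 3200, 6400]
```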
Dec 12 18:43:38.652976 containerd[1769]: time="2025-12-12T18:43:38.652925181Z" level=info msg="StartContainer for \"d41abe8e3d62df0622cbbb1f097455ca2efdb02722843f0a22cbf0fc5e1b74a0\" returns successfully" Dec 12 18:43:38.664101 containerd[1769]: time="2025-12-12T18:43:38.664032086Z" level=info msg="StartContainer for \"7deecb886d9666cc93807bc849455a1b5b31420fb22befaae4518e78bfde00de\" returns successfully" Dec 12 18:43:38.666605 containerd[1769]: time="2025-12-12T18:43:38.666567694Z" level=info msg="StartContainer for \"2c4189e9da9cee3fe89ff8680b728d7063486d27f00175ca6babfb134472f80f\" returns successfully" Dec 12 18:43:38.692879 kubelet[2668]: I1212 18:43:38.692824 2668 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.693343 kubelet[2668]: E1212 18:43:38.693176 2668 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.8.97:6443/api/v1/nodes\": dial tcp 10.0.8.97:6443: connect: connection refused" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.951308 kubelet[2668]: E1212 18:43:38.951278 2668 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-78a5f49b53\" not found" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.954305 kubelet[2668]: E1212 18:43:38.954276 2668 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-78a5f49b53\" not found" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:38.955064 kubelet[2668]: E1212 18:43:38.955045 2668 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-78a5f49b53\" not found" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:39.495711 kubelet[2668]: I1212 18:43:39.495515 2668 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:39.505747 kubelet[2668]: E1212 18:43:39.505709 2668 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-2-4-78a5f49b53\" not found" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:39.612484 kubelet[2668]: I1212 18:43:39.612191 2668 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:39.626632 kubelet[2668]: I1212 18:43:39.626593 2668 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:39.630620 kubelet[2668]: E1212 18:43:39.630558 2668 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-4-78a5f49b53\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:39.630620 kubelet[2668]: I1212 18:43:39.630586 2668 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:39.632080 kubelet[2668]: E1212 18:43:39.632053 2668 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-4-78a5f49b53\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:39.632246 kubelet[2668]: I1212 18:43:39.632176 2668 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:39.633421 kubelet[2668]: E1212 18:43:39.633401 2668 kubelet.go:3196] "Failed 
creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-4-78a5f49b53\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:39.916595 kubelet[2668]: I1212 18:43:39.916511 2668 apiserver.go:52] "Watching apiserver" Dec 12 18:43:39.926979 kubelet[2668]: I1212 18:43:39.926937 2668 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 12 18:43:39.955651 kubelet[2668]: I1212 18:43:39.955627 2668 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:39.955938 kubelet[2668]: I1212 18:43:39.955868 2668 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:39.957554 kubelet[2668]: E1212 18:43:39.957532 2668 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-4-78a5f49b53\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:39.957661 kubelet[2668]: E1212 18:43:39.957581 2668 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-4-78a5f49b53\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:41.529862 systemd[1]: Reload requested from client PID 2968 ('systemctl') (unit session-9.scope)... Dec 12 18:43:41.529877 systemd[1]: Reloading... Dec 12 18:43:41.593119 zram_generator::config[3011]: No configuration found. Dec 12 18:43:41.775245 systemd[1]: Reloading finished in 245 ms. Dec 12 18:43:41.806495 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:43:41.819300 systemd[1]: kubelet.service: Deactivated successfully. Dec 12 18:43:41.819541 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:43:41.819602 systemd[1]: kubelet.service: Consumed 714ms CPU time, 135.3M memory peak. Dec 12 18:43:41.821191 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:43:41.973155 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:43:41.977081 (kubelet)[3062]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 18:43:42.011745 kubelet[3062]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 18:43:42.011745 kubelet[3062]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 18:43:42.011745 kubelet[3062]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
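The restarted kubelet again warns that --container-runtime-endpoint and --volume-plugin-dir should live in the config file rather than on the command line. A hypothetical migration sketch follows; the volume plugin path is the one this log prints in the Flexvolume line, while the endpoint value and the KubeletConfiguration field names are assumptions taken from upstream documentation, not something this log shows (assumes PyYAML is installed):

```python
import yaml  # PyYAML

# Hypothetical move of the deprecated flags into a KubeletConfiguration.
# Field names are assumed from upstream kubelet config docs.
config = {
    "apiVersion": "kubelet.config.k8s.io/v1beta1",
    "kind": "KubeletConfiguration",
    "containerRuntimeEndpoint": "unix:///run/containerd/containerd.sock",  # assumed default path
    "volumePluginDir": "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/",  # path from this log
}

print(yaml.safe_dump(config, sort_keys=False))
```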
Dec 12 18:43:42.012102 kubelet[3062]: I1212 18:43:42.011792 3062 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 18:43:42.017514 kubelet[3062]: I1212 18:43:42.017476 3062 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Dec 12 18:43:42.017514 kubelet[3062]: I1212 18:43:42.017502 3062 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 18:43:42.017732 kubelet[3062]: I1212 18:43:42.017718 3062 server.go:954] "Client rotation is on, will bootstrap in background" Dec 12 18:43:42.020499 kubelet[3062]: I1212 18:43:42.020461 3062 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 12 18:43:42.022365 kubelet[3062]: I1212 18:43:42.022251 3062 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 18:43:42.025582 kubelet[3062]: I1212 18:43:42.025564 3062 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 18:43:42.031677 kubelet[3062]: I1212 18:43:42.031621 3062 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Dec 12 18:43:42.031876 kubelet[3062]: I1212 18:43:42.031839 3062 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 18:43:42.032031 kubelet[3062]: I1212 18:43:42.031866 3062 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-4-78a5f49b53","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 18:43:42.032130 kubelet[3062]: I1212 18:43:42.032038 3062 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 18:43:42.032130 kubelet[3062]: I1212 18:43:42.032046 3062 container_manager_linux.go:304] "Creating device plugin manager" Dec 12 18:43:42.032130 kubelet[3062]: I1212 18:43:42.032101 3062 state_mem.go:36] "Initialized new in-memory state store" Dec 12 18:43:42.032254 
kubelet[3062]: I1212 18:43:42.032247 3062 kubelet.go:446] "Attempting to sync node with API server" Dec 12 18:43:42.032281 kubelet[3062]: I1212 18:43:42.032267 3062 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 18:43:42.032302 kubelet[3062]: I1212 18:43:42.032284 3062 kubelet.go:352] "Adding apiserver pod source" Dec 12 18:43:42.032302 kubelet[3062]: I1212 18:43:42.032294 3062 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 18:43:42.033207 kubelet[3062]: I1212 18:43:42.033177 3062 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 18:43:42.033670 kubelet[3062]: I1212 18:43:42.033619 3062 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 12 18:43:42.034147 kubelet[3062]: I1212 18:43:42.034128 3062 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 12 18:43:42.034235 kubelet[3062]: I1212 18:43:42.034160 3062 server.go:1287] "Started kubelet" Dec 12 18:43:42.034736 kubelet[3062]: I1212 18:43:42.034680 3062 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 18:43:42.035061 kubelet[3062]: I1212 18:43:42.035030 3062 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Dec 12 18:43:42.035143 kubelet[3062]: I1212 18:43:42.035132 3062 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 18:43:42.036435 kubelet[3062]: I1212 18:43:42.036363 3062 server.go:479] "Adding debug handlers to kubelet server" Dec 12 18:43:42.037835 kubelet[3062]: E1212 18:43:42.037812 3062 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 18:43:42.038090 kubelet[3062]: I1212 18:43:42.038077 3062 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 18:43:42.038330 kubelet[3062]: I1212 18:43:42.038317 3062 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 18:43:42.039885 kubelet[3062]: E1212 18:43:42.039824 3062 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-4-78a5f49b53\" not found" Dec 12 18:43:42.040412 kubelet[3062]: I1212 18:43:42.040386 3062 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 12 18:43:42.041052 kubelet[3062]: I1212 18:43:42.040998 3062 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 12 18:43:42.041892 kubelet[3062]: I1212 18:43:42.041320 3062 reconciler.go:26] "Reconciler: start to sync state" Dec 12 18:43:42.042171 kubelet[3062]: I1212 18:43:42.042153 3062 factory.go:221] Registration of the systemd container factory successfully Dec 12 18:43:42.042370 kubelet[3062]: I1212 18:43:42.042318 3062 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 18:43:42.044015 kubelet[3062]: I1212 18:43:42.043992 3062 factory.go:221] Registration of the containerd container factory successfully Dec 12 18:43:42.063243 kubelet[3062]: I1212 18:43:42.062498 3062 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Dec 12 18:43:42.066146 kubelet[3062]: I1212 18:43:42.066113 3062 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 12 18:43:42.066239 kubelet[3062]: I1212 18:43:42.066154 3062 status_manager.go:227] "Starting to sync pod status with apiserver" Dec 12 18:43:42.066239 kubelet[3062]: I1212 18:43:42.066175 3062 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 12 18:43:42.066239 kubelet[3062]: I1212 18:43:42.066182 3062 kubelet.go:2382] "Starting kubelet main sync loop" Dec 12 18:43:42.066239 kubelet[3062]: E1212 18:43:42.066229 3062 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 18:43:42.086278 kubelet[3062]: I1212 18:43:42.086154 3062 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 18:43:42.086278 kubelet[3062]: I1212 18:43:42.086172 3062 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 18:43:42.086278 kubelet[3062]: I1212 18:43:42.086192 3062 state_mem.go:36] "Initialized new in-memory state store" Dec 12 18:43:42.086467 kubelet[3062]: I1212 18:43:42.086378 3062 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 12 18:43:42.086467 kubelet[3062]: I1212 18:43:42.086389 3062 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 12 18:43:42.086467 kubelet[3062]: I1212 18:43:42.086407 3062 policy_none.go:49] "None policy: Start" Dec 12 18:43:42.086467 kubelet[3062]: I1212 18:43:42.086416 3062 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 12 18:43:42.086467 kubelet[3062]: I1212 18:43:42.086427 3062 state_mem.go:35] "Initializing new in-memory state store" Dec 12 18:43:42.086558 kubelet[3062]: I1212 18:43:42.086519 3062 state_mem.go:75] "Updated machine memory state" Dec 12 18:43:42.090213 kubelet[3062]: I1212 18:43:42.090180 3062 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 12 18:43:42.090354 kubelet[3062]: I1212 18:43:42.090342 3062 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 18:43:42.090382 kubelet[3062]: I1212 18:43:42.090354 3062 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 18:43:42.090709 kubelet[3062]: I1212 18:43:42.090542 3062 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 18:43:42.091237 kubelet[3062]: E1212 18:43:42.091220 3062 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Dec 12 18:43:42.167876 kubelet[3062]: I1212 18:43:42.167833 3062 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.168042 kubelet[3062]: I1212 18:43:42.167899 3062 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.168291 kubelet[3062]: I1212 18:43:42.168273 3062 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.193223 kubelet[3062]: I1212 18:43:42.193186 3062 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.200867 kubelet[3062]: I1212 18:43:42.200816 3062 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.201007 kubelet[3062]: I1212 18:43:42.200891 3062 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.242477 kubelet[3062]: I1212 18:43:42.242414 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b9871c555250693d961009447d336dba-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-4-78a5f49b53\" (UID: \"b9871c555250693d961009447d336dba\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.242477 kubelet[3062]: I1212 18:43:42.242463 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b9871c555250693d961009447d336dba-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-4-78a5f49b53\" (UID: \"b9871c555250693d961009447d336dba\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.242477 kubelet[3062]: I1212 18:43:42.242486 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b9871c555250693d961009447d336dba-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-4-78a5f49b53\" (UID: \"b9871c555250693d961009447d336dba\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.242691 kubelet[3062]: I1212 18:43:42.242520 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e9bc71845d827c274b37135e2a4491a2-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-4-78a5f49b53\" (UID: \"e9bc71845d827c274b37135e2a4491a2\") " pod="kube-system/kube-apiserver-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.242691 kubelet[3062]: I1212 18:43:42.242538 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b9871c555250693d961009447d336dba-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-4-78a5f49b53\" (UID: \"b9871c555250693d961009447d336dba\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.242691 kubelet[3062]: I1212 18:43:42.242555 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b9871c555250693d961009447d336dba-usr-share-ca-certificates\") pod 
\"kube-controller-manager-ci-4459-2-2-4-78a5f49b53\" (UID: \"b9871c555250693d961009447d336dba\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.242691 kubelet[3062]: I1212 18:43:42.242572 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/776b2f47aec24e658db0dd7317d23921-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-4-78a5f49b53\" (UID: \"776b2f47aec24e658db0dd7317d23921\") " pod="kube-system/kube-scheduler-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.242691 kubelet[3062]: I1212 18:43:42.242595 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e9bc71845d827c274b37135e2a4491a2-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-4-78a5f49b53\" (UID: \"e9bc71845d827c274b37135e2a4491a2\") " pod="kube-system/kube-apiserver-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:42.242794 kubelet[3062]: I1212 18:43:42.242608 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e9bc71845d827c274b37135e2a4491a2-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-4-78a5f49b53\" (UID: \"e9bc71845d827c274b37135e2a4491a2\") " pod="kube-system/kube-apiserver-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:43.032974 kubelet[3062]: I1212 18:43:43.032927 3062 apiserver.go:52] "Watching apiserver" Dec 12 18:43:43.040652 kubelet[3062]: I1212 18:43:43.040533 3062 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 12 18:43:43.073490 kubelet[3062]: I1212 18:43:43.073425 3062 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:43.080282 kubelet[3062]: E1212 18:43:43.080029 3062 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-4-78a5f49b53\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-4-78a5f49b53" Dec 12 18:43:43.098271 kubelet[3062]: I1212 18:43:43.098206 3062 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-4-78a5f49b53" podStartSLOduration=1.098189874 podStartE2EDuration="1.098189874s" podCreationTimestamp="2025-12-12 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 18:43:43.09806516 +0000 UTC m=+1.117802056" watchObservedRunningTime="2025-12-12 18:43:43.098189874 +0000 UTC m=+1.117926749" Dec 12 18:43:43.098449 kubelet[3062]: I1212 18:43:43.098309 3062 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-4-78a5f49b53" podStartSLOduration=1.098304009 podStartE2EDuration="1.098304009s" podCreationTimestamp="2025-12-12 18:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 18:43:43.090057054 +0000 UTC m=+1.109793948" watchObservedRunningTime="2025-12-12 18:43:43.098304009 +0000 UTC m=+1.118040906" Dec 12 18:43:43.104917 kubelet[3062]: I1212 18:43:43.104862 3062 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-4-78a5f49b53" podStartSLOduration=1.104845258 podStartE2EDuration="1.104845258s" podCreationTimestamp="2025-12-12 18:43:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 18:43:43.104808385 +0000 UTC m=+1.124545281" watchObservedRunningTime="2025-12-12 18:43:43.104845258 +0000 UTC m=+1.124582154" Dec 12 18:43:47.032415 kubelet[3062]: I1212 18:43:47.032355 3062 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 12 18:43:47.032785 containerd[1769]: time="2025-12-12T18:43:47.032661246Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 12 18:43:47.032953 kubelet[3062]: I1212 18:43:47.032810 3062 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 12 18:43:47.743728 systemd[1]: Created slice kubepods-besteffort-podb9e4a0b6_3efd_47b0_92f9_c8e367d7d487.slice - libcontainer container kubepods-besteffort-podb9e4a0b6_3efd_47b0_92f9_c8e367d7d487.slice. Dec 12 18:43:47.777536 kubelet[3062]: I1212 18:43:47.777481 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/b9e4a0b6-3efd-47b0-92f9-c8e367d7d487-kube-proxy\") pod \"kube-proxy-nj2zc\" (UID: \"b9e4a0b6-3efd-47b0-92f9-c8e367d7d487\") " pod="kube-system/kube-proxy-nj2zc" Dec 12 18:43:47.777536 kubelet[3062]: I1212 18:43:47.777518 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9e4a0b6-3efd-47b0-92f9-c8e367d7d487-lib-modules\") pod \"kube-proxy-nj2zc\" (UID: \"b9e4a0b6-3efd-47b0-92f9-c8e367d7d487\") " pod="kube-system/kube-proxy-nj2zc" Dec 12 18:43:47.777536 kubelet[3062]: I1212 18:43:47.777537 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b9e4a0b6-3efd-47b0-92f9-c8e367d7d487-xtables-lock\") pod \"kube-proxy-nj2zc\" (UID: \"b9e4a0b6-3efd-47b0-92f9-c8e367d7d487\") " pod="kube-system/kube-proxy-nj2zc" Dec 12 18:43:47.777760 kubelet[3062]: I1212 18:43:47.777553 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psqn8\" (UniqueName: \"kubernetes.io/projected/b9e4a0b6-3efd-47b0-92f9-c8e367d7d487-kube-api-access-psqn8\") pod \"kube-proxy-nj2zc\" (UID: \"b9e4a0b6-3efd-47b0-92f9-c8e367d7d487\") " pod="kube-system/kube-proxy-nj2zc" Dec 12 18:43:48.060195 containerd[1769]: time="2025-12-12T18:43:48.059907358Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-nj2zc,Uid:b9e4a0b6-3efd-47b0-92f9-c8e367d7d487,Namespace:kube-system,Attempt:0,}" Dec 12 18:43:48.093847 containerd[1769]: time="2025-12-12T18:43:48.093798236Z" level=info msg="connecting to shim b1c793fabb5bc8a7224cec97e47c2abc9118b32d8c5bcbd904065b3a8c3087b4" address="unix:///run/containerd/s/3ba337e47bea273e71168851a8419ca00741b92fd83e5c3e7b45594bab19acac" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:43:48.111611 systemd[1]: Created slice kubepods-besteffort-podeef14882_8b6b_4161_a7c8_d2d8cb94c6ea.slice - libcontainer container kubepods-besteffort-podeef14882_8b6b_4161_a7c8_d2d8cb94c6ea.slice. Dec 12 18:43:48.128300 systemd[1]: Started cri-containerd-b1c793fabb5bc8a7224cec97e47c2abc9118b32d8c5bcbd904065b3a8c3087b4.scope - libcontainer container b1c793fabb5bc8a7224cec97e47c2abc9118b32d8c5bcbd904065b3a8c3087b4. 
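The systemd slice names in these entries follow a mechanical pattern: "kubepods-" plus the pod's QoS class, plus "pod" and the pod UID with dashes rewritten to underscores, suffixed ".slice". A minimal Go sketch of that mapping, checked against the kube-proxy pod above (the helper name is illustrative, not kubelet's actual code):

    package main

    import (
        "fmt"
        "strings"
    )

    // sliceNameFor is an illustrative reconstruction of the cgroup slice
    // naming visible in the journal: "kubepods-" + QoS class + "-pod" +
    // the pod UID with "-" mapped to "_", plus ".slice". It is not
    // kubelet's actual implementation.
    func sliceNameFor(qos, uid string) string {
        return "kubepods-" + qos + "-pod" + strings.ReplaceAll(uid, "-", "_") + ".slice"
    }

    func main() {
        // UID taken from the kube-proxy-nj2zc entries above.
        fmt.Println(sliceNameFor("besteffort", "b9e4a0b6-3efd-47b0-92f9-c8e367d7d487"))
        // Output: kubepods-besteffort-podb9e4a0b6_3efd_47b0_92f9_c8e367d7d487.slice
    }

The same pattern accounts for the other kubepods-besteffort-pod*.slice units created later in this log.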
Dec 12 18:43:48.149330 containerd[1769]: time="2025-12-12T18:43:48.149254084Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-nj2zc,Uid:b9e4a0b6-3efd-47b0-92f9-c8e367d7d487,Namespace:kube-system,Attempt:0,} returns sandbox id \"b1c793fabb5bc8a7224cec97e47c2abc9118b32d8c5bcbd904065b3a8c3087b4\"" Dec 12 18:43:48.152216 containerd[1769]: time="2025-12-12T18:43:48.152181126Z" level=info msg="CreateContainer within sandbox \"b1c793fabb5bc8a7224cec97e47c2abc9118b32d8c5bcbd904065b3a8c3087b4\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 12 18:43:48.164164 containerd[1769]: time="2025-12-12T18:43:48.164125535Z" level=info msg="Container 8bb19dc05987c004b88c76adc598ed29e9fcea910645b00ffb58bd19a6be597b: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:43:48.174280 containerd[1769]: time="2025-12-12T18:43:48.174234727Z" level=info msg="CreateContainer within sandbox \"b1c793fabb5bc8a7224cec97e47c2abc9118b32d8c5bcbd904065b3a8c3087b4\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"8bb19dc05987c004b88c76adc598ed29e9fcea910645b00ffb58bd19a6be597b\"" Dec 12 18:43:48.174954 containerd[1769]: time="2025-12-12T18:43:48.174746989Z" level=info msg="StartContainer for \"8bb19dc05987c004b88c76adc598ed29e9fcea910645b00ffb58bd19a6be597b\"" Dec 12 18:43:48.175925 containerd[1769]: time="2025-12-12T18:43:48.175897941Z" level=info msg="connecting to shim 8bb19dc05987c004b88c76adc598ed29e9fcea910645b00ffb58bd19a6be597b" address="unix:///run/containerd/s/3ba337e47bea273e71168851a8419ca00741b92fd83e5c3e7b45594bab19acac" protocol=ttrpc version=3 Dec 12 18:43:48.180485 kubelet[3062]: I1212 18:43:48.180450 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/eef14882-8b6b-4161-a7c8-d2d8cb94c6ea-var-lib-calico\") pod \"tigera-operator-7dcd859c48-xdz8b\" (UID: \"eef14882-8b6b-4161-a7c8-d2d8cb94c6ea\") " pod="tigera-operator/tigera-operator-7dcd859c48-xdz8b" Dec 12 18:43:48.180485 kubelet[3062]: I1212 18:43:48.180488 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxjd2\" (UniqueName: \"kubernetes.io/projected/eef14882-8b6b-4161-a7c8-d2d8cb94c6ea-kube-api-access-mxjd2\") pod \"tigera-operator-7dcd859c48-xdz8b\" (UID: \"eef14882-8b6b-4161-a7c8-d2d8cb94c6ea\") " pod="tigera-operator/tigera-operator-7dcd859c48-xdz8b" Dec 12 18:43:48.197350 systemd[1]: Started cri-containerd-8bb19dc05987c004b88c76adc598ed29e9fcea910645b00ffb58bd19a6be597b.scope - libcontainer container 8bb19dc05987c004b88c76adc598ed29e9fcea910645b00ffb58bd19a6be597b. 
Dec 12 18:43:48.286974 containerd[1769]: time="2025-12-12T18:43:48.286930910Z" level=info msg="StartContainer for \"8bb19dc05987c004b88c76adc598ed29e9fcea910645b00ffb58bd19a6be597b\" returns successfully" Dec 12 18:43:48.417390 containerd[1769]: time="2025-12-12T18:43:48.417345733Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-xdz8b,Uid:eef14882-8b6b-4161-a7c8-d2d8cb94c6ea,Namespace:tigera-operator,Attempt:0,}" Dec 12 18:43:48.442391 containerd[1769]: time="2025-12-12T18:43:48.442343859Z" level=info msg="connecting to shim d88cfc52bbb7af53cadd834591290a279fd9cda8988bba39be4ecfd66865b315" address="unix:///run/containerd/s/c86b23421add47bb414fca27a4abc8e978423e43b123f698ce17ab599e53e090" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:43:48.476432 systemd[1]: Started cri-containerd-d88cfc52bbb7af53cadd834591290a279fd9cda8988bba39be4ecfd66865b315.scope - libcontainer container d88cfc52bbb7af53cadd834591290a279fd9cda8988bba39be4ecfd66865b315. Dec 12 18:43:48.516428 containerd[1769]: time="2025-12-12T18:43:48.516372020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-xdz8b,Uid:eef14882-8b6b-4161-a7c8-d2d8cb94c6ea,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"d88cfc52bbb7af53cadd834591290a279fd9cda8988bba39be4ecfd66865b315\"" Dec 12 18:43:48.517942 containerd[1769]: time="2025-12-12T18:43:48.517913386Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 12 18:43:49.767186 kubelet[3062]: I1212 18:43:49.767124 3062 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-nj2zc" podStartSLOduration=2.767095717 podStartE2EDuration="2.767095717s" podCreationTimestamp="2025-12-12 18:43:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 18:43:49.096155767 +0000 UTC m=+7.115892653" watchObservedRunningTime="2025-12-12 18:43:49.767095717 +0000 UTC m=+7.786832648" Dec 12 18:43:50.295381 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4098869766.mount: Deactivated successfully. 
Dec 12 18:43:50.637494 containerd[1769]: time="2025-12-12T18:43:50.637431324Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:50.638512 containerd[1769]: time="2025-12-12T18:43:50.638482780Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691" Dec 12 18:43:50.640432 containerd[1769]: time="2025-12-12T18:43:50.640395622Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:50.643977 containerd[1769]: time="2025-12-12T18:43:50.643936242Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:43:50.644479 containerd[1769]: time="2025-12-12T18:43:50.644454517Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.126510158s" Dec 12 18:43:50.644509 containerd[1769]: time="2025-12-12T18:43:50.644481785Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Dec 12 18:43:50.646107 containerd[1769]: time="2025-12-12T18:43:50.646084032Z" level=info msg="CreateContainer within sandbox \"d88cfc52bbb7af53cadd834591290a279fd9cda8988bba39be4ecfd66865b315\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 12 18:43:50.655362 containerd[1769]: time="2025-12-12T18:43:50.655315780Z" level=info msg="Container 430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:43:50.663257 containerd[1769]: time="2025-12-12T18:43:50.663197069Z" level=info msg="CreateContainer within sandbox \"d88cfc52bbb7af53cadd834591290a279fd9cda8988bba39be4ecfd66865b315\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f\"" Dec 12 18:43:50.663699 containerd[1769]: time="2025-12-12T18:43:50.663678654Z" level=info msg="StartContainer for \"430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f\"" Dec 12 18:43:50.664361 containerd[1769]: time="2025-12-12T18:43:50.664334041Z" level=info msg="connecting to shim 430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f" address="unix:///run/containerd/s/c86b23421add47bb414fca27a4abc8e978423e43b123f698ce17ab599e53e090" protocol=ttrpc version=3 Dec 12 18:43:50.692441 systemd[1]: Started cri-containerd-430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f.scope - libcontainer container 430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f. 
Dec 12 18:43:50.718160 containerd[1769]: time="2025-12-12T18:43:50.718122314Z" level=info msg="StartContainer for \"430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f\" returns successfully" Dec 12 18:43:51.104102 kubelet[3062]: I1212 18:43:51.103923 3062 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-xdz8b" podStartSLOduration=0.976304415 podStartE2EDuration="3.103907698s" podCreationTimestamp="2025-12-12 18:43:48 +0000 UTC" firstStartedPulling="2025-12-12 18:43:48.517552341 +0000 UTC m=+6.537289215" lastFinishedPulling="2025-12-12 18:43:50.645155618 +0000 UTC m=+8.664892498" observedRunningTime="2025-12-12 18:43:51.103818081 +0000 UTC m=+9.123554976" watchObservedRunningTime="2025-12-12 18:43:51.103907698 +0000 UTC m=+9.123644645" Dec 12 18:43:55.470216 sudo[2057]: pam_unix(sudo:session): session closed for user root Dec 12 18:43:55.637252 sshd[2056]: Connection closed by 147.75.109.163 port 40776 Dec 12 18:43:55.637611 sshd-session[2053]: pam_unix(sshd:session): session closed for user core Dec 12 18:43:55.641301 systemd[1]: sshd@8-10.0.8.97:22-147.75.109.163:40776.service: Deactivated successfully. Dec 12 18:43:55.643389 systemd[1]: session-9.scope: Deactivated successfully. Dec 12 18:43:55.643589 systemd[1]: session-9.scope: Consumed 3.922s CPU time, 229.4M memory peak. Dec 12 18:43:55.644671 systemd-logind[1749]: Session 9 logged out. Waiting for processes to exit. Dec 12 18:43:55.646697 systemd-logind[1749]: Removed session 9. Dec 12 18:43:59.703117 systemd[1]: Created slice kubepods-besteffort-pod9533f08c_1d9c_41c8_995a_5265bcd45da0.slice - libcontainer container kubepods-besteffort-pod9533f08c_1d9c_41c8_995a_5265bcd45da0.slice. Dec 12 18:43:59.751246 kubelet[3062]: I1212 18:43:59.751186 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/9533f08c-1d9c-41c8-995a-5265bcd45da0-typha-certs\") pod \"calico-typha-59bd4dd7cc-l2bwf\" (UID: \"9533f08c-1d9c-41c8-995a-5265bcd45da0\") " pod="calico-system/calico-typha-59bd4dd7cc-l2bwf" Dec 12 18:43:59.751246 kubelet[3062]: I1212 18:43:59.751236 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9533f08c-1d9c-41c8-995a-5265bcd45da0-tigera-ca-bundle\") pod \"calico-typha-59bd4dd7cc-l2bwf\" (UID: \"9533f08c-1d9c-41c8-995a-5265bcd45da0\") " pod="calico-system/calico-typha-59bd4dd7cc-l2bwf" Dec 12 18:43:59.751246 kubelet[3062]: I1212 18:43:59.751255 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8kwl\" (UniqueName: \"kubernetes.io/projected/9533f08c-1d9c-41c8-995a-5265bcd45da0-kube-api-access-g8kwl\") pod \"calico-typha-59bd4dd7cc-l2bwf\" (UID: \"9533f08c-1d9c-41c8-995a-5265bcd45da0\") " pod="calico-system/calico-typha-59bd4dd7cc-l2bwf" Dec 12 18:43:59.893156 systemd[1]: Created slice kubepods-besteffort-podfe09f112_dc18_4084_b6b7_265cc036a513.slice - libcontainer container kubepods-besteffort-podfe09f112_dc18_4084_b6b7_265cc036a513.slice. 
Dec 12 18:43:59.953653 kubelet[3062]: I1212 18:43:59.953328 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/fe09f112-dc18-4084-b6b7-265cc036a513-cni-net-dir\") pod \"calico-node-fjmlz\" (UID: \"fe09f112-dc18-4084-b6b7-265cc036a513\") " pod="calico-system/calico-node-fjmlz" Dec 12 18:43:59.953653 kubelet[3062]: I1212 18:43:59.953390 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/fe09f112-dc18-4084-b6b7-265cc036a513-node-certs\") pod \"calico-node-fjmlz\" (UID: \"fe09f112-dc18-4084-b6b7-265cc036a513\") " pod="calico-system/calico-node-fjmlz" Dec 12 18:43:59.953653 kubelet[3062]: I1212 18:43:59.953458 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fe09f112-dc18-4084-b6b7-265cc036a513-xtables-lock\") pod \"calico-node-fjmlz\" (UID: \"fe09f112-dc18-4084-b6b7-265cc036a513\") " pod="calico-system/calico-node-fjmlz" Dec 12 18:43:59.953653 kubelet[3062]: I1212 18:43:59.953493 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/fe09f112-dc18-4084-b6b7-265cc036a513-cni-log-dir\") pod \"calico-node-fjmlz\" (UID: \"fe09f112-dc18-4084-b6b7-265cc036a513\") " pod="calico-system/calico-node-fjmlz" Dec 12 18:43:59.953653 kubelet[3062]: I1212 18:43:59.953509 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe09f112-dc18-4084-b6b7-265cc036a513-lib-modules\") pod \"calico-node-fjmlz\" (UID: \"fe09f112-dc18-4084-b6b7-265cc036a513\") " pod="calico-system/calico-node-fjmlz" Dec 12 18:43:59.953845 kubelet[3062]: I1212 18:43:59.953527 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fe09f112-dc18-4084-b6b7-265cc036a513-var-lib-calico\") pod \"calico-node-fjmlz\" (UID: \"fe09f112-dc18-4084-b6b7-265cc036a513\") " pod="calico-system/calico-node-fjmlz" Dec 12 18:43:59.953845 kubelet[3062]: I1212 18:43:59.953548 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/fe09f112-dc18-4084-b6b7-265cc036a513-var-run-calico\") pod \"calico-node-fjmlz\" (UID: \"fe09f112-dc18-4084-b6b7-265cc036a513\") " pod="calico-system/calico-node-fjmlz" Dec 12 18:43:59.953845 kubelet[3062]: I1212 18:43:59.953565 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe09f112-dc18-4084-b6b7-265cc036a513-tigera-ca-bundle\") pod \"calico-node-fjmlz\" (UID: \"fe09f112-dc18-4084-b6b7-265cc036a513\") " pod="calico-system/calico-node-fjmlz" Dec 12 18:43:59.953845 kubelet[3062]: I1212 18:43:59.953583 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/fe09f112-dc18-4084-b6b7-265cc036a513-cni-bin-dir\") pod \"calico-node-fjmlz\" (UID: \"fe09f112-dc18-4084-b6b7-265cc036a513\") " pod="calico-system/calico-node-fjmlz" Dec 12 18:43:59.953845 kubelet[3062]: I1212 18:43:59.953656 3062 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/fe09f112-dc18-4084-b6b7-265cc036a513-policysync\") pod \"calico-node-fjmlz\" (UID: \"fe09f112-dc18-4084-b6b7-265cc036a513\") " pod="calico-system/calico-node-fjmlz" Dec 12 18:43:59.953951 kubelet[3062]: I1212 18:43:59.953716 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/fe09f112-dc18-4084-b6b7-265cc036a513-flexvol-driver-host\") pod \"calico-node-fjmlz\" (UID: \"fe09f112-dc18-4084-b6b7-265cc036a513\") " pod="calico-system/calico-node-fjmlz" Dec 12 18:43:59.953951 kubelet[3062]: I1212 18:43:59.953735 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxhdp\" (UniqueName: \"kubernetes.io/projected/fe09f112-dc18-4084-b6b7-265cc036a513-kube-api-access-vxhdp\") pod \"calico-node-fjmlz\" (UID: \"fe09f112-dc18-4084-b6b7-265cc036a513\") " pod="calico-system/calico-node-fjmlz" Dec 12 18:44:00.006623 containerd[1769]: time="2025-12-12T18:44:00.006578247Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-59bd4dd7cc-l2bwf,Uid:9533f08c-1d9c-41c8-995a-5265bcd45da0,Namespace:calico-system,Attempt:0,}" Dec 12 18:44:00.029139 containerd[1769]: time="2025-12-12T18:44:00.029097590Z" level=info msg="connecting to shim fc5b3e84308341b8e6b128051db086b6710545f50ebacb164b55bf15c51460b8" address="unix:///run/containerd/s/815ef1d0f08dc4d96a655c78b2d692f47683f97fc7b933bedf24060416d4f7a3" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:44:00.054385 systemd[1]: Started cri-containerd-fc5b3e84308341b8e6b128051db086b6710545f50ebacb164b55bf15c51460b8.scope - libcontainer container fc5b3e84308341b8e6b128051db086b6710545f50ebacb164b55bf15c51460b8. Dec 12 18:44:00.057325 kubelet[3062]: E1212 18:44:00.057269 3062 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:44:00.057325 kubelet[3062]: W1212 18:44:00.057288 3062 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:44:00.057325 kubelet[3062]: E1212 18:44:00.057316 3062 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:44:00.060558 kubelet[3062]: E1212 18:44:00.060467 3062 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:44:00.060558 kubelet[3062]: W1212 18:44:00.060495 3062 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:44:00.060558 kubelet[3062]: E1212 18:44:00.060522 3062 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
[... the FlexVolume probe-failure triplet above (driver-call.go:262 "Failed to unmarshal output", driver-call.go:149 "executable file not found in $PATH", plugins.go:695 "Error dynamically probing plugins") recurs roughly sixty times between 18:44:00.057 and 18:44:00.267, re-emitted on every plugin-directory probe; the duplicate triplets are elided and only distinct entries follow ...]
Dec 12 18:44:00.084140 kubelet[3062]: E1212 18:44:00.083432 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:44:00.116392 containerd[1769]: time="2025-12-12T18:44:00.116348841Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-59bd4dd7cc-l2bwf,Uid:9533f08c-1d9c-41c8-995a-5265bcd45da0,Namespace:calico-system,Attempt:0,} returns sandbox id \"fc5b3e84308341b8e6b128051db086b6710545f50ebacb164b55bf15c51460b8\"" Dec 12 18:44:00.117657 containerd[1769]: time="2025-12-12T18:44:00.117636148Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\""
Dec 12 18:44:00.156811 kubelet[3062]: I1212 18:44:00.156720 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/541d8bd6-57ea-4711-86e4-5819a7795d8f-kubelet-dir\") pod \"csi-node-driver-pls24\" (UID: \"541d8bd6-57ea-4711-86e4-5819a7795d8f\") " pod="calico-system/csi-node-driver-pls24" Dec 12 18:44:00.156950 kubelet[3062]: I1212 18:44:00.156919 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8zv6\" (UniqueName: \"kubernetes.io/projected/541d8bd6-57ea-4711-86e4-5819a7795d8f-kube-api-access-m8zv6\") pod \"csi-node-driver-pls24\" (UID: \"541d8bd6-57ea-4711-86e4-5819a7795d8f\") " pod="calico-system/csi-node-driver-pls24" Dec 12 18:44:00.157581 kubelet[3062]: I1212 18:44:00.157570 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/541d8bd6-57ea-4711-86e4-5819a7795d8f-socket-dir\") pod \"csi-node-driver-pls24\" (UID: \"541d8bd6-57ea-4711-86e4-5819a7795d8f\") " pod="calico-system/csi-node-driver-pls24" Dec 12 18:44:00.158148 kubelet[3062]: I1212 18:44:00.158106 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/541d8bd6-57ea-4711-86e4-5819a7795d8f-varrun\") pod \"csi-node-driver-pls24\" (UID: \"541d8bd6-57ea-4711-86e4-5819a7795d8f\") " pod="calico-system/csi-node-driver-pls24" Dec 12 18:44:00.160567 kubelet[3062]: I1212 18:44:00.160555 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/541d8bd6-57ea-4711-86e4-5819a7795d8f-registration-dir\") pod \"csi-node-driver-pls24\" (UID: \"541d8bd6-57ea-4711-86e4-5819a7795d8f\") " pod="calico-system/csi-node-driver-pls24"
Dec 12 18:44:00.196578 containerd[1769]: time="2025-12-12T18:44:00.196532701Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-fjmlz,Uid:fe09f112-dc18-4084-b6b7-265cc036a513,Namespace:calico-system,Attempt:0,}" Dec 12 18:44:00.226667 containerd[1769]: time="2025-12-12T18:44:00.226543254Z" level=info msg="connecting to shim a6f77ae20c16b8f01ff967c34da8a6c45896ff9964b43bd75dc4778c30d271a0" address="unix:///run/containerd/s/b7c778ce930b309857723dff04064cb8312486388016cc2aeb34ed381a1c3b10" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:44:00.257303 systemd[1]: Started cri-containerd-a6f77ae20c16b8f01ff967c34da8a6c45896ff9964b43bd75dc4778c30d271a0.scope - libcontainer container a6f77ae20c16b8f01ff967c34da8a6c45896ff9964b43bd75dc4778c30d271a0.
Dec 12 18:44:00.267441 kubelet[3062]: E1212 18:44:00.267427 3062 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:44:00.267441 kubelet[3062]: W1212 18:44:00.267436 3062 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:44:00.267481 kubelet[3062]: E1212 18:44:00.267447 3062 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 18:44:00.267632 kubelet[3062]: E1212 18:44:00.267624 3062 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:44:00.267654 kubelet[3062]: W1212 18:44:00.267632 3062 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:44:00.267654 kubelet[3062]: E1212 18:44:00.267643 3062 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:44:00.267869 kubelet[3062]: E1212 18:44:00.267860 3062 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:44:00.268108 kubelet[3062]: W1212 18:44:00.267868 3062 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:44:00.268131 kubelet[3062]: E1212 18:44:00.268113 3062 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:44:00.268289 kubelet[3062]: E1212 18:44:00.268281 3062 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:44:00.268314 kubelet[3062]: W1212 18:44:00.268290 3062 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:44:00.268314 kubelet[3062]: E1212 18:44:00.268297 3062 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:44:00.271303 kubelet[3062]: E1212 18:44:00.271278 3062 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:44:00.271303 kubelet[3062]: W1212 18:44:00.271293 3062 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:44:00.271303 kubelet[3062]: E1212 18:44:00.271304 3062 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:44:00.280084 containerd[1769]: time="2025-12-12T18:44:00.280024419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-fjmlz,Uid:fe09f112-dc18-4084-b6b7-265cc036a513,Namespace:calico-system,Attempt:0,} returns sandbox id \"a6f77ae20c16b8f01ff967c34da8a6c45896ff9964b43bd75dc4778c30d271a0\"" Dec 12 18:44:01.429341 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3563976947.mount: Deactivated successfully. 
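The burst of kubelet errors above is FlexVolume's exec-based probing: kubelet walks /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, execs each discovered driver binary with the single argument init, and JSON-decodes whatever the driver prints on stdout. Here the nodeagent~uds plugin directory is present but the uds binary has not been installed yet, so each call produces empty output, and decoding an empty string is exactly what yields "unexpected end of JSON input". A minimal Go sketch of that handshake (not kubelet's actual driver-call code; the driverStatus shape approximates the documented FlexVolume response {"status":"Success","capabilities":{"attach":false}}):

```go
// Sketch of the FlexVolume "init" probe that produces the errors above.
// Assumption: illustrative stand-in for kubelet's driver-call logic.
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus approximates the JSON a FlexVolume driver must print.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func probe(driver string) error {
	// A missing binary fails to exec and leaves out empty; the exec error
	// itself is what kubelet logs as the W "driver call failed" line.
	out, _ := exec.Command(driver, "init").CombinedOutput()
	var st driverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		// For out == "": "unexpected end of JSON input", as in the E lines.
		return fmt.Errorf("failed to unmarshal output for command: init, output: %q, error: %v", out, err)
	}
	return nil
}

func main() {
	// The path from the log; the binary does not exist on this node yet.
	fmt.Println(probe("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"))
}
```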
Dec 12 18:44:01.839963 containerd[1769]: time="2025-12-12T18:44:01.839839363Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 18:44:01.841108 containerd[1769]: time="2025-12-12T18:44:01.841063777Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628"
Dec 12 18:44:01.842713 containerd[1769]: time="2025-12-12T18:44:01.842679149Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 18:44:01.845373 containerd[1769]: time="2025-12-12T18:44:01.845325298Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 18:44:01.845964 containerd[1769]: time="2025-12-12T18:44:01.845759369Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 1.728095208s"
Dec 12 18:44:01.845964 containerd[1769]: time="2025-12-12T18:44:01.845784680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\""
Dec 12 18:44:01.846568 containerd[1769]: time="2025-12-12T18:44:01.846544150Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\""
Dec 12 18:44:01.855151 containerd[1769]: time="2025-12-12T18:44:01.855117225Z" level=info msg="CreateContainer within sandbox \"fc5b3e84308341b8e6b128051db086b6710545f50ebacb164b55bf15c51460b8\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Dec 12 18:44:01.868419 containerd[1769]: time="2025-12-12T18:44:01.868380613Z" level=info msg="Container ac0aab42ccd06fc9f6240b3018bcf1cfd13dec4494e21cf7908efedecfee58c2: CDI devices from CRI Config.CDIDevices: []"
Dec 12 18:44:01.880573 containerd[1769]: time="2025-12-12T18:44:01.880533651Z" level=info msg="CreateContainer within sandbox \"fc5b3e84308341b8e6b128051db086b6710545f50ebacb164b55bf15c51460b8\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"ac0aab42ccd06fc9f6240b3018bcf1cfd13dec4494e21cf7908efedecfee58c2\""
Dec 12 18:44:01.881008 containerd[1769]: time="2025-12-12T18:44:01.880990205Z" level=info msg="StartContainer for \"ac0aab42ccd06fc9f6240b3018bcf1cfd13dec4494e21cf7908efedecfee58c2\""
Dec 12 18:44:01.881863 containerd[1769]: time="2025-12-12T18:44:01.881837325Z" level=info msg="connecting to shim ac0aab42ccd06fc9f6240b3018bcf1cfd13dec4494e21cf7908efedecfee58c2" address="unix:///run/containerd/s/815ef1d0f08dc4d96a655c78b2d692f47683f97fc7b933bedf24060416d4f7a3" protocol=ttrpc version=3
Dec 12 18:44:01.901288 systemd[1]: Started cri-containerd-ac0aab42ccd06fc9f6240b3018bcf1cfd13dec4494e21cf7908efedecfee58c2.scope - libcontainer container ac0aab42ccd06fc9f6240b3018bcf1cfd13dec4494e21cf7908efedecfee58c2.
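For reference, the typha pull recorded above can be reproduced against the same containerd instance with the Go client; the socket path and the k8s.io namespace are the defaults for CRI-managed images on a node like this. This is a hypothetical standalone snippet (using the v1 containerd client module path), not how kubelet drives the pull; kubelet goes through the CRI ImageService instead:

```go
// Sketch: pull the same image containerd just logged, via its Go client.
// Assumptions: containerd v1 client module path, default socket location.
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI-managed images live in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/typha:v3.30.4", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	// Resolves to the repo digest recorded in the log
	// (ghcr.io/flatcar/calico/typha@sha256:6f43...).
	fmt.Println(img.Name(), img.Target().Digest)
}
```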
Dec 12 18:44:01.945527 containerd[1769]: time="2025-12-12T18:44:01.945495280Z" level=info msg="StartContainer for \"ac0aab42ccd06fc9f6240b3018bcf1cfd13dec4494e21cf7908efedecfee58c2\" returns successfully"
Dec 12 18:44:02.067427 kubelet[3062]: E1212 18:44:02.067373 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f"
Dec 12 18:44:02.127015 kubelet[3062]: I1212 18:44:02.126869 3062 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-59bd4dd7cc-l2bwf" podStartSLOduration=1.39784585 podStartE2EDuration="3.126853643s" podCreationTimestamp="2025-12-12 18:43:59 +0000 UTC" firstStartedPulling="2025-12-12 18:44:00.117411933 +0000 UTC m=+18.137148808" lastFinishedPulling="2025-12-12 18:44:01.846419717 +0000 UTC m=+19.866156601" observedRunningTime="2025-12-12 18:44:02.126443977 +0000 UTC m=+20.146180878" watchObservedRunningTime="2025-12-12 18:44:02.126853643 +0000 UTC m=+20.146590540"
Dec 12 18:44:02.164831 kubelet[3062]: E1212 18:44:02.164790 3062 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 18:44:02.164831 kubelet[3062]: W1212 18:44:02.164815 3062 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 18:44:02.164831 kubelet[3062]: E1212 18:44:02.164836 3062 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the preceding three kubelet messages repeat 32 more times, with probe timestamps from 18:44:02.164991 through 18:44:02.181098]
Dec 12 18:44:03.116808 kubelet[3062]: I1212 18:44:03.116772 3062 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 12 18:44:03.173670 kubelet[3062]: E1212 18:44:03.173608 3062 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 18:44:03.173670 kubelet[3062]: W1212 18:44:03.173637 3062 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 18:44:03.173670 kubelet[3062]: E1212 18:44:03.173660 3062 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the preceding three kubelet messages repeat 32 more times, with probe timestamps from 18:44:03.173880 through 18:44:03.189040]
Error: unexpected end of JSON input" Dec 12 18:44:03.429462 containerd[1769]: time="2025-12-12T18:44:03.429395677Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:44:03.430761 containerd[1769]: time="2025-12-12T18:44:03.430719408Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754" Dec 12 18:44:03.432715 containerd[1769]: time="2025-12-12T18:44:03.432674409Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:44:03.435156 containerd[1769]: time="2025-12-12T18:44:03.435118788Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:44:03.435596 containerd[1769]: time="2025-12-12T18:44:03.435562393Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.588993917s" Dec 12 18:44:03.435624 containerd[1769]: time="2025-12-12T18:44:03.435598729Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Dec 12 18:44:03.437707 containerd[1769]: time="2025-12-12T18:44:03.437662264Z" level=info msg="CreateContainer within sandbox \"a6f77ae20c16b8f01ff967c34da8a6c45896ff9964b43bd75dc4778c30d271a0\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 12 18:44:03.448408 containerd[1769]: time="2025-12-12T18:44:03.448350567Z" level=info msg="Container 038ca4a6b2ad82fdec83f20133030f647014837136d81e3cf85c496f15f98f19: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:44:03.461458 containerd[1769]: time="2025-12-12T18:44:03.461410759Z" level=info msg="CreateContainer within sandbox \"a6f77ae20c16b8f01ff967c34da8a6c45896ff9964b43bd75dc4778c30d271a0\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"038ca4a6b2ad82fdec83f20133030f647014837136d81e3cf85c496f15f98f19\"" Dec 12 18:44:03.462005 containerd[1769]: time="2025-12-12T18:44:03.461929260Z" level=info msg="StartContainer for \"038ca4a6b2ad82fdec83f20133030f647014837136d81e3cf85c496f15f98f19\"" Dec 12 18:44:03.463126 containerd[1769]: time="2025-12-12T18:44:03.463100503Z" level=info msg="connecting to shim 038ca4a6b2ad82fdec83f20133030f647014837136d81e3cf85c496f15f98f19" address="unix:///run/containerd/s/b7c778ce930b309857723dff04064cb8312486388016cc2aeb34ed381a1c3b10" protocol=ttrpc version=3 Dec 12 18:44:03.493339 systemd[1]: Started cri-containerd-038ca4a6b2ad82fdec83f20133030f647014837136d81e3cf85c496f15f98f19.scope - libcontainer container 038ca4a6b2ad82fdec83f20133030f647014837136d81e3cf85c496f15f98f19. 
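The FlexVolume error burst above is kubelet's dynamic plugin probe: for each directory under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ it execs the driver binary with the `init` argument and expects a JSON status object on stdout. Here the nodeagent~uds/uds executable is absent, so the call produces no output and decoding the empty reply fails with exactly "unexpected end of JSON input". A minimal Go sketch of that call path; the driverStatus fields are an illustrative assumption modeled on the FlexVolume convention, not kubelet's actual types:

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus mirrors the JSON object a FlexVolume driver is expected to
// print on stdout; this field set is an assumption for illustration only.
type driverStatus struct {
	Status  string `json:"status"`
	Message string `json:"message,omitempty"`
}

// callDriver runs a driver binary and decodes its JSON reply from stdout.
func callDriver(name string, args ...string) (*driverStatus, error) {
	out, err := exec.Command(name, args...).Output()
	if err != nil {
		// A missing binary surfaces as `exec: "uds": executable file not
		// found in $PATH`, matching the warning in the journal.
		return nil, fmt.Errorf("driver call failed: %w, output: %q", err, out)
	}
	var st driverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		return nil, fmt.Errorf("failed to unmarshal output %q: %w", out, err)
	}
	return &st, nil
}

func main() {
	// The driver from the log lives under .../nodeagent~uds/uds; a bare
	// name that is not installed reproduces the same lookup failure.
	if _, err := callDriver("uds", "init"); err != nil {
		fmt.Println(err)
	}
	// Decoding an empty payload directly yields the other message:
	// "unexpected end of JSON input".
	var st driverStatus
	fmt.Println(json.Unmarshal([]byte(""), &st))
}

Note the probe is best-effort: kubelet logs the failure and skips the directory ("skipping" in the entries above) rather than aborting.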
Dec 12 18:44:03.554029 containerd[1769]: time="2025-12-12T18:44:03.553642502Z" level=info msg="StartContainer for \"038ca4a6b2ad82fdec83f20133030f647014837136d81e3cf85c496f15f98f19\" returns successfully" Dec 12 18:44:03.560143 systemd[1]: cri-containerd-038ca4a6b2ad82fdec83f20133030f647014837136d81e3cf85c496f15f98f19.scope: Deactivated successfully. Dec 12 18:44:03.563475 containerd[1769]: time="2025-12-12T18:44:03.563439404Z" level=info msg="received container exit event container_id:\"038ca4a6b2ad82fdec83f20133030f647014837136d81e3cf85c496f15f98f19\" id:\"038ca4a6b2ad82fdec83f20133030f647014837136d81e3cf85c496f15f98f19\" pid:3819 exited_at:{seconds:1765565043 nanos:562976272}" Dec 12 18:44:03.582895 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-038ca4a6b2ad82fdec83f20133030f647014837136d81e3cf85c496f15f98f19-rootfs.mount: Deactivated successfully. Dec 12 18:44:04.067241 kubelet[3062]: E1212 18:44:04.067108 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:44:04.121431 containerd[1769]: time="2025-12-12T18:44:04.121389091Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 12 18:44:06.067223 kubelet[3062]: E1212 18:44:06.067169 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:44:06.471307 containerd[1769]: time="2025-12-12T18:44:06.471242437Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:44:06.472771 containerd[1769]: time="2025-12-12T18:44:06.472720626Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859" Dec 12 18:44:06.474471 containerd[1769]: time="2025-12-12T18:44:06.474424391Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:44:06.476736 containerd[1769]: time="2025-12-12T18:44:06.476690865Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:44:06.477304 containerd[1769]: time="2025-12-12T18:44:06.477255720Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 2.355832695s" Dec 12 18:44:06.477304 containerd[1769]: time="2025-12-12T18:44:06.477287936Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Dec 12 18:44:06.479582 containerd[1769]: time="2025-12-12T18:44:06.479554104Z" level=info msg="CreateContainer within sandbox 
\"a6f77ae20c16b8f01ff967c34da8a6c45896ff9964b43bd75dc4778c30d271a0\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 12 18:44:06.489712 containerd[1769]: time="2025-12-12T18:44:06.489668872Z" level=info msg="Container b0934aab84a3e2f3ba27df79d581745cdfa9bc965ad42980e0b43383b139ff27: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:44:06.501664 containerd[1769]: time="2025-12-12T18:44:06.501619777Z" level=info msg="CreateContainer within sandbox \"a6f77ae20c16b8f01ff967c34da8a6c45896ff9964b43bd75dc4778c30d271a0\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b0934aab84a3e2f3ba27df79d581745cdfa9bc965ad42980e0b43383b139ff27\"" Dec 12 18:44:06.502209 containerd[1769]: time="2025-12-12T18:44:06.502182933Z" level=info msg="StartContainer for \"b0934aab84a3e2f3ba27df79d581745cdfa9bc965ad42980e0b43383b139ff27\"" Dec 12 18:44:06.503467 containerd[1769]: time="2025-12-12T18:44:06.503441331Z" level=info msg="connecting to shim b0934aab84a3e2f3ba27df79d581745cdfa9bc965ad42980e0b43383b139ff27" address="unix:///run/containerd/s/b7c778ce930b309857723dff04064cb8312486388016cc2aeb34ed381a1c3b10" protocol=ttrpc version=3 Dec 12 18:44:06.525290 systemd[1]: Started cri-containerd-b0934aab84a3e2f3ba27df79d581745cdfa9bc965ad42980e0b43383b139ff27.scope - libcontainer container b0934aab84a3e2f3ba27df79d581745cdfa9bc965ad42980e0b43383b139ff27. Dec 12 18:44:06.587317 kubelet[3062]: I1212 18:44:06.587236 3062 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 18:44:06.613448 containerd[1769]: time="2025-12-12T18:44:06.613408526Z" level=info msg="StartContainer for \"b0934aab84a3e2f3ba27df79d581745cdfa9bc965ad42980e0b43383b139ff27\" returns successfully" Dec 12 18:44:07.047420 systemd[1]: cri-containerd-b0934aab84a3e2f3ba27df79d581745cdfa9bc965ad42980e0b43383b139ff27.scope: Deactivated successfully. Dec 12 18:44:07.047698 systemd[1]: cri-containerd-b0934aab84a3e2f3ba27df79d581745cdfa9bc965ad42980e0b43383b139ff27.scope: Consumed 566ms CPU time, 191.7M memory peak, 171.3M written to disk. Dec 12 18:44:07.049731 containerd[1769]: time="2025-12-12T18:44:07.049419701Z" level=info msg="received container exit event container_id:\"b0934aab84a3e2f3ba27df79d581745cdfa9bc965ad42980e0b43383b139ff27\" id:\"b0934aab84a3e2f3ba27df79d581745cdfa9bc965ad42980e0b43383b139ff27\" pid:3877 exited_at:{seconds:1765565047 nanos:49171964}" Dec 12 18:44:07.063765 kubelet[3062]: I1212 18:44:07.063723 3062 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Dec 12 18:44:07.071838 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b0934aab84a3e2f3ba27df79d581745cdfa9bc965ad42980e0b43383b139ff27-rootfs.mount: Deactivated successfully. Dec 12 18:44:07.094378 systemd[1]: Created slice kubepods-burstable-pod17892527_ab06_4143_9edf_e99eaea41942.slice - libcontainer container kubepods-burstable-pod17892527_ab06_4143_9edf_e99eaea41942.slice. Dec 12 18:44:07.102689 systemd[1]: Created slice kubepods-burstable-pode01e5f95_946a_4bdc_b90f_58fb4afdeffd.slice - libcontainer container kubepods-burstable-pode01e5f95_946a_4bdc_b90f_58fb4afdeffd.slice. Dec 12 18:44:07.107618 systemd[1]: Created slice kubepods-besteffort-pod0d411828_ce18_4a86_9fa9_09a812dd1345.slice - libcontainer container kubepods-besteffort-pod0d411828_ce18_4a86_9fa9_09a812dd1345.slice. 
Dec 12 18:44:07.111138 kubelet[3062]: I1212 18:44:07.110368 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgmbx\" (UniqueName: \"kubernetes.io/projected/abfafbe4-68ce-455d-b3a3-cff627574124-kube-api-access-mgmbx\") pod \"whisker-6c4894f89-z88s6\" (UID: \"abfafbe4-68ce-455d-b3a3-cff627574124\") " pod="calico-system/whisker-6c4894f89-z88s6" Dec 12 18:44:07.111138 kubelet[3062]: I1212 18:44:07.110412 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/77f6fa4e-0cea-4d83-8d55-8707dfa4d505-goldmane-key-pair\") pod \"goldmane-666569f655-pcfdn\" (UID: \"77f6fa4e-0cea-4d83-8d55-8707dfa4d505\") " pod="calico-system/goldmane-666569f655-pcfdn" Dec 12 18:44:07.111138 kubelet[3062]: I1212 18:44:07.110432 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abfafbe4-68ce-455d-b3a3-cff627574124-whisker-ca-bundle\") pod \"whisker-6c4894f89-z88s6\" (UID: \"abfafbe4-68ce-455d-b3a3-cff627574124\") " pod="calico-system/whisker-6c4894f89-z88s6" Dec 12 18:44:07.111138 kubelet[3062]: I1212 18:44:07.110450 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/590a3c9e-ec4d-4eb7-8a94-169742d9929c-calico-apiserver-certs\") pod \"calico-apiserver-85cb55bbfc-bnqwj\" (UID: \"590a3c9e-ec4d-4eb7-8a94-169742d9929c\") " pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" Dec 12 18:44:07.111138 kubelet[3062]: I1212 18:44:07.110469 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnkrr\" (UniqueName: \"kubernetes.io/projected/17892527-ab06-4143-9edf-e99eaea41942-kube-api-access-xnkrr\") pod \"coredns-668d6bf9bc-2rb65\" (UID: \"17892527-ab06-4143-9edf-e99eaea41942\") " pod="kube-system/coredns-668d6bf9bc-2rb65" Dec 12 18:44:07.111623 kubelet[3062]: I1212 18:44:07.110484 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e01e5f95-946a-4bdc-b90f-58fb4afdeffd-config-volume\") pod \"coredns-668d6bf9bc-lt2zp\" (UID: \"e01e5f95-946a-4bdc-b90f-58fb4afdeffd\") " pod="kube-system/coredns-668d6bf9bc-lt2zp" Dec 12 18:44:07.111623 kubelet[3062]: I1212 18:44:07.110501 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq72z\" (UniqueName: \"kubernetes.io/projected/e01e5f95-946a-4bdc-b90f-58fb4afdeffd-kube-api-access-tq72z\") pod \"coredns-668d6bf9bc-lt2zp\" (UID: \"e01e5f95-946a-4bdc-b90f-58fb4afdeffd\") " pod="kube-system/coredns-668d6bf9bc-lt2zp" Dec 12 18:44:07.111623 kubelet[3062]: I1212 18:44:07.110516 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77f6fa4e-0cea-4d83-8d55-8707dfa4d505-config\") pod \"goldmane-666569f655-pcfdn\" (UID: \"77f6fa4e-0cea-4d83-8d55-8707dfa4d505\") " pod="calico-system/goldmane-666569f655-pcfdn" Dec 12 18:44:07.111623 kubelet[3062]: I1212 18:44:07.110535 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nch2q\" (UniqueName: 
\"kubernetes.io/projected/20d32882-7b09-425d-879e-55aa1742ac4a-kube-api-access-nch2q\") pod \"calico-apiserver-85cb55bbfc-m7vzj\" (UID: \"20d32882-7b09-425d-879e-55aa1742ac4a\") " pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" Dec 12 18:44:07.111623 kubelet[3062]: I1212 18:44:07.110549 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77f6fa4e-0cea-4d83-8d55-8707dfa4d505-goldmane-ca-bundle\") pod \"goldmane-666569f655-pcfdn\" (UID: \"77f6fa4e-0cea-4d83-8d55-8707dfa4d505\") " pod="calico-system/goldmane-666569f655-pcfdn" Dec 12 18:44:07.111994 kubelet[3062]: I1212 18:44:07.110566 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldgzq\" (UniqueName: \"kubernetes.io/projected/0d411828-ce18-4a86-9fa9-09a812dd1345-kube-api-access-ldgzq\") pod \"calico-kube-controllers-6678f769fb-bpdwv\" (UID: \"0d411828-ce18-4a86-9fa9-09a812dd1345\") " pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" Dec 12 18:44:07.111994 kubelet[3062]: I1212 18:44:07.110584 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d411828-ce18-4a86-9fa9-09a812dd1345-tigera-ca-bundle\") pod \"calico-kube-controllers-6678f769fb-bpdwv\" (UID: \"0d411828-ce18-4a86-9fa9-09a812dd1345\") " pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" Dec 12 18:44:07.111994 kubelet[3062]: I1212 18:44:07.110603 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wccvq\" (UniqueName: \"kubernetes.io/projected/77f6fa4e-0cea-4d83-8d55-8707dfa4d505-kube-api-access-wccvq\") pod \"goldmane-666569f655-pcfdn\" (UID: \"77f6fa4e-0cea-4d83-8d55-8707dfa4d505\") " pod="calico-system/goldmane-666569f655-pcfdn" Dec 12 18:44:07.111994 kubelet[3062]: I1212 18:44:07.110628 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/20d32882-7b09-425d-879e-55aa1742ac4a-calico-apiserver-certs\") pod \"calico-apiserver-85cb55bbfc-m7vzj\" (UID: \"20d32882-7b09-425d-879e-55aa1742ac4a\") " pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" Dec 12 18:44:07.111994 kubelet[3062]: I1212 18:44:07.110644 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-278rl\" (UniqueName: \"kubernetes.io/projected/590a3c9e-ec4d-4eb7-8a94-169742d9929c-kube-api-access-278rl\") pod \"calico-apiserver-85cb55bbfc-bnqwj\" (UID: \"590a3c9e-ec4d-4eb7-8a94-169742d9929c\") " pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" Dec 12 18:44:07.111900 systemd[1]: Created slice kubepods-besteffort-pod590a3c9e_ec4d_4eb7_8a94_169742d9929c.slice - libcontainer container kubepods-besteffort-pod590a3c9e_ec4d_4eb7_8a94_169742d9929c.slice. 
Dec 12 18:44:07.112200 kubelet[3062]: I1212 18:44:07.110659 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/abfafbe4-68ce-455d-b3a3-cff627574124-whisker-backend-key-pair\") pod \"whisker-6c4894f89-z88s6\" (UID: \"abfafbe4-68ce-455d-b3a3-cff627574124\") " pod="calico-system/whisker-6c4894f89-z88s6" Dec 12 18:44:07.112200 kubelet[3062]: I1212 18:44:07.110676 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17892527-ab06-4143-9edf-e99eaea41942-config-volume\") pod \"coredns-668d6bf9bc-2rb65\" (UID: \"17892527-ab06-4143-9edf-e99eaea41942\") " pod="kube-system/coredns-668d6bf9bc-2rb65" Dec 12 18:44:07.114988 systemd[1]: Created slice kubepods-besteffort-pod20d32882_7b09_425d_879e_55aa1742ac4a.slice - libcontainer container kubepods-besteffort-pod20d32882_7b09_425d_879e_55aa1742ac4a.slice. Dec 12 18:44:07.119686 systemd[1]: Created slice kubepods-besteffort-podabfafbe4_68ce_455d_b3a3_cff627574124.slice - libcontainer container kubepods-besteffort-podabfafbe4_68ce_455d_b3a3_cff627574124.slice. Dec 12 18:44:07.124565 systemd[1]: Created slice kubepods-besteffort-pod77f6fa4e_0cea_4d83_8d55_8707dfa4d505.slice - libcontainer container kubepods-besteffort-pod77f6fa4e_0cea_4d83_8d55_8707dfa4d505.slice. Dec 12 18:44:07.399810 containerd[1769]: time="2025-12-12T18:44:07.399758955Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2rb65,Uid:17892527-ab06-4143-9edf-e99eaea41942,Namespace:kube-system,Attempt:0,}" Dec 12 18:44:07.405580 containerd[1769]: time="2025-12-12T18:44:07.405538646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-lt2zp,Uid:e01e5f95-946a-4bdc-b90f-58fb4afdeffd,Namespace:kube-system,Attempt:0,}" Dec 12 18:44:07.411446 containerd[1769]: time="2025-12-12T18:44:07.411406947Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6678f769fb-bpdwv,Uid:0d411828-ce18-4a86-9fa9-09a812dd1345,Namespace:calico-system,Attempt:0,}" Dec 12 18:44:07.414274 containerd[1769]: time="2025-12-12T18:44:07.414243203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cb55bbfc-bnqwj,Uid:590a3c9e-ec4d-4eb7-8a94-169742d9929c,Namespace:calico-apiserver,Attempt:0,}" Dec 12 18:44:07.418000 containerd[1769]: time="2025-12-12T18:44:07.417957290Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cb55bbfc-m7vzj,Uid:20d32882-7b09-425d-879e-55aa1742ac4a,Namespace:calico-apiserver,Attempt:0,}" Dec 12 18:44:07.423697 containerd[1769]: time="2025-12-12T18:44:07.423658534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c4894f89-z88s6,Uid:abfafbe4-68ce-455d-b3a3-cff627574124,Namespace:calico-system,Attempt:0,}" Dec 12 18:44:07.426825 containerd[1769]: time="2025-12-12T18:44:07.426766287Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-pcfdn,Uid:77f6fa4e-0cea-4d83-8d55-8707dfa4d505,Namespace:calico-system,Attempt:0,}" Dec 12 18:44:07.463090 containerd[1769]: time="2025-12-12T18:44:07.462475365Z" level=error msg="Failed to destroy network for sandbox \"d8ce0292ef40493e0966d54c53dc9efef282de5678f3bba3780c7347b548fcf1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 
18:44:07.464937 containerd[1769]: time="2025-12-12T18:44:07.464834647Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2rb65,Uid:17892527-ab06-4143-9edf-e99eaea41942,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8ce0292ef40493e0966d54c53dc9efef282de5678f3bba3780c7347b548fcf1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.465200 kubelet[3062]: E1212 18:44:07.465158 3062 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8ce0292ef40493e0966d54c53dc9efef282de5678f3bba3780c7347b548fcf1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.465261 kubelet[3062]: E1212 18:44:07.465244 3062 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8ce0292ef40493e0966d54c53dc9efef282de5678f3bba3780c7347b548fcf1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-2rb65" Dec 12 18:44:07.465299 kubelet[3062]: E1212 18:44:07.465268 3062 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d8ce0292ef40493e0966d54c53dc9efef282de5678f3bba3780c7347b548fcf1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-2rb65" Dec 12 18:44:07.465343 kubelet[3062]: E1212 18:44:07.465319 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-2rb65_kube-system(17892527-ab06-4143-9edf-e99eaea41942)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-2rb65_kube-system(17892527-ab06-4143-9edf-e99eaea41942)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d8ce0292ef40493e0966d54c53dc9efef282de5678f3bba3780c7347b548fcf1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-2rb65" podUID="17892527-ab06-4143-9edf-e99eaea41942" Dec 12 18:44:07.469946 containerd[1769]: time="2025-12-12T18:44:07.469910357Z" level=error msg="Failed to destroy network for sandbox \"128e258324418a81d4cc106276ed7c68644275e70b90c304f1bc33276226d0e8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.472658 containerd[1769]: time="2025-12-12T18:44:07.472612174Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-lt2zp,Uid:e01e5f95-946a-4bdc-b90f-58fb4afdeffd,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"128e258324418a81d4cc106276ed7c68644275e70b90c304f1bc33276226d0e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.473030 kubelet[3062]: E1212 18:44:07.472815 3062 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"128e258324418a81d4cc106276ed7c68644275e70b90c304f1bc33276226d0e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.473030 kubelet[3062]: E1212 18:44:07.472865 3062 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"128e258324418a81d4cc106276ed7c68644275e70b90c304f1bc33276226d0e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-lt2zp" Dec 12 18:44:07.473030 kubelet[3062]: E1212 18:44:07.472887 3062 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"128e258324418a81d4cc106276ed7c68644275e70b90c304f1bc33276226d0e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-lt2zp" Dec 12 18:44:07.473140 kubelet[3062]: E1212 18:44:07.472921 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-lt2zp_kube-system(e01e5f95-946a-4bdc-b90f-58fb4afdeffd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-lt2zp_kube-system(e01e5f95-946a-4bdc-b90f-58fb4afdeffd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"128e258324418a81d4cc106276ed7c68644275e70b90c304f1bc33276226d0e8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-lt2zp" podUID="e01e5f95-946a-4bdc-b90f-58fb4afdeffd" Dec 12 18:44:07.476290 containerd[1769]: time="2025-12-12T18:44:07.476132758Z" level=error msg="Failed to destroy network for sandbox \"a9e9923339e4b8266c5d6d11170d103d45faddd59a4baeec199d66271fdfd5b8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.477910 containerd[1769]: time="2025-12-12T18:44:07.477862084Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6678f769fb-bpdwv,Uid:0d411828-ce18-4a86-9fa9-09a812dd1345,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9e9923339e4b8266c5d6d11170d103d45faddd59a4baeec199d66271fdfd5b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.478496 kubelet[3062]: E1212 18:44:07.478458 3062 log.go:32] "RunPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9e9923339e4b8266c5d6d11170d103d45faddd59a4baeec199d66271fdfd5b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.478559 kubelet[3062]: E1212 18:44:07.478517 3062 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9e9923339e4b8266c5d6d11170d103d45faddd59a4baeec199d66271fdfd5b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" Dec 12 18:44:07.478559 kubelet[3062]: E1212 18:44:07.478537 3062 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9e9923339e4b8266c5d6d11170d103d45faddd59a4baeec199d66271fdfd5b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" Dec 12 18:44:07.478617 kubelet[3062]: E1212 18:44:07.478595 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6678f769fb-bpdwv_calico-system(0d411828-ce18-4a86-9fa9-09a812dd1345)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6678f769fb-bpdwv_calico-system(0d411828-ce18-4a86-9fa9-09a812dd1345)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a9e9923339e4b8266c5d6d11170d103d45faddd59a4baeec199d66271fdfd5b8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:44:07.491733 containerd[1769]: time="2025-12-12T18:44:07.491669263Z" level=error msg="Failed to destroy network for sandbox \"28392351abb18e2085c05ddac066f547a8502233a6a19dcd60a5b5bdbcf70480\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.493754 containerd[1769]: time="2025-12-12T18:44:07.493715979Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cb55bbfc-m7vzj,Uid:20d32882-7b09-425d-879e-55aa1742ac4a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"28392351abb18e2085c05ddac066f547a8502233a6a19dcd60a5b5bdbcf70480\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.493993 kubelet[3062]: E1212 18:44:07.493940 3062 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"28392351abb18e2085c05ddac066f547a8502233a6a19dcd60a5b5bdbcf70480\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.494039 kubelet[3062]: E1212 18:44:07.494011 3062 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"28392351abb18e2085c05ddac066f547a8502233a6a19dcd60a5b5bdbcf70480\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" Dec 12 18:44:07.494039 kubelet[3062]: E1212 18:44:07.494034 3062 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"28392351abb18e2085c05ddac066f547a8502233a6a19dcd60a5b5bdbcf70480\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" Dec 12 18:44:07.494120 kubelet[3062]: E1212 18:44:07.494085 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85cb55bbfc-m7vzj_calico-apiserver(20d32882-7b09-425d-879e-55aa1742ac4a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85cb55bbfc-m7vzj_calico-apiserver(20d32882-7b09-425d-879e-55aa1742ac4a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"28392351abb18e2085c05ddac066f547a8502233a6a19dcd60a5b5bdbcf70480\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:44:07.502117 containerd[1769]: time="2025-12-12T18:44:07.501965093Z" level=error msg="Failed to destroy network for sandbox \"d0069650679431249aa4d13cab340e7858055ffb7a8ba53cf63aeb40b048edc0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.504363 systemd[1]: run-netns-cni\x2ddbb64f45\x2dc9e4\x2d6e30\x2d8377\x2dc4bbe2ebf80e.mount: Deactivated successfully. Dec 12 18:44:07.504454 systemd[1]: run-netns-cni\x2dabe35924\x2d995f\x2d0a9b\x2d5894\x2d3428ad940529.mount: Deactivated successfully. 
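Every RunPodSandbox failure above bottoms out in the same stat(2): Calico's CNI plugin refuses to set up pod networking until calico/node has written this node's name to /var/lib/calico/nodename, which happens only once the calico-node container (whose image is still being pulled at this point) is running with /var/lib/calico/ mounted. The gate reduces to a file check; a Go sketch whose error text approximates the plugin's hint:

package main

import (
	"errors"
	"fmt"
	"io/fs"
	"os"
	"strings"
)

// nodename reads the node name that calico/node persists for the CNI plugin.
func nodename() (string, error) {
	b, err := os.ReadFile("/var/lib/calico/nodename")
	if errors.Is(err, fs.ErrNotExist) {
		// Mirrors the hint in the log entries above.
		return "", fmt.Errorf("stat /var/lib/calico/nodename: %w: check that the calico/node container is running and has mounted /var/lib/calico/", err)
	}
	if err != nil {
		return "", err
	}
	return strings.TrimSpace(string(b)), nil
}

func main() {
	if name, err := nodename(); err != nil {
		fmt.Println(err)
	} else {
		fmt.Println("node:", name)
	}
}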
Dec 12 18:44:07.505224 containerd[1769]: time="2025-12-12T18:44:07.505179085Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-pcfdn,Uid:77f6fa4e-0cea-4d83-8d55-8707dfa4d505,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0069650679431249aa4d13cab340e7858055ffb7a8ba53cf63aeb40b048edc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.505508 kubelet[3062]: E1212 18:44:07.505465 3062 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0069650679431249aa4d13cab340e7858055ffb7a8ba53cf63aeb40b048edc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.505560 kubelet[3062]: E1212 18:44:07.505532 3062 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0069650679431249aa4d13cab340e7858055ffb7a8ba53cf63aeb40b048edc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-pcfdn" Dec 12 18:44:07.505560 kubelet[3062]: E1212 18:44:07.505554 3062 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0069650679431249aa4d13cab340e7858055ffb7a8ba53cf63aeb40b048edc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-pcfdn" Dec 12 18:44:07.505847 kubelet[3062]: E1212 18:44:07.505601 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-pcfdn_calico-system(77f6fa4e-0cea-4d83-8d55-8707dfa4d505)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-pcfdn_calico-system(77f6fa4e-0cea-4d83-8d55-8707dfa4d505)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d0069650679431249aa4d13cab340e7858055ffb7a8ba53cf63aeb40b048edc0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:44:07.505936 containerd[1769]: time="2025-12-12T18:44:07.505806176Z" level=error msg="Failed to destroy network for sandbox \"cabe698e3a93ea3f7352202e7eddcf56a28e6862fdd901cfd27765f3145ea053\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.507473 systemd[1]: run-netns-cni\x2de6d15a46\x2d960d\x2d50ab\x2dbb60\x2da70082b50313.mount: Deactivated successfully. 
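The run-netns mount units above carry systemd-escaped names: bytes outside the safe set for unit names are written as \xXX, so the dashes in the CNI netns identifiers appear as \x2d. A small Go decoder sketch for that escaping:

package main

import (
	"fmt"
	"strconv"
	"strings"
)

// unescapeUnit reverses systemd's \xXX escaping in unit names,
// e.g. `cni\x2de6d15a46` -> `cni-e6d15a46`.
func unescapeUnit(s string) (string, error) {
	var b strings.Builder
	for i := 0; i < len(s); {
		if strings.HasPrefix(s[i:], `\x`) && i+4 <= len(s) {
			n, err := strconv.ParseUint(s[i+2:i+4], 16, 8)
			if err != nil {
				return "", fmt.Errorf("bad escape %q: %w", s[i:i+4], err)
			}
			b.WriteByte(byte(n))
			i += 4
			continue
		}
		b.WriteByte(s[i])
		i++
	}
	return b.String(), nil
}

func main() {
	out, err := unescapeUnit(`run-netns-cni\x2de6d15a46\x2d960d\x2d50ab\x2dbb60\x2da70082b50313.mount`)
	fmt.Println(out, err) // run-netns-cni-e6d15a46-960d-50ab-bb60-a70082b50313.mount <nil>
}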
Dec 12 18:44:07.509531 containerd[1769]: time="2025-12-12T18:44:07.509493952Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cb55bbfc-bnqwj,Uid:590a3c9e-ec4d-4eb7-8a94-169742d9929c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cabe698e3a93ea3f7352202e7eddcf56a28e6862fdd901cfd27765f3145ea053\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.509782 kubelet[3062]: E1212 18:44:07.509749 3062 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cabe698e3a93ea3f7352202e7eddcf56a28e6862fdd901cfd27765f3145ea053\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.509828 kubelet[3062]: E1212 18:44:07.509804 3062 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cabe698e3a93ea3f7352202e7eddcf56a28e6862fdd901cfd27765f3145ea053\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" Dec 12 18:44:07.509854 kubelet[3062]: E1212 18:44:07.509825 3062 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cabe698e3a93ea3f7352202e7eddcf56a28e6862fdd901cfd27765f3145ea053\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" Dec 12 18:44:07.509885 kubelet[3062]: E1212 18:44:07.509862 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85cb55bbfc-bnqwj_calico-apiserver(590a3c9e-ec4d-4eb7-8a94-169742d9929c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85cb55bbfc-bnqwj_calico-apiserver(590a3c9e-ec4d-4eb7-8a94-169742d9929c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cabe698e3a93ea3f7352202e7eddcf56a28e6862fdd901cfd27765f3145ea053\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:44:07.520931 containerd[1769]: time="2025-12-12T18:44:07.520881107Z" level=error msg="Failed to destroy network for sandbox \"42224b159b0819fc9bd19eee9f1dd52f4228292144ed1476a68c7110eaf715df\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.522573 containerd[1769]: time="2025-12-12T18:44:07.522523348Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c4894f89-z88s6,Uid:abfafbe4-68ce-455d-b3a3-cff627574124,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"42224b159b0819fc9bd19eee9f1dd52f4228292144ed1476a68c7110eaf715df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.522799 kubelet[3062]: E1212 18:44:07.522748 3062 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"42224b159b0819fc9bd19eee9f1dd52f4228292144ed1476a68c7110eaf715df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:07.522885 kubelet[3062]: E1212 18:44:07.522826 3062 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"42224b159b0819fc9bd19eee9f1dd52f4228292144ed1476a68c7110eaf715df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6c4894f89-z88s6" Dec 12 18:44:07.522885 kubelet[3062]: E1212 18:44:07.522847 3062 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"42224b159b0819fc9bd19eee9f1dd52f4228292144ed1476a68c7110eaf715df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6c4894f89-z88s6" Dec 12 18:44:07.522954 kubelet[3062]: E1212 18:44:07.522894 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6c4894f89-z88s6_calico-system(abfafbe4-68ce-455d-b3a3-cff627574124)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6c4894f89-z88s6_calico-system(abfafbe4-68ce-455d-b3a3-cff627574124)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"42224b159b0819fc9bd19eee9f1dd52f4228292144ed1476a68c7110eaf715df\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6c4894f89-z88s6" podUID="abfafbe4-68ce-455d-b3a3-cff627574124" Dec 12 18:44:07.522943 systemd[1]: run-netns-cni\x2d226f3405\x2d5637\x2d85d4\x2dd4ba\x2d96a5fcaf8a08.mount: Deactivated successfully. Dec 12 18:44:08.072500 systemd[1]: Created slice kubepods-besteffort-pod541d8bd6_57ea_4711_86e4_5819a7795d8f.slice - libcontainer container kubepods-besteffort-pod541d8bd6_57ea_4711_86e4_5819a7795d8f.slice. 
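After each CreatePodSandboxError the pod worker logs "Error syncing pod, skipping" and defers the pod to a later sync instead of spinning, retrying with growing delays until the CNI initializes (as it does once calico-node starts below). A generic Go sketch of that retry-with-backoff shape; the intervals are illustrative, not kubelet's actual tuning:

package main

import (
	"errors"
	"fmt"
	"sync/atomic"
	"time"
)

var errNotReady = errors.New("cni plugin not initialized")

// syncWithBackoff retries fn with doubling delays, capped at max — the
// shape of the pod-worker behavior above, not its real scheduler.
func syncWithBackoff(fn func() error, initial, max time.Duration, attempts int) error {
	delay := initial
	for i := 0; i < attempts; i++ {
		err := fn()
		if err == nil {
			return nil
		}
		fmt.Printf("Error syncing pod, skipping: %v (retry in %s)\n", err, delay)
		time.Sleep(delay)
		if delay *= 2; delay > max {
			delay = max
		}
	}
	return fmt.Errorf("giving up after %d attempts", attempts)
}

func main() {
	var ready atomic.Bool
	go func() { time.Sleep(300 * time.Millisecond); ready.Store(true) }() // CNI comes up later
	_ = syncWithBackoff(func() error {
		if !ready.Load() {
			return errNotReady
		}
		fmt.Println("sandbox created")
		return nil
	}, 100*time.Millisecond, time.Second, 10)
}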
Dec 12 18:44:08.074833 containerd[1769]: time="2025-12-12T18:44:08.074753891Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pls24,Uid:541d8bd6-57ea-4711-86e4-5819a7795d8f,Namespace:calico-system,Attempt:0,}" Dec 12 18:44:08.121224 containerd[1769]: time="2025-12-12T18:44:08.121167944Z" level=error msg="Failed to destroy network for sandbox \"acdc03be8605bee2cf7f95d304e23ecf1e44ecdeba38904d6a53581be37d1cc6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:08.123286 containerd[1769]: time="2025-12-12T18:44:08.123238252Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pls24,Uid:541d8bd6-57ea-4711-86e4-5819a7795d8f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"acdc03be8605bee2cf7f95d304e23ecf1e44ecdeba38904d6a53581be37d1cc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:08.123475 kubelet[3062]: E1212 18:44:08.123442 3062 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acdc03be8605bee2cf7f95d304e23ecf1e44ecdeba38904d6a53581be37d1cc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:44:08.124475 kubelet[3062]: E1212 18:44:08.123496 3062 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acdc03be8605bee2cf7f95d304e23ecf1e44ecdeba38904d6a53581be37d1cc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-pls24" Dec 12 18:44:08.124475 kubelet[3062]: E1212 18:44:08.123518 3062 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acdc03be8605bee2cf7f95d304e23ecf1e44ecdeba38904d6a53581be37d1cc6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-pls24" Dec 12 18:44:08.124475 kubelet[3062]: E1212 18:44:08.123567 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-pls24_calico-system(541d8bd6-57ea-4711-86e4-5819a7795d8f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-pls24_calico-system(541d8bd6-57ea-4711-86e4-5819a7795d8f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"acdc03be8605bee2cf7f95d304e23ecf1e44ecdeba38904d6a53581be37d1cc6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:44:08.136180 containerd[1769]: time="2025-12-12T18:44:08.136135380Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 12 18:44:08.491823 systemd[1]: run-netns-cni\x2da079bf4b\x2d4cc2\x2d0197\x2df58d\x2dd5a144452f2c.mount: Deactivated successfully. Dec 12 18:44:11.805366 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3423936048.mount: Deactivated successfully. Dec 12 18:44:11.825756 containerd[1769]: time="2025-12-12T18:44:11.825681893Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:44:11.826738 containerd[1769]: time="2025-12-12T18:44:11.826708503Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Dec 12 18:44:11.828273 containerd[1769]: time="2025-12-12T18:44:11.828246587Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:44:11.830503 containerd[1769]: time="2025-12-12T18:44:11.830452829Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:44:11.830884 containerd[1769]: time="2025-12-12T18:44:11.830837245Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 3.69465769s" Dec 12 18:44:11.830884 containerd[1769]: time="2025-12-12T18:44:11.830869077Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Dec 12 18:44:11.839243 containerd[1769]: time="2025-12-12T18:44:11.839198099Z" level=info msg="CreateContainer within sandbox \"a6f77ae20c16b8f01ff967c34da8a6c45896ff9964b43bd75dc4778c30d271a0\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 12 18:44:11.862091 containerd[1769]: time="2025-12-12T18:44:11.861427470Z" level=info msg="Container 569fc39c58a96913a9af49eda1bf41f1cadd7de1e762ba7442796776758a798b: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:44:11.872445 containerd[1769]: time="2025-12-12T18:44:11.872402945Z" level=info msg="CreateContainer within sandbox \"a6f77ae20c16b8f01ff967c34da8a6c45896ff9964b43bd75dc4778c30d271a0\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"569fc39c58a96913a9af49eda1bf41f1cadd7de1e762ba7442796776758a798b\"" Dec 12 18:44:11.873158 containerd[1769]: time="2025-12-12T18:44:11.873132928Z" level=info msg="StartContainer for \"569fc39c58a96913a9af49eda1bf41f1cadd7de1e762ba7442796776758a798b\"" Dec 12 18:44:11.874494 containerd[1769]: time="2025-12-12T18:44:11.874470887Z" level=info msg="connecting to shim 569fc39c58a96913a9af49eda1bf41f1cadd7de1e762ba7442796776758a798b" address="unix:///run/containerd/s/b7c778ce930b309857723dff04064cb8312486388016cc2aeb34ed381a1c3b10" protocol=ttrpc version=3 Dec 12 18:44:11.903319 systemd[1]: Started cri-containerd-569fc39c58a96913a9af49eda1bf41f1cadd7de1e762ba7442796776758a798b.scope - libcontainer container 569fc39c58a96913a9af49eda1bf41f1cadd7de1e762ba7442796776758a798b. 
Dec 12 18:44:11.992083 containerd[1769]: time="2025-12-12T18:44:11.991575967Z" level=info msg="StartContainer for \"569fc39c58a96913a9af49eda1bf41f1cadd7de1e762ba7442796776758a798b\" returns successfully" Dec 12 18:44:12.092179 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 12 18:44:12.092328 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Dec 12 18:44:12.177964 kubelet[3062]: I1212 18:44:12.177884 3062 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-fjmlz" podStartSLOduration=1.629128858 podStartE2EDuration="13.177867333s" podCreationTimestamp="2025-12-12 18:43:59 +0000 UTC" firstStartedPulling="2025-12-12 18:44:00.282822595 +0000 UTC m=+18.302559470" lastFinishedPulling="2025-12-12 18:44:11.83156107 +0000 UTC m=+29.851297945" observedRunningTime="2025-12-12 18:44:12.162256785 +0000 UTC m=+30.181993675" watchObservedRunningTime="2025-12-12 18:44:12.177867333 +0000 UTC m=+30.197604208" Dec 12 18:44:12.243778 kubelet[3062]: I1212 18:44:12.243590 3062 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgmbx\" (UniqueName: \"kubernetes.io/projected/abfafbe4-68ce-455d-b3a3-cff627574124-kube-api-access-mgmbx\") pod \"abfafbe4-68ce-455d-b3a3-cff627574124\" (UID: \"abfafbe4-68ce-455d-b3a3-cff627574124\") " Dec 12 18:44:12.243778 kubelet[3062]: I1212 18:44:12.243631 3062 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abfafbe4-68ce-455d-b3a3-cff627574124-whisker-ca-bundle\") pod \"abfafbe4-68ce-455d-b3a3-cff627574124\" (UID: \"abfafbe4-68ce-455d-b3a3-cff627574124\") " Dec 12 18:44:12.243778 kubelet[3062]: I1212 18:44:12.243659 3062 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/abfafbe4-68ce-455d-b3a3-cff627574124-whisker-backend-key-pair\") pod \"abfafbe4-68ce-455d-b3a3-cff627574124\" (UID: \"abfafbe4-68ce-455d-b3a3-cff627574124\") " Dec 12 18:44:12.244118 kubelet[3062]: I1212 18:44:12.244013 3062 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abfafbe4-68ce-455d-b3a3-cff627574124-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "abfafbe4-68ce-455d-b3a3-cff627574124" (UID: "abfafbe4-68ce-455d-b3a3-cff627574124"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 12 18:44:12.246340 kubelet[3062]: I1212 18:44:12.246306 3062 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abfafbe4-68ce-455d-b3a3-cff627574124-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "abfafbe4-68ce-455d-b3a3-cff627574124" (UID: "abfafbe4-68ce-455d-b3a3-cff627574124"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 12 18:44:12.246885 kubelet[3062]: I1212 18:44:12.246554 3062 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abfafbe4-68ce-455d-b3a3-cff627574124-kube-api-access-mgmbx" (OuterVolumeSpecName: "kube-api-access-mgmbx") pod "abfafbe4-68ce-455d-b3a3-cff627574124" (UID: "abfafbe4-68ce-455d-b3a3-cff627574124"). InnerVolumeSpecName "kube-api-access-mgmbx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 12 18:44:12.345380 kubelet[3062]: I1212 18:44:12.345048 3062 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/abfafbe4-68ce-455d-b3a3-cff627574124-whisker-backend-key-pair\") on node \"ci-4459-2-2-4-78a5f49b53\" DevicePath \"\"" Dec 12 18:44:12.345380 kubelet[3062]: I1212 18:44:12.345129 3062 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mgmbx\" (UniqueName: \"kubernetes.io/projected/abfafbe4-68ce-455d-b3a3-cff627574124-kube-api-access-mgmbx\") on node \"ci-4459-2-2-4-78a5f49b53\" DevicePath \"\"" Dec 12 18:44:12.345380 kubelet[3062]: I1212 18:44:12.345141 3062 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abfafbe4-68ce-455d-b3a3-cff627574124-whisker-ca-bundle\") on node \"ci-4459-2-2-4-78a5f49b53\" DevicePath \"\"" Dec 12 18:44:12.806200 systemd[1]: var-lib-kubelet-pods-abfafbe4\x2d68ce\x2d455d\x2db3a3\x2dcff627574124-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dmgmbx.mount: Deactivated successfully. Dec 12 18:44:12.806294 systemd[1]: var-lib-kubelet-pods-abfafbe4\x2d68ce\x2d455d\x2db3a3\x2dcff627574124-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 12 18:44:13.150340 systemd[1]: Removed slice kubepods-besteffort-podabfafbe4_68ce_455d_b3a3_cff627574124.slice - libcontainer container kubepods-besteffort-podabfafbe4_68ce_455d_b3a3_cff627574124.slice. Dec 12 18:44:13.201092 systemd[1]: Created slice kubepods-besteffort-podc1fa9bf4_cb97_4583_82c6_dc2a04ecc620.slice - libcontainer container kubepods-besteffort-podc1fa9bf4_cb97_4583_82c6_dc2a04ecc620.slice. Dec 12 18:44:13.251763 kubelet[3062]: I1212 18:44:13.251691 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjpc5\" (UniqueName: \"kubernetes.io/projected/c1fa9bf4-cb97-4583-82c6-dc2a04ecc620-kube-api-access-sjpc5\") pod \"whisker-5854486f4f-qw8k4\" (UID: \"c1fa9bf4-cb97-4583-82c6-dc2a04ecc620\") " pod="calico-system/whisker-5854486f4f-qw8k4" Dec 12 18:44:13.251763 kubelet[3062]: I1212 18:44:13.251769 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1fa9bf4-cb97-4583-82c6-dc2a04ecc620-whisker-ca-bundle\") pod \"whisker-5854486f4f-qw8k4\" (UID: \"c1fa9bf4-cb97-4583-82c6-dc2a04ecc620\") " pod="calico-system/whisker-5854486f4f-qw8k4" Dec 12 18:44:13.252232 kubelet[3062]: I1212 18:44:13.251858 3062 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c1fa9bf4-cb97-4583-82c6-dc2a04ecc620-whisker-backend-key-pair\") pod \"whisker-5854486f4f-qw8k4\" (UID: \"c1fa9bf4-cb97-4583-82c6-dc2a04ecc620\") " pod="calico-system/whisker-5854486f4f-qw8k4" Dec 12 18:44:13.505580 containerd[1769]: time="2025-12-12T18:44:13.505236898Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5854486f4f-qw8k4,Uid:c1fa9bf4-cb97-4583-82c6-dc2a04ecc620,Namespace:calico-system,Attempt:0,}" Dec 12 18:44:13.602006 systemd-networkd[1674]: cali09ef17c56e8: Link UP Dec 12 18:44:13.602197 systemd-networkd[1674]: cali09ef17c56e8: Gained carrier Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.541 [INFO][4535] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: 
&{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-eth0 whisker-5854486f4f- calico-system c1fa9bf4-cb97-4583-82c6-dc2a04ecc620 889 0 2025-12-12 18:44:13 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5854486f4f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-4-78a5f49b53 whisker-5854486f4f-qw8k4 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali09ef17c56e8 [] [] }} ContainerID="88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" Namespace="calico-system" Pod="whisker-5854486f4f-qw8k4" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-" Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.541 [INFO][4535] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" Namespace="calico-system" Pod="whisker-5854486f4f-qw8k4" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-eth0" Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.563 [INFO][4552] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" HandleID="k8s-pod-network.88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" Workload="ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-eth0" Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.563 [INFO][4552] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" HandleID="k8s-pod-network.88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" Workload="ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5820), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-4-78a5f49b53", "pod":"whisker-5854486f4f-qw8k4", "timestamp":"2025-12-12 18:44:13.563135916 +0000 UTC"}, Hostname:"ci-4459-2-2-4-78a5f49b53", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.563 [INFO][4552] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.563 [INFO][4552] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.563 [INFO][4552] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-78a5f49b53' Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.569 [INFO][4552] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.572 [INFO][4552] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.577 [INFO][4552] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.578 [INFO][4552] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.580 [INFO][4552] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.580 [INFO][4552] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.581 [INFO][4552] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6 Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.585 [INFO][4552] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.591 [INFO][4552] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.1/26] block=192.168.91.0/26 handle="k8s-pod-network.88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.591 [INFO][4552] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.1/26] handle="k8s-pod-network.88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.591 [INFO][4552] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 18:44:13.613593 containerd[1769]: 2025-12-12 18:44:13.591 [INFO][4552] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.1/26] IPv6=[] ContainerID="88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" HandleID="k8s-pod-network.88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" Workload="ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-eth0" Dec 12 18:44:13.614233 containerd[1769]: 2025-12-12 18:44:13.594 [INFO][4535] cni-plugin/k8s.go 418: Populated endpoint ContainerID="88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" Namespace="calico-system" Pod="whisker-5854486f4f-qw8k4" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-eth0", GenerateName:"whisker-5854486f4f-", Namespace:"calico-system", SelfLink:"", UID:"c1fa9bf4-cb97-4583-82c6-dc2a04ecc620", ResourceVersion:"889", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 44, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5854486f4f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"", Pod:"whisker-5854486f4f-qw8k4", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.91.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali09ef17c56e8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:13.614233 containerd[1769]: 2025-12-12 18:44:13.594 [INFO][4535] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.1/32] ContainerID="88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" Namespace="calico-system" Pod="whisker-5854486f4f-qw8k4" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-eth0" Dec 12 18:44:13.614233 containerd[1769]: 2025-12-12 18:44:13.595 [INFO][4535] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali09ef17c56e8 ContainerID="88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" Namespace="calico-system" Pod="whisker-5854486f4f-qw8k4" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-eth0" Dec 12 18:44:13.614233 containerd[1769]: 2025-12-12 18:44:13.601 [INFO][4535] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" Namespace="calico-system" Pod="whisker-5854486f4f-qw8k4" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-eth0" Dec 12 18:44:13.614233 containerd[1769]: 2025-12-12 18:44:13.601 [INFO][4535] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" 
Namespace="calico-system" Pod="whisker-5854486f4f-qw8k4" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-eth0", GenerateName:"whisker-5854486f4f-", Namespace:"calico-system", SelfLink:"", UID:"c1fa9bf4-cb97-4583-82c6-dc2a04ecc620", ResourceVersion:"889", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 44, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5854486f4f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6", Pod:"whisker-5854486f4f-qw8k4", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.91.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali09ef17c56e8", MAC:"ba:f2:71:1f:83:39", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:13.614233 containerd[1769]: 2025-12-12 18:44:13.612 [INFO][4535] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" Namespace="calico-system" Pod="whisker-5854486f4f-qw8k4" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-whisker--5854486f4f--qw8k4-eth0" Dec 12 18:44:13.643586 containerd[1769]: time="2025-12-12T18:44:13.643534842Z" level=info msg="connecting to shim 88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6" address="unix:///run/containerd/s/f873c423910a8a7c54dd2f906cad80dec6e49b1f49edd682db2996608c702360" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:44:13.657006 systemd-networkd[1674]: vxlan.calico: Link UP Dec 12 18:44:13.657015 systemd-networkd[1674]: vxlan.calico: Gained carrier Dec 12 18:44:13.679280 systemd[1]: Started cri-containerd-88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6.scope - libcontainer container 88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6. 
Dec 12 18:44:13.726874 containerd[1769]: time="2025-12-12T18:44:13.726810742Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5854486f4f-qw8k4,Uid:c1fa9bf4-cb97-4583-82c6-dc2a04ecc620,Namespace:calico-system,Attempt:0,} returns sandbox id \"88a7a848515d0deadaebb69d1eb9bd26f00b0225e8d3c0777881b0964ea31db6\"" Dec 12 18:44:13.728366 containerd[1769]: time="2025-12-12T18:44:13.728338713Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 18:44:14.033876 containerd[1769]: time="2025-12-12T18:44:14.033799332Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:14.035599 containerd[1769]: time="2025-12-12T18:44:14.035557823Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 18:44:14.035664 containerd[1769]: time="2025-12-12T18:44:14.035637407Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 18:44:14.035856 kubelet[3062]: E1212 18:44:14.035808 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:44:14.035901 kubelet[3062]: E1212 18:44:14.035874 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:44:14.036075 kubelet[3062]: E1212 18:44:14.036033 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8b05f8ca7479467c89608bbc825ffb49,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sjpc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5854486f4f-qw8k4_calico-system(c1fa9bf4-cb97-4583-82c6-dc2a04ecc620): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:14.038177 containerd[1769]: time="2025-12-12T18:44:14.038151905Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 18:44:14.069011 kubelet[3062]: I1212 18:44:14.068954 3062 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abfafbe4-68ce-455d-b3a3-cff627574124" path="/var/lib/kubelet/pods/abfafbe4-68ce-455d-b3a3-cff627574124/volumes" Dec 12 18:44:14.384643 containerd[1769]: time="2025-12-12T18:44:14.384435644Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:14.387386 containerd[1769]: time="2025-12-12T18:44:14.387325769Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 18:44:14.387495 containerd[1769]: time="2025-12-12T18:44:14.387404915Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 18:44:14.387618 kubelet[3062]: E1212 18:44:14.387578 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:44:14.387880 kubelet[3062]: E1212 18:44:14.387627 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc 
error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:44:14.387904 kubelet[3062]: E1212 18:44:14.387734 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sjpc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5854486f4f-qw8k4_calico-system(c1fa9bf4-cb97-4583-82c6-dc2a04ecc620): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:14.388962 kubelet[3062]: E1212 18:44:14.388911 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:44:14.621208 systemd-networkd[1674]: 
cali09ef17c56e8: Gained IPv6LL Dec 12 18:44:14.749253 systemd-networkd[1674]: vxlan.calico: Gained IPv6LL Dec 12 18:44:15.151808 kubelet[3062]: E1212 18:44:15.151741 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:44:19.067482 containerd[1769]: time="2025-12-12T18:44:19.067424282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2rb65,Uid:17892527-ab06-4143-9edf-e99eaea41942,Namespace:kube-system,Attempt:0,}" Dec 12 18:44:19.069711 containerd[1769]: time="2025-12-12T18:44:19.067488321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-pcfdn,Uid:77f6fa4e-0cea-4d83-8d55-8707dfa4d505,Namespace:calico-system,Attempt:0,}" Dec 12 18:44:19.069711 containerd[1769]: time="2025-12-12T18:44:19.067683064Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pls24,Uid:541d8bd6-57ea-4711-86e4-5819a7795d8f,Namespace:calico-system,Attempt:0,}" Dec 12 18:44:19.171633 systemd-networkd[1674]: cali5688346533d: Link UP Dec 12 18:44:19.172269 systemd-networkd[1674]: cali5688346533d: Gained carrier Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.111 [INFO][4751] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-eth0 goldmane-666569f655- calico-system 77f6fa4e-0cea-4d83-8d55-8707dfa4d505 821 0 2025-12-12 18:43:57 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-4-78a5f49b53 goldmane-666569f655-pcfdn eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali5688346533d [] [] }} ContainerID="ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" Namespace="calico-system" Pod="goldmane-666569f655-pcfdn" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-" Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.111 [INFO][4751] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" Namespace="calico-system" Pod="goldmane-666569f655-pcfdn" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-eth0" Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.135 [INFO][4796] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" 
HandleID="k8s-pod-network.ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" Workload="ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-eth0" Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.135 [INFO][4796] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" HandleID="k8s-pod-network.ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" Workload="ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003b4690), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-4-78a5f49b53", "pod":"goldmane-666569f655-pcfdn", "timestamp":"2025-12-12 18:44:19.135700748 +0000 UTC"}, Hostname:"ci-4459-2-2-4-78a5f49b53", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.135 [INFO][4796] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.136 [INFO][4796] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.136 [INFO][4796] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-78a5f49b53' Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.142 [INFO][4796] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.147 [INFO][4796] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.151 [INFO][4796] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.153 [INFO][4796] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.154 [INFO][4796] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.155 [INFO][4796] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.156 [INFO][4796] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1 Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.161 [INFO][4796] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.166 [INFO][4796] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.2/26] block=192.168.91.0/26 handle="k8s-pod-network.ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.166 
[INFO][4796] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.2/26] handle="k8s-pod-network.ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.166 [INFO][4796] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 18:44:19.189168 containerd[1769]: 2025-12-12 18:44:19.166 [INFO][4796] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.2/26] IPv6=[] ContainerID="ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" HandleID="k8s-pod-network.ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" Workload="ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-eth0" Dec 12 18:44:19.189720 containerd[1769]: 2025-12-12 18:44:19.169 [INFO][4751] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" Namespace="calico-system" Pod="goldmane-666569f655-pcfdn" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"77f6fa4e-0cea-4d83-8d55-8707dfa4d505", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 43, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"", Pod:"goldmane-666569f655-pcfdn", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.91.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5688346533d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:19.189720 containerd[1769]: 2025-12-12 18:44:19.169 [INFO][4751] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.2/32] ContainerID="ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" Namespace="calico-system" Pod="goldmane-666569f655-pcfdn" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-eth0" Dec 12 18:44:19.189720 containerd[1769]: 2025-12-12 18:44:19.169 [INFO][4751] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5688346533d ContainerID="ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" Namespace="calico-system" Pod="goldmane-666569f655-pcfdn" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-eth0" Dec 12 18:44:19.189720 containerd[1769]: 2025-12-12 18:44:19.176 [INFO][4751] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" Namespace="calico-system" Pod="goldmane-666569f655-pcfdn" 
WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-eth0" Dec 12 18:44:19.189720 containerd[1769]: 2025-12-12 18:44:19.177 [INFO][4751] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" Namespace="calico-system" Pod="goldmane-666569f655-pcfdn" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"77f6fa4e-0cea-4d83-8d55-8707dfa4d505", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 43, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1", Pod:"goldmane-666569f655-pcfdn", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.91.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5688346533d", MAC:"a6:be:fd:2f:ac:7d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:19.189720 containerd[1769]: 2025-12-12 18:44:19.187 [INFO][4751] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" Namespace="calico-system" Pod="goldmane-666569f655-pcfdn" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-goldmane--666569f655--pcfdn-eth0" Dec 12 18:44:19.221723 containerd[1769]: time="2025-12-12T18:44:19.221674978Z" level=info msg="connecting to shim ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1" address="unix:///run/containerd/s/dfb23efc9cef4cf50017e9b352b8d349909ffc836f9216538c246ae288c7074e" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:44:19.244293 systemd[1]: Started cri-containerd-ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1.scope - libcontainer container ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1. 
Dec 12 18:44:19.273162 systemd-networkd[1674]: cali68a0d8a8369: Link UP Dec 12 18:44:19.276207 systemd-networkd[1674]: cali68a0d8a8369: Gained carrier Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.107 [INFO][4739] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-eth0 coredns-668d6bf9bc- kube-system 17892527-ab06-4143-9edf-e99eaea41942 814 0 2025-12-12 18:43:48 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-4-78a5f49b53 coredns-668d6bf9bc-2rb65 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali68a0d8a8369 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" Namespace="kube-system" Pod="coredns-668d6bf9bc-2rb65" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-" Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.107 [INFO][4739] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" Namespace="kube-system" Pod="coredns-668d6bf9bc-2rb65" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-eth0" Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.137 [INFO][4789] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" HandleID="k8s-pod-network.5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" Workload="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-eth0" Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.137 [INFO][4789] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" HandleID="k8s-pod-network.5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" Workload="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fe30), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-4-78a5f49b53", "pod":"coredns-668d6bf9bc-2rb65", "timestamp":"2025-12-12 18:44:19.136997979 +0000 UTC"}, Hostname:"ci-4459-2-2-4-78a5f49b53", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.137 [INFO][4789] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.166 [INFO][4789] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.166 [INFO][4789] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-78a5f49b53' Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.243 [INFO][4789] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.248 [INFO][4789] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.252 [INFO][4789] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.254 [INFO][4789] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.255 [INFO][4789] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.255 [INFO][4789] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.257 [INFO][4789] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725 Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.260 [INFO][4789] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.266 [INFO][4789] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.3/26] block=192.168.91.0/26 handle="k8s-pod-network.5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.266 [INFO][4789] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.3/26] handle="k8s-pod-network.5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.267 [INFO][4789] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 18:44:19.286823 containerd[1769]: 2025-12-12 18:44:19.267 [INFO][4789] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.3/26] IPv6=[] ContainerID="5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" HandleID="k8s-pod-network.5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" Workload="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-eth0" Dec 12 18:44:19.287470 containerd[1769]: 2025-12-12 18:44:19.270 [INFO][4739] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" Namespace="kube-system" Pod="coredns-668d6bf9bc-2rb65" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"17892527-ab06-4143-9edf-e99eaea41942", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 43, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"", Pod:"coredns-668d6bf9bc-2rb65", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali68a0d8a8369", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:19.287470 containerd[1769]: 2025-12-12 18:44:19.270 [INFO][4739] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.3/32] ContainerID="5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" Namespace="kube-system" Pod="coredns-668d6bf9bc-2rb65" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-eth0" Dec 12 18:44:19.287470 containerd[1769]: 2025-12-12 18:44:19.270 [INFO][4739] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali68a0d8a8369 ContainerID="5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" Namespace="kube-system" Pod="coredns-668d6bf9bc-2rb65" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-eth0" Dec 12 18:44:19.287470 containerd[1769]: 2025-12-12 18:44:19.274 [INFO][4739] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-2rb65" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-eth0" Dec 12 18:44:19.287470 containerd[1769]: 2025-12-12 18:44:19.274 [INFO][4739] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" Namespace="kube-system" Pod="coredns-668d6bf9bc-2rb65" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"17892527-ab06-4143-9edf-e99eaea41942", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 43, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725", Pod:"coredns-668d6bf9bc-2rb65", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali68a0d8a8369", MAC:"86:37:42:83:2b:c1", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:19.287645 containerd[1769]: 2025-12-12 18:44:19.284 [INFO][4739] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" Namespace="kube-system" Pod="coredns-668d6bf9bc-2rb65" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--2rb65-eth0" Dec 12 18:44:19.298680 containerd[1769]: time="2025-12-12T18:44:19.298624706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-pcfdn,Uid:77f6fa4e-0cea-4d83-8d55-8707dfa4d505,Namespace:calico-system,Attempt:0,} returns sandbox id \"ac0c76d1893ad48d0fe575173a8da4a4e8c8adac7e445665d8bd620e105fb3e1\"" Dec 12 18:44:19.300037 containerd[1769]: time="2025-12-12T18:44:19.300004373Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 18:44:19.316060 containerd[1769]: time="2025-12-12T18:44:19.316003678Z" level=info msg="connecting to shim 5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725" address="unix:///run/containerd/s/49afa0b514d31b4929633bc25fdbc8689ad105fc77a6c038556c66b313a81e21" namespace=k8s.io protocol=ttrpc version=3 Dec 12 
18:44:19.346382 systemd[1]: Started cri-containerd-5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725.scope - libcontainer container 5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725. Dec 12 18:44:19.372363 systemd-networkd[1674]: cali34e0862199e: Link UP Dec 12 18:44:19.373155 systemd-networkd[1674]: cali34e0862199e: Gained carrier Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.116 [INFO][4762] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-eth0 csi-node-driver- calico-system 541d8bd6-57ea-4711-86e4-5819a7795d8f 716 0 2025-12-12 18:44:00 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-2-4-78a5f49b53 csi-node-driver-pls24 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali34e0862199e [] [] }} ContainerID="3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" Namespace="calico-system" Pod="csi-node-driver-pls24" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-" Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.116 [INFO][4762] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" Namespace="calico-system" Pod="csi-node-driver-pls24" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-eth0" Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.143 [INFO][4802] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" HandleID="k8s-pod-network.3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" Workload="ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-eth0" Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.143 [INFO][4802] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" HandleID="k8s-pod-network.3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" Workload="ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000502b60), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-4-78a5f49b53", "pod":"csi-node-driver-pls24", "timestamp":"2025-12-12 18:44:19.14350513 +0000 UTC"}, Hostname:"ci-4459-2-2-4-78a5f49b53", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.143 [INFO][4802] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.267 [INFO][4802] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.267 [INFO][4802] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-78a5f49b53' Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.344 [INFO][4802] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.348 [INFO][4802] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.353 [INFO][4802] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.355 [INFO][4802] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.357 [INFO][4802] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.357 [INFO][4802] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.358 [INFO][4802] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758 Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.362 [INFO][4802] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.369 [INFO][4802] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.4/26] block=192.168.91.0/26 handle="k8s-pod-network.3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.369 [INFO][4802] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.4/26] handle="k8s-pod-network.3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.369 [INFO][4802] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 18:44:19.384963 containerd[1769]: 2025-12-12 18:44:19.369 [INFO][4802] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.4/26] IPv6=[] ContainerID="3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" HandleID="k8s-pod-network.3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" Workload="ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-eth0" Dec 12 18:44:19.385510 containerd[1769]: 2025-12-12 18:44:19.370 [INFO][4762] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" Namespace="calico-system" Pod="csi-node-driver-pls24" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"541d8bd6-57ea-4711-86e4-5819a7795d8f", ResourceVersion:"716", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 44, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"", Pod:"csi-node-driver-pls24", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.91.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali34e0862199e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:19.385510 containerd[1769]: 2025-12-12 18:44:19.370 [INFO][4762] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.4/32] ContainerID="3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" Namespace="calico-system" Pod="csi-node-driver-pls24" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-eth0" Dec 12 18:44:19.385510 containerd[1769]: 2025-12-12 18:44:19.370 [INFO][4762] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali34e0862199e ContainerID="3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" Namespace="calico-system" Pod="csi-node-driver-pls24" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-eth0" Dec 12 18:44:19.385510 containerd[1769]: 2025-12-12 18:44:19.372 [INFO][4762] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" Namespace="calico-system" Pod="csi-node-driver-pls24" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-eth0" Dec 12 18:44:19.385510 containerd[1769]: 2025-12-12 18:44:19.372 [INFO][4762] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" Namespace="calico-system" Pod="csi-node-driver-pls24" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"541d8bd6-57ea-4711-86e4-5819a7795d8f", ResourceVersion:"716", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 44, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758", Pod:"csi-node-driver-pls24", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.91.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali34e0862199e", MAC:"fe:45:20:f7:39:32", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:19.385510 containerd[1769]: 2025-12-12 18:44:19.382 [INFO][4762] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" Namespace="calico-system" Pod="csi-node-driver-pls24" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-csi--node--driver--pls24-eth0" Dec 12 18:44:19.400987 containerd[1769]: time="2025-12-12T18:44:19.400921474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2rb65,Uid:17892527-ab06-4143-9edf-e99eaea41942,Namespace:kube-system,Attempt:0,} returns sandbox id \"5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725\"" Dec 12 18:44:19.404176 containerd[1769]: time="2025-12-12T18:44:19.404144027Z" level=info msg="CreateContainer within sandbox \"5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 18:44:19.415853 containerd[1769]: time="2025-12-12T18:44:19.415604698Z" level=info msg="connecting to shim 3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758" address="unix:///run/containerd/s/6ad9f3355cea66a61dfd12617fda88b8efc929bbde1ea9a7cbe03b7612b83782" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:44:19.419418 containerd[1769]: time="2025-12-12T18:44:19.419389689Z" level=info msg="Container 5fcef7e9c26768193c8b6dcaa83d345b3ed79b2e053b8a06dc52baf09ec55b12: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:44:19.428128 containerd[1769]: time="2025-12-12T18:44:19.428064842Z" level=info msg="CreateContainer within sandbox \"5b23a5a123d2ff0ba3bebc406bb1a6893d8b065ba78fe586a241193a49d2f725\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id 
\"5fcef7e9c26768193c8b6dcaa83d345b3ed79b2e053b8a06dc52baf09ec55b12\"" Dec 12 18:44:19.430219 containerd[1769]: time="2025-12-12T18:44:19.430187079Z" level=info msg="StartContainer for \"5fcef7e9c26768193c8b6dcaa83d345b3ed79b2e053b8a06dc52baf09ec55b12\"" Dec 12 18:44:19.430953 containerd[1769]: time="2025-12-12T18:44:19.430920615Z" level=info msg="connecting to shim 5fcef7e9c26768193c8b6dcaa83d345b3ed79b2e053b8a06dc52baf09ec55b12" address="unix:///run/containerd/s/49afa0b514d31b4929633bc25fdbc8689ad105fc77a6c038556c66b313a81e21" protocol=ttrpc version=3 Dec 12 18:44:19.444317 systemd[1]: Started cri-containerd-3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758.scope - libcontainer container 3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758. Dec 12 18:44:19.447601 systemd[1]: Started cri-containerd-5fcef7e9c26768193c8b6dcaa83d345b3ed79b2e053b8a06dc52baf09ec55b12.scope - libcontainer container 5fcef7e9c26768193c8b6dcaa83d345b3ed79b2e053b8a06dc52baf09ec55b12. Dec 12 18:44:19.469209 containerd[1769]: time="2025-12-12T18:44:19.469162848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pls24,Uid:541d8bd6-57ea-4711-86e4-5819a7795d8f,Namespace:calico-system,Attempt:0,} returns sandbox id \"3b3699baaf8a1cdb3efe842a5b6ecd9083a1ef3cc13ded03359babec33964758\"" Dec 12 18:44:19.477332 containerd[1769]: time="2025-12-12T18:44:19.477294626Z" level=info msg="StartContainer for \"5fcef7e9c26768193c8b6dcaa83d345b3ed79b2e053b8a06dc52baf09ec55b12\" returns successfully" Dec 12 18:44:19.629474 containerd[1769]: time="2025-12-12T18:44:19.629437192Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:19.632014 containerd[1769]: time="2025-12-12T18:44:19.631975292Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 18:44:19.632147 containerd[1769]: time="2025-12-12T18:44:19.632018592Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 18:44:19.632395 kubelet[3062]: E1212 18:44:19.632358 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:44:19.633178 kubelet[3062]: E1212 18:44:19.632410 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:44:19.633178 kubelet[3062]: E1212 18:44:19.632640 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wccvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-pcfdn_calico-system(77f6fa4e-0cea-4d83-8d55-8707dfa4d505): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:19.633299 containerd[1769]: time="2025-12-12T18:44:19.632693960Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 18:44:19.633871 kubelet[3062]: E1212 18:44:19.633846 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:44:19.994155 containerd[1769]: time="2025-12-12T18:44:19.994035497Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:19.996910 containerd[1769]: time="2025-12-12T18:44:19.996855000Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 18:44:19.997060 containerd[1769]: time="2025-12-12T18:44:19.996946453Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 18:44:19.997170 kubelet[3062]: E1212 18:44:19.997126 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:44:19.997214 kubelet[3062]: E1212 18:44:19.997184 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:44:19.997371 kubelet[3062]: E1212 18:44:19.997328 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8zv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-pls24_calico-system(541d8bd6-57ea-4711-86e4-5819a7795d8f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:19.999258 containerd[1769]: time="2025-12-12T18:44:19.999227416Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 18:44:20.164238 kubelet[3062]: E1212 18:44:20.164201 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:44:20.172972 kubelet[3062]: I1212 18:44:20.172911 3062 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-2rb65" podStartSLOduration=32.172894608 podStartE2EDuration="32.172894608s" podCreationTimestamp="2025-12-12 18:43:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 18:44:20.172769394 +0000 UTC m=+38.192506292" watchObservedRunningTime="2025-12-12 18:44:20.172894608 +0000 UTC m=+38.192631505" Dec 12 18:44:20.317284 systemd-networkd[1674]: cali5688346533d: Gained IPv6LL Dec 12 18:44:20.330225 containerd[1769]: time="2025-12-12T18:44:20.330158320Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:20.332120 containerd[1769]: time="2025-12-12T18:44:20.332043476Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 18:44:20.332120 containerd[1769]: time="2025-12-12T18:44:20.332103132Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 18:44:20.332521 kubelet[3062]: E1212 18:44:20.332432 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:44:20.332521 kubelet[3062]: E1212 18:44:20.332489 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:44:20.332696 kubelet[3062]: E1212 18:44:20.332619 3062 
kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8zv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pls24_calico-system(541d8bd6-57ea-4711-86e4-5819a7795d8f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:20.333904 kubelet[3062]: E1212 18:44:20.333847 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:44:21.067755 containerd[1769]: time="2025-12-12T18:44:21.067644343Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cb55bbfc-m7vzj,Uid:20d32882-7b09-425d-879e-55aa1742ac4a,Namespace:calico-apiserver,Attempt:0,}" Dec 12 18:44:21.067755 containerd[1769]: time="2025-12-12T18:44:21.067684062Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6678f769fb-bpdwv,Uid:0d411828-ce18-4a86-9fa9-09a812dd1345,Namespace:calico-system,Attempt:0,}" Dec 12 18:44:21.086241 systemd-networkd[1674]: cali34e0862199e: Gained IPv6LL Dec 12 18:44:21.166598 kubelet[3062]: E1212 18:44:21.166555 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:44:21.168418 kubelet[3062]: E1212 18:44:21.167125 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:44:21.180434 systemd-networkd[1674]: cali695397dafe3: Link UP Dec 12 18:44:21.180922 systemd-networkd[1674]: cali695397dafe3: Gained carrier Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.109 [INFO][5032] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-eth0 calico-apiserver-85cb55bbfc- calico-apiserver 20d32882-7b09-425d-879e-55aa1742ac4a 818 0 2025-12-12 18:43:55 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:85cb55bbfc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-4-78a5f49b53 calico-apiserver-85cb55bbfc-m7vzj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali695397dafe3 [] [] }} ContainerID="8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-m7vzj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-" Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.109 [INFO][5032] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-m7vzj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-eth0" Dec 12 18:44:21.195599 containerd[1769]: 
2025-12-12 18:44:21.134 [INFO][5065] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" HandleID="k8s-pod-network.8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" Workload="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-eth0" Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.135 [INFO][5065] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" HandleID="k8s-pod-network.8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" Workload="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0005aa4b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-4-78a5f49b53", "pod":"calico-apiserver-85cb55bbfc-m7vzj", "timestamp":"2025-12-12 18:44:21.134954436 +0000 UTC"}, Hostname:"ci-4459-2-2-4-78a5f49b53", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.135 [INFO][5065] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.135 [INFO][5065] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.135 [INFO][5065] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-78a5f49b53' Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.145 [INFO][5065] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.151 [INFO][5065] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.156 [INFO][5065] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.158 [INFO][5065] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.160 [INFO][5065] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.160 [INFO][5065] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.162 [INFO][5065] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9 Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.166 [INFO][5065] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.174 [INFO][5065] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.5/26] 
block=192.168.91.0/26 handle="k8s-pod-network.8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.174 [INFO][5065] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.5/26] handle="k8s-pod-network.8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.174 [INFO][5065] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 18:44:21.195599 containerd[1769]: 2025-12-12 18:44:21.174 [INFO][5065] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.5/26] IPv6=[] ContainerID="8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" HandleID="k8s-pod-network.8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" Workload="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-eth0" Dec 12 18:44:21.196201 containerd[1769]: 2025-12-12 18:44:21.177 [INFO][5032] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-m7vzj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-eth0", GenerateName:"calico-apiserver-85cb55bbfc-", Namespace:"calico-apiserver", SelfLink:"", UID:"20d32882-7b09-425d-879e-55aa1742ac4a", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 43, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85cb55bbfc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"", Pod:"calico-apiserver-85cb55bbfc-m7vzj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali695397dafe3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:21.196201 containerd[1769]: 2025-12-12 18:44:21.177 [INFO][5032] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.5/32] ContainerID="8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-m7vzj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-eth0" Dec 12 18:44:21.196201 containerd[1769]: 2025-12-12 18:44:21.177 [INFO][5032] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali695397dafe3 ContainerID="8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" Namespace="calico-apiserver" 
Pod="calico-apiserver-85cb55bbfc-m7vzj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-eth0" Dec 12 18:44:21.196201 containerd[1769]: 2025-12-12 18:44:21.181 [INFO][5032] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-m7vzj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-eth0" Dec 12 18:44:21.196201 containerd[1769]: 2025-12-12 18:44:21.181 [INFO][5032] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-m7vzj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-eth0", GenerateName:"calico-apiserver-85cb55bbfc-", Namespace:"calico-apiserver", SelfLink:"", UID:"20d32882-7b09-425d-879e-55aa1742ac4a", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 43, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85cb55bbfc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9", Pod:"calico-apiserver-85cb55bbfc-m7vzj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali695397dafe3", MAC:"c2:89:c9:22:bb:d7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:21.196201 containerd[1769]: 2025-12-12 18:44:21.193 [INFO][5032] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-m7vzj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--m7vzj-eth0" Dec 12 18:44:21.221839 containerd[1769]: time="2025-12-12T18:44:21.221780150Z" level=info msg="connecting to shim 8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9" address="unix:///run/containerd/s/11c244d77243bd12867f989c65e7ee5ce93e6036a3092cfe5c3cedbc6da6cd9f" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:44:21.254340 systemd[1]: Started cri-containerd-8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9.scope - libcontainer container 8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9. 
Dec 12 18:44:21.277239 systemd-networkd[1674]: cali68a0d8a8369: Gained IPv6LL Dec 12 18:44:21.281282 systemd-networkd[1674]: cali24a7bb65356: Link UP Dec 12 18:44:21.281408 systemd-networkd[1674]: cali24a7bb65356: Gained carrier Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.107 [INFO][5038] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-eth0 calico-kube-controllers-6678f769fb- calico-system 0d411828-ce18-4a86-9fa9-09a812dd1345 824 0 2025-12-12 18:44:00 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6678f769fb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-4-78a5f49b53 calico-kube-controllers-6678f769fb-bpdwv eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali24a7bb65356 [] [] }} ContainerID="8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" Namespace="calico-system" Pod="calico-kube-controllers-6678f769fb-bpdwv" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-" Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.107 [INFO][5038] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" Namespace="calico-system" Pod="calico-kube-controllers-6678f769fb-bpdwv" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-eth0" Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.135 [INFO][5067] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" HandleID="k8s-pod-network.8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" Workload="ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-eth0" Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.135 [INFO][5067] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" HandleID="k8s-pod-network.8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" Workload="ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e7d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-4-78a5f49b53", "pod":"calico-kube-controllers-6678f769fb-bpdwv", "timestamp":"2025-12-12 18:44:21.135140158 +0000 UTC"}, Hostname:"ci-4459-2-2-4-78a5f49b53", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.135 [INFO][5067] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.174 [INFO][5067] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
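Each of the ipam/ipam.go sequences above follows the same shape: take the host-wide IPAM lock, confirm the node's affinity for the block 192.168.91.0/26, claim one address from it, release the lock. The arithmetic behind those assignments is small enough to check directly; the sketch below (standard library only) confirms the block size and that the /32s handed out in this log fall inside the node's block.

    // blockcheck.go - sanity-check the Calico IPAM block seen in the log:
    // a /26 leaves 6 host bits, so each block holds 64 addresses, and the
    // per-pod /32 assignments all sit inside 192.168.91.0/26.
    package main

    import (
        "fmt"
        "net/netip"
    )

    func main() {
        block := netip.MustParsePrefix("192.168.91.0/26")

        hostBits := 32 - block.Bits() // 32 - 26 = 6 host bits
        fmt.Printf("block %s holds %d addresses\n", block, 1<<hostBits)

        // Addresses assigned in this log: csi-node-driver, the two
        // calico-apiserver pods, and calico-kube-controllers.
        for _, s := range []string{"192.168.91.4", "192.168.91.5", "192.168.91.6", "192.168.91.7"} {
            fmt.Printf("%s inside %s: %v\n", s, block, block.Contains(netip.MustParseAddr(s)))
        }
    }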
Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.174 [INFO][5067] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-78a5f49b53' Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.243 [INFO][5067] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.253 [INFO][5067] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.257 [INFO][5067] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.259 [INFO][5067] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.261 [INFO][5067] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.261 [INFO][5067] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.264 [INFO][5067] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1 Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.269 [INFO][5067] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.277 [INFO][5067] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.6/26] block=192.168.91.0/26 handle="k8s-pod-network.8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.277 [INFO][5067] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.6/26] handle="k8s-pod-network.8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.277 [INFO][5067] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
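Once a container hits ErrImagePull, later pod syncs report ImagePullBackOff instead (compare the 18:44:19.633 and 18:44:20.164 goldmane entries): the kubelet does not retry the pull on every sync but waits out a doubling back-off. The sketch below models that schedule; the 10s initial delay and 5m ceiling are the kubelet's stock defaults, assumed here rather than read from this node's configuration.

    // backoff.go - model of the doubling image-pull back-off behind the
    // ImagePullBackOff entries above. Initial delay and cap are assumed
    // kubelet defaults (10s doubling up to 5m), not values from this host.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 10 * time.Second
        maxDelay := 5 * time.Minute
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("attempt %d: wait %v before next pull\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

That schedule is why, after the initial burst of 404s, the log settles into periodic "Back-off pulling image" lines rather than continuous pull traffic to ghcr.io.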
Dec 12 18:44:21.294379 containerd[1769]: 2025-12-12 18:44:21.277 [INFO][5067] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.6/26] IPv6=[] ContainerID="8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" HandleID="k8s-pod-network.8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" Workload="ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-eth0" Dec 12 18:44:21.294933 containerd[1769]: 2025-12-12 18:44:21.279 [INFO][5038] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" Namespace="calico-system" Pod="calico-kube-controllers-6678f769fb-bpdwv" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-eth0", GenerateName:"calico-kube-controllers-6678f769fb-", Namespace:"calico-system", SelfLink:"", UID:"0d411828-ce18-4a86-9fa9-09a812dd1345", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 44, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6678f769fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"", Pod:"calico-kube-controllers-6678f769fb-bpdwv", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.91.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali24a7bb65356", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:21.294933 containerd[1769]: 2025-12-12 18:44:21.279 [INFO][5038] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.6/32] ContainerID="8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" Namespace="calico-system" Pod="calico-kube-controllers-6678f769fb-bpdwv" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-eth0" Dec 12 18:44:21.294933 containerd[1769]: 2025-12-12 18:44:21.279 [INFO][5038] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali24a7bb65356 ContainerID="8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" Namespace="calico-system" Pod="calico-kube-controllers-6678f769fb-bpdwv" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-eth0" Dec 12 18:44:21.294933 containerd[1769]: 2025-12-12 18:44:21.281 [INFO][5038] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" Namespace="calico-system" Pod="calico-kube-controllers-6678f769fb-bpdwv" 
WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-eth0" Dec 12 18:44:21.294933 containerd[1769]: 2025-12-12 18:44:21.281 [INFO][5038] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" Namespace="calico-system" Pod="calico-kube-controllers-6678f769fb-bpdwv" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-eth0", GenerateName:"calico-kube-controllers-6678f769fb-", Namespace:"calico-system", SelfLink:"", UID:"0d411828-ce18-4a86-9fa9-09a812dd1345", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 44, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6678f769fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1", Pod:"calico-kube-controllers-6678f769fb-bpdwv", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.91.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali24a7bb65356", MAC:"16:e0:f6:76:85:43", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:21.294933 containerd[1769]: 2025-12-12 18:44:21.292 [INFO][5038] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" Namespace="calico-system" Pod="calico-kube-controllers-6678f769fb-bpdwv" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--kube--controllers--6678f769fb--bpdwv-eth0" Dec 12 18:44:21.307032 containerd[1769]: time="2025-12-12T18:44:21.306977124Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cb55bbfc-m7vzj,Uid:20d32882-7b09-425d-879e-55aa1742ac4a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8530cf0b6d962ea03a986f7a71b8c19148d6a4912099b5c216d8e26ed1ed97f9\"" Dec 12 18:44:21.308360 containerd[1769]: time="2025-12-12T18:44:21.308307560Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:44:21.327832 containerd[1769]: time="2025-12-12T18:44:21.327639729Z" level=info msg="connecting to shim 8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1" address="unix:///run/containerd/s/73daffee9be866caa584cf52b55a6227d96d48c07fc215ba9ca2fb46ab1de612" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:44:21.351349 systemd[1]: Started cri-containerd-8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1.scope - libcontainer container 
8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1. Dec 12 18:44:21.396841 containerd[1769]: time="2025-12-12T18:44:21.396763731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6678f769fb-bpdwv,Uid:0d411828-ce18-4a86-9fa9-09a812dd1345,Namespace:calico-system,Attempt:0,} returns sandbox id \"8740731b3176f2b3d313d2df3665ec3569e4223fbfbaf28f8f3e432c630eb8e1\"" Dec 12 18:44:21.641345 containerd[1769]: time="2025-12-12T18:44:21.641269260Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:21.644316 containerd[1769]: time="2025-12-12T18:44:21.644271985Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:44:21.644445 containerd[1769]: time="2025-12-12T18:44:21.644407587Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:44:21.644640 kubelet[3062]: E1212 18:44:21.644553 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:44:21.644640 kubelet[3062]: E1212 18:44:21.644625 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:44:21.644946 kubelet[3062]: E1212 18:44:21.644885 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nch2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-85cb55bbfc-m7vzj_calico-apiserver(20d32882-7b09-425d-879e-55aa1742ac4a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:21.645169 containerd[1769]: time="2025-12-12T18:44:21.645127968Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 18:44:21.646415 kubelet[3062]: E1212 18:44:21.646372 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:44:21.976949 containerd[1769]: time="2025-12-12T18:44:21.976794255Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:21.979263 containerd[1769]: time="2025-12-12T18:44:21.979195173Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 18:44:21.979355 containerd[1769]: time="2025-12-12T18:44:21.979273606Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 18:44:21.979531 kubelet[3062]: E1212 18:44:21.979483 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:44:21.979578 kubelet[3062]: E1212 18:44:21.979540 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:44:21.979746 kubelet[3062]: E1212 18:44:21.979695 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ldgzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6678f769fb-bpdwv_calico-system(0d411828-ce18-4a86-9fa9-09a812dd1345): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:21.981166 kubelet[3062]: E1212 18:44:21.981106 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:44:22.068620 containerd[1769]: time="2025-12-12T18:44:22.068580514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cb55bbfc-bnqwj,Uid:590a3c9e-ec4d-4eb7-8a94-169742d9929c,Namespace:calico-apiserver,Attempt:0,}" Dec 12 18:44:22.069052 containerd[1769]: time="2025-12-12T18:44:22.068639661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-lt2zp,Uid:e01e5f95-946a-4bdc-b90f-58fb4afdeffd,Namespace:kube-system,Attempt:0,}" Dec 12 18:44:22.170430 kubelet[3062]: E1212 18:44:22.170383 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:44:22.181307 kubelet[3062]: E1212 18:44:22.181237 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:44:22.185023 systemd-networkd[1674]: calia22c7327ac7: Link UP Dec 12 18:44:22.187221 systemd-networkd[1674]: calia22c7327ac7: Gained carrier Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.107 [INFO][5197] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-eth0 calico-apiserver-85cb55bbfc- calico-apiserver 590a3c9e-ec4d-4eb7-8a94-169742d9929c 825 0 2025-12-12 18:43:55 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:85cb55bbfc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-4-78a5f49b53 calico-apiserver-85cb55bbfc-bnqwj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia22c7327ac7 [] [] }} ContainerID="15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-bnqwj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-" Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.108 [INFO][5197] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-bnqwj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-eth0" Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 
18:44:22.132 [INFO][5232] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" HandleID="k8s-pod-network.15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" Workload="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-eth0" Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.132 [INFO][5232] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" HandleID="k8s-pod-network.15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" Workload="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f630), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-4-78a5f49b53", "pod":"calico-apiserver-85cb55bbfc-bnqwj", "timestamp":"2025-12-12 18:44:22.13222443 +0000 UTC"}, Hostname:"ci-4459-2-2-4-78a5f49b53", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.132 [INFO][5232] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.132 [INFO][5232] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.132 [INFO][5232] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-78a5f49b53' Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.139 [INFO][5232] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.143 [INFO][5232] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.148 [INFO][5232] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.150 [INFO][5232] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.152 [INFO][5232] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.152 [INFO][5232] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.154 [INFO][5232] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46 Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.158 [INFO][5232] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.175 [INFO][5232] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.7/26] 
block=192.168.91.0/26 handle="k8s-pod-network.15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.175 [INFO][5232] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.7/26] handle="k8s-pod-network.15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.175 [INFO][5232] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 18:44:22.213358 containerd[1769]: 2025-12-12 18:44:22.175 [INFO][5232] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.7/26] IPv6=[] ContainerID="15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" HandleID="k8s-pod-network.15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" Workload="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-eth0" Dec 12 18:44:22.213911 containerd[1769]: 2025-12-12 18:44:22.181 [INFO][5197] cni-plugin/k8s.go 418: Populated endpoint ContainerID="15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-bnqwj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-eth0", GenerateName:"calico-apiserver-85cb55bbfc-", Namespace:"calico-apiserver", SelfLink:"", UID:"590a3c9e-ec4d-4eb7-8a94-169742d9929c", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 43, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85cb55bbfc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"", Pod:"calico-apiserver-85cb55bbfc-bnqwj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia22c7327ac7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:22.213911 containerd[1769]: 2025-12-12 18:44:22.181 [INFO][5197] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.7/32] ContainerID="15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-bnqwj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-eth0" Dec 12 18:44:22.213911 containerd[1769]: 2025-12-12 18:44:22.181 [INFO][5197] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia22c7327ac7 ContainerID="15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" Namespace="calico-apiserver" 
Pod="calico-apiserver-85cb55bbfc-bnqwj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-eth0" Dec 12 18:44:22.213911 containerd[1769]: 2025-12-12 18:44:22.183 [INFO][5197] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-bnqwj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-eth0" Dec 12 18:44:22.213911 containerd[1769]: 2025-12-12 18:44:22.191 [INFO][5197] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-bnqwj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-eth0", GenerateName:"calico-apiserver-85cb55bbfc-", Namespace:"calico-apiserver", SelfLink:"", UID:"590a3c9e-ec4d-4eb7-8a94-169742d9929c", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 43, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85cb55bbfc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46", Pod:"calico-apiserver-85cb55bbfc-bnqwj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia22c7327ac7", MAC:"e2:0f:8c:f5:b1:c4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:22.213911 containerd[1769]: 2025-12-12 18:44:22.210 [INFO][5197] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" Namespace="calico-apiserver" Pod="calico-apiserver-85cb55bbfc-bnqwj" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-calico--apiserver--85cb55bbfc--bnqwj-eth0" Dec 12 18:44:22.239590 containerd[1769]: time="2025-12-12T18:44:22.239476805Z" level=info msg="connecting to shim 15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46" address="unix:///run/containerd/s/f90dfa3033b8afb2d2cd5f32d9dcf1029f474d5ce04ab004fdacef41905e8e70" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:44:22.271669 systemd[1]: Started cri-containerd-15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46.scope - libcontainer container 15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46. 
Dec 12 18:44:22.272942 systemd-networkd[1674]: cali2b326fc4c30: Link UP Dec 12 18:44:22.273763 systemd-networkd[1674]: cali2b326fc4c30: Gained carrier Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.109 [INFO][5204] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-eth0 coredns-668d6bf9bc- kube-system e01e5f95-946a-4bdc-b90f-58fb4afdeffd 822 0 2025-12-12 18:43:48 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-4-78a5f49b53 coredns-668d6bf9bc-lt2zp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali2b326fc4c30 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" Namespace="kube-system" Pod="coredns-668d6bf9bc-lt2zp" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-" Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.109 [INFO][5204] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" Namespace="kube-system" Pod="coredns-668d6bf9bc-lt2zp" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-eth0" Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.132 [INFO][5234] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" HandleID="k8s-pod-network.8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" Workload="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-eth0" Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.132 [INFO][5234] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" HandleID="k8s-pod-network.8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" Workload="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000692410), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-4-78a5f49b53", "pod":"coredns-668d6bf9bc-lt2zp", "timestamp":"2025-12-12 18:44:22.132798554 +0000 UTC"}, Hostname:"ci-4459-2-2-4-78a5f49b53", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.132 [INFO][5234] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.175 [INFO][5234] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
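Worth noting in the two IPAM traces: request [5234] (coredns) logs "About to acquire host-wide IPAM lock" at 18:44:22.132 but "Acquired" only at 18:44:22.175, the same instant request [5232] (apiserver) releases it, so concurrent CNI ADDs on a node are serialized. The wait can be read straight off the timestamps; a tiny sketch, assuming the plugin's timestamp format:

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S.%f"
# Timestamps copied from the [5234] ipam_plugin.go lines above.
requested = datetime.strptime("2025-12-12 18:44:22.132", FMT)  # "About to acquire"
acquired  = datetime.strptime("2025-12-12 18:44:22.175", FMT)  # "Acquired"
print(f"lock wait: {(acquired - requested).total_seconds() * 1000:.0f} ms")  # 43 ms
```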
Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.177 [INFO][5234] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-78a5f49b53' Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.240 [INFO][5234] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.247 [INFO][5234] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.251 [INFO][5234] ipam/ipam.go 511: Trying affinity for 192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.253 [INFO][5234] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.255 [INFO][5234] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.0/26 host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.255 [INFO][5234] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.91.0/26 handle="k8s-pod-network.8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.257 [INFO][5234] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62 Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.261 [INFO][5234] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.91.0/26 handle="k8s-pod-network.8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.268 [INFO][5234] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.91.8/26] block=192.168.91.0/26 handle="k8s-pod-network.8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.268 [INFO][5234] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.8/26] handle="k8s-pod-network.8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" host="ci-4459-2-2-4-78a5f49b53" Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.268 [INFO][5234] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
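The coredns pod then receives the next address in the same block, 192.168.91.8/26, immediately after the apiserver pod took .7, consistent with a lowest-free-ordinal scan of the block. A toy model of that scan (which lower addresses are already in use is assumed for illustration; the log only shows .7 and .8 being claimed here):

```python
import ipaddress

block = ipaddress.ip_network("192.168.91.0/26")
# Assumed for illustration: .1-.7 already claimed (the log only shows .7 and .8).
in_use = {ipaddress.ip_address(f"192.168.91.{n}") for n in range(1, 8)}

# Lowest-free-ordinal scan: the first unclaimed address goes to the next pod.
next_ip = next(ip for ip in block.hosts() if ip not in in_use)
print(next_ip)  # 192.168.91.8, matching the assignment logged above
```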
Dec 12 18:44:22.286181 containerd[1769]: 2025-12-12 18:44:22.268 [INFO][5234] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.91.8/26] IPv6=[] ContainerID="8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" HandleID="k8s-pod-network.8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" Workload="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-eth0" Dec 12 18:44:22.286763 containerd[1769]: 2025-12-12 18:44:22.269 [INFO][5204] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" Namespace="kube-system" Pod="coredns-668d6bf9bc-lt2zp" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e01e5f95-946a-4bdc-b90f-58fb4afdeffd", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 43, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"", Pod:"coredns-668d6bf9bc-lt2zp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2b326fc4c30", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:22.286763 containerd[1769]: 2025-12-12 18:44:22.270 [INFO][5204] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.8/32] ContainerID="8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" Namespace="kube-system" Pod="coredns-668d6bf9bc-lt2zp" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-eth0" Dec 12 18:44:22.286763 containerd[1769]: 2025-12-12 18:44:22.270 [INFO][5204] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2b326fc4c30 ContainerID="8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" Namespace="kube-system" Pod="coredns-668d6bf9bc-lt2zp" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-eth0" Dec 12 18:44:22.286763 containerd[1769]: 2025-12-12 18:44:22.274 [INFO][5204] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-lt2zp" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-eth0" Dec 12 18:44:22.286763 containerd[1769]: 2025-12-12 18:44:22.274 [INFO][5204] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" Namespace="kube-system" Pod="coredns-668d6bf9bc-lt2zp" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"e01e5f95-946a-4bdc-b90f-58fb4afdeffd", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 43, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-78a5f49b53", ContainerID:"8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62", Pod:"coredns-668d6bf9bc-lt2zp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2b326fc4c30", MAC:"ee:34:e3:98:4a:c4", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:44:22.286935 containerd[1769]: 2025-12-12 18:44:22.283 [INFO][5204] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" Namespace="kube-system" Pod="coredns-668d6bf9bc-lt2zp" WorkloadEndpoint="ci--4459--2--2--4--78a5f49b53-k8s-coredns--668d6bf9bc--lt2zp-eth0" Dec 12 18:44:22.313871 containerd[1769]: time="2025-12-12T18:44:22.313818126Z" level=info msg="connecting to shim 8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" address="unix:///run/containerd/s/ac0efaedae4fd891dab046d5ad4ff17522f99e7b39c456de73d37b64d4697e90" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:44:22.320977 containerd[1769]: time="2025-12-12T18:44:22.320860411Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cb55bbfc-bnqwj,Uid:590a3c9e-ec4d-4eb7-8a94-169742d9929c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"15d02494c9d2a952f3adf9bb102a211114b058910894221d8e9bd1247cbb7f46\"" Dec 12 18:44:22.328603 containerd[1769]: time="2025-12-12T18:44:22.328497822Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 
12 18:44:22.349347 systemd[1]: Started cri-containerd-8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62.scope - libcontainer container 8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62. Dec 12 18:44:22.394158 containerd[1769]: time="2025-12-12T18:44:22.394043225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-lt2zp,Uid:e01e5f95-946a-4bdc-b90f-58fb4afdeffd,Namespace:kube-system,Attempt:0,} returns sandbox id \"8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62\"" Dec 12 18:44:22.396900 containerd[1769]: time="2025-12-12T18:44:22.396480200Z" level=info msg="CreateContainer within sandbox \"8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 18:44:22.410808 containerd[1769]: time="2025-12-12T18:44:22.410757451Z" level=info msg="Container 8481ef950729deddc546a5441387071dcbf1e5e57c0cd08cb5784d14b2f19b0d: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:44:22.420568 containerd[1769]: time="2025-12-12T18:44:22.420517749Z" level=info msg="CreateContainer within sandbox \"8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8481ef950729deddc546a5441387071dcbf1e5e57c0cd08cb5784d14b2f19b0d\"" Dec 12 18:44:22.421326 containerd[1769]: time="2025-12-12T18:44:22.421091301Z" level=info msg="StartContainer for \"8481ef950729deddc546a5441387071dcbf1e5e57c0cd08cb5784d14b2f19b0d\"" Dec 12 18:44:22.422175 containerd[1769]: time="2025-12-12T18:44:22.422142956Z" level=info msg="connecting to shim 8481ef950729deddc546a5441387071dcbf1e5e57c0cd08cb5784d14b2f19b0d" address="unix:///run/containerd/s/ac0efaedae4fd891dab046d5ad4ff17522f99e7b39c456de73d37b64d4697e90" protocol=ttrpc version=3 Dec 12 18:44:22.442356 systemd[1]: Started cri-containerd-8481ef950729deddc546a5441387071dcbf1e5e57c0cd08cb5784d14b2f19b0d.scope - libcontainer container 8481ef950729deddc546a5441387071dcbf1e5e57c0cd08cb5784d14b2f19b0d. 
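Both the coredns sandbox (8fe25f…) and the coredns container started inside it (8481ef…) connect through the same shim socket under /run/containerd/s/, while the apiserver sandbox (15d024…) got its own, so grouping the "connecting to shim" messages by socket address recovers the sandbox-to-container pairing. A sketch assuming only the message format shown above:

```python
import re

# Two "connecting to shim" messages copied from the log above.
lines = [
    'msg="connecting to shim 8fe25f13385a9281a364e1d1caebbcb540e07578593eeb4a0c0768a58102ea62" address="unix:///run/containerd/s/ac0efaedae4fd891dab046d5ad4ff17522f99e7b39c456de73d37b64d4697e90"',
    'msg="connecting to shim 8481ef950729deddc546a5441387071dcbf1e5e57c0cd08cb5784d14b2f19b0d" address="unix:///run/containerd/s/ac0efaedae4fd891dab046d5ad4ff17522f99e7b39c456de73d37b64d4697e90"',
]

pat = re.compile(r'connecting to shim (?P<id>[0-9a-f]{64})" address="(?P<addr>unix://[^"]+)"')
by_socket = {}
for line in lines:
    m = pat.search(line)
    by_socket.setdefault(m["addr"], []).append(m["id"][:12])

for addr, ids in by_socket.items():
    print(addr.rsplit("/", 1)[-1][:12], "->", ids)  # one socket, two IDs: same pod
```

The `protocol=ttrpc version=3` field in the same messages identifies the shim API generation in use.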
Dec 12 18:44:22.470425 containerd[1769]: time="2025-12-12T18:44:22.470378205Z" level=info msg="StartContainer for \"8481ef950729deddc546a5441387071dcbf1e5e57c0cd08cb5784d14b2f19b0d\" returns successfully" Dec 12 18:44:22.557297 systemd-networkd[1674]: cali695397dafe3: Gained IPv6LL Dec 12 18:44:22.667460 containerd[1769]: time="2025-12-12T18:44:22.667388540Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:22.669287 containerd[1769]: time="2025-12-12T18:44:22.669252149Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:44:22.669343 containerd[1769]: time="2025-12-12T18:44:22.669290033Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:44:22.669558 kubelet[3062]: E1212 18:44:22.669506 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:44:22.669602 kubelet[3062]: E1212 18:44:22.669565 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:44:22.669724 kubelet[3062]: E1212 18:44:22.669689 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-278rl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-85cb55bbfc-bnqwj_calico-apiserver(590a3c9e-ec4d-4eb7-8a94-169742d9929c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:22.671054 kubelet[3062]: E1212 18:44:22.671009 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:44:23.174494 kubelet[3062]: E1212 18:44:23.174453 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:44:23.176783 kubelet[3062]: E1212 18:44:23.176722 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:44:23.177025 kubelet[3062]: E1212 18:44:23.177007 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:44:23.197267 systemd-networkd[1674]: cali24a7bb65356: Gained IPv6LL Dec 12 18:44:23.211644 kubelet[3062]: I1212 18:44:23.211571 3062 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-lt2zp" podStartSLOduration=35.211552766 podStartE2EDuration="35.211552766s" podCreationTimestamp="2025-12-12 18:43:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 18:44:23.211271686 +0000 UTC m=+41.231008578" watchObservedRunningTime="2025-12-12 18:44:23.211552766 +0000 UTC m=+41.231289661" Dec 12 18:44:23.325259 systemd-networkd[1674]: calia22c7327ac7: Gained IPv6LL Dec 12 18:44:23.389257 systemd-networkd[1674]: cali2b326fc4c30: Gained IPv6LL Dec 12 18:44:24.178470 kubelet[3062]: E1212 18:44:24.178428 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:44:30.068512 containerd[1769]: time="2025-12-12T18:44:30.068456109Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 18:44:30.399756 containerd[1769]: time="2025-12-12T18:44:30.399695306Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:30.401652 containerd[1769]: time="2025-12-12T18:44:30.401587467Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 18:44:30.401652 containerd[1769]: time="2025-12-12T18:44:30.401634543Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 18:44:30.401867 kubelet[3062]: E1212 18:44:30.401819 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:44:30.402203 kubelet[3062]: E1212 18:44:30.401874 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:44:30.402203 kubelet[3062]: E1212 18:44:30.402053 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8b05f8ca7479467c89608bbc825ffb49,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sjpc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5854486f4f-qw8k4_calico-system(c1fa9bf4-cb97-4583-82c6-dc2a04ecc620): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:30.404243 containerd[1769]: time="2025-12-12T18:44:30.404192710Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 18:44:30.739381 containerd[1769]: time="2025-12-12T18:44:30.739234973Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:30.741373 containerd[1769]: time="2025-12-12T18:44:30.741336077Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 18:44:30.741444 containerd[1769]: time="2025-12-12T18:44:30.741383840Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 18:44:30.741634 kubelet[3062]: E1212 18:44:30.741596 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:44:30.741688 kubelet[3062]: E1212 18:44:30.741649 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:44:30.741803 kubelet[3062]: E1212 18:44:30.741770 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sjpc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5854486f4f-qw8k4_calico-system(c1fa9bf4-cb97-4583-82c6-dc2a04ecc620): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:30.743270 kubelet[3062]: E1212 18:44:30.743223 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:44:34.068540 containerd[1769]: time="2025-12-12T18:44:34.068480268Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 18:44:34.395570 containerd[1769]: time="2025-12-12T18:44:34.395508568Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 
18:44:34.397466 containerd[1769]: time="2025-12-12T18:44:34.397410361Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 18:44:34.397531 containerd[1769]: time="2025-12-12T18:44:34.397498114Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 18:44:34.397664 kubelet[3062]: E1212 18:44:34.397617 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:44:34.397930 kubelet[3062]: E1212 18:44:34.397669 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:44:34.397955 kubelet[3062]: E1212 18:44:34.397928 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8zv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pls24_calico-system(541d8bd6-57ea-4711-86e4-5819a7795d8f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:34.398063 containerd[1769]: time="2025-12-12T18:44:34.397986024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 18:44:34.740715 containerd[1769]: time="2025-12-12T18:44:34.740447494Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:34.743050 containerd[1769]: time="2025-12-12T18:44:34.742956667Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 18:44:34.743050 containerd[1769]: time="2025-12-12T18:44:34.742985387Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 18:44:34.743772 kubelet[3062]: E1212 18:44:34.743206 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:44:34.743772 kubelet[3062]: E1212 18:44:34.743758 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:44:34.744403 kubelet[3062]: E1212 18:44:34.743986 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wccvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-pcfdn_calico-system(77f6fa4e-0cea-4d83-8d55-8707dfa4d505): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:34.744561 containerd[1769]: time="2025-12-12T18:44:34.744041200Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 18:44:34.745150 kubelet[3062]: E1212 18:44:34.745124 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:44:35.079320 containerd[1769]: time="2025-12-12T18:44:35.079199862Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:35.081138 containerd[1769]: time="2025-12-12T18:44:35.081092193Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 18:44:35.081209 containerd[1769]: time="2025-12-12T18:44:35.081141160Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 18:44:35.081387 kubelet[3062]: E1212 18:44:35.081349 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:44:35.081448 kubelet[3062]: E1212 18:44:35.081396 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc 
= failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:44:35.081600 kubelet[3062]: E1212 18:44:35.081565 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8zv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pls24_calico-system(541d8bd6-57ea-4711-86e4-5819a7795d8f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:35.081914 containerd[1769]: time="2025-12-12T18:44:35.081884322Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 18:44:35.083029 kubelet[3062]: E1212 18:44:35.082990 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:44:35.422574 containerd[1769]: time="2025-12-12T18:44:35.422499559Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:35.424589 containerd[1769]: time="2025-12-12T18:44:35.424526083Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 18:44:35.424681 containerd[1769]: time="2025-12-12T18:44:35.424599161Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 18:44:35.424782 kubelet[3062]: E1212 18:44:35.424742 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:44:35.425039 kubelet[3062]: E1212 18:44:35.424789 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:44:35.425039 kubelet[3062]: E1212 18:44:35.424940 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ldgzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6678f769fb-bpdwv_calico-system(0d411828-ce18-4a86-9fa9-09a812dd1345): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:35.426156 kubelet[3062]: E1212 18:44:35.426115 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:44:37.068233 containerd[1769]: time="2025-12-12T18:44:37.068043895Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:44:37.394097 containerd[1769]: time="2025-12-12T18:44:37.394034574Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:37.396175 containerd[1769]: time="2025-12-12T18:44:37.396135053Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:44:37.396254 containerd[1769]: time="2025-12-12T18:44:37.396225746Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:44:37.396446 kubelet[3062]: E1212 18:44:37.396392 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:44:37.396728 kubelet[3062]: E1212 18:44:37.396453 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:44:37.396728 kubelet[3062]: E1212 18:44:37.396585 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-278rl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-85cb55bbfc-bnqwj_calico-apiserver(590a3c9e-ec4d-4eb7-8a94-169742d9929c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:37.397856 kubelet[3062]: E1212 18:44:37.397796 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:44:39.068395 containerd[1769]: time="2025-12-12T18:44:39.068334367Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:44:39.413306 containerd[1769]: time="2025-12-12T18:44:39.413250241Z" level=info 
msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:39.415036 containerd[1769]: time="2025-12-12T18:44:39.414981707Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:44:39.415119 containerd[1769]: time="2025-12-12T18:44:39.415054838Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:44:39.415233 kubelet[3062]: E1212 18:44:39.415190 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:44:39.415496 kubelet[3062]: E1212 18:44:39.415239 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:44:39.415496 kubelet[3062]: E1212 18:44:39.415364 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nch2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-85cb55bbfc-m7vzj_calico-apiserver(20d32882-7b09-425d-879e-55aa1742ac4a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:39.416647 kubelet[3062]: E1212 18:44:39.416607 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:44:45.068881 kubelet[3062]: E1212 18:44:45.068834 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:44:46.069383 kubelet[3062]: E1212 18:44:46.069033 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:44:47.068403 kubelet[3062]: E1212 18:44:47.068358 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:44:48.068184 kubelet[3062]: E1212 18:44:48.068103 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:44:48.068184 kubelet[3062]: E1212 18:44:48.068106 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:44:51.067847 kubelet[3062]: E1212 18:44:51.067807 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:44:56.068686 containerd[1769]: time="2025-12-12T18:44:56.068611870Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 18:44:56.399392 containerd[1769]: time="2025-12-12T18:44:56.399334651Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:56.402381 containerd[1769]: time="2025-12-12T18:44:56.402340037Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 18:44:56.402459 containerd[1769]: time="2025-12-12T18:44:56.402422430Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 18:44:56.402734 kubelet[3062]: E1212 18:44:56.402660 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:44:56.403101 kubelet[3062]: E1212 18:44:56.402931 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:44:56.403101 kubelet[3062]: E1212 18:44:56.403048 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8b05f8ca7479467c89608bbc825ffb49,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sjpc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5854486f4f-qw8k4_calico-system(c1fa9bf4-cb97-4583-82c6-dc2a04ecc620): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:56.405642 containerd[1769]: time="2025-12-12T18:44:56.405613251Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 18:44:56.735389 containerd[1769]: time="2025-12-12T18:44:56.735254017Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:56.737353 containerd[1769]: time="2025-12-12T18:44:56.737299624Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 18:44:56.737486 containerd[1769]: time="2025-12-12T18:44:56.737331143Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 18:44:56.737619 kubelet[3062]: E1212 18:44:56.737577 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:44:56.737665 kubelet[3062]: E1212 18:44:56.737638 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:44:56.738119 kubelet[3062]: E1212 18:44:56.737769 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sjpc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5854486f4f-qw8k4_calico-system(c1fa9bf4-cb97-4583-82c6-dc2a04ecc620): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:56.739024 kubelet[3062]: E1212 18:44:56.738963 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:44:59.067565 containerd[1769]: time="2025-12-12T18:44:59.067495545Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:44:59.405714 containerd[1769]: time="2025-12-12T18:44:59.405653364Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:59.407646 containerd[1769]: time="2025-12-12T18:44:59.407595118Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:44:59.407719 containerd[1769]: time="2025-12-12T18:44:59.407671079Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:44:59.407892 kubelet[3062]: E1212 18:44:59.407855 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:44:59.408369 kubelet[3062]: E1212 18:44:59.408221 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:44:59.408947 kubelet[3062]: E1212 18:44:59.408461 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-278rl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-85cb55bbfc-bnqwj_calico-apiserver(590a3c9e-ec4d-4eb7-8a94-169742d9929c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:59.409106 containerd[1769]: time="2025-12-12T18:44:59.408863965Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 18:44:59.409923 kubelet[3062]: E1212 18:44:59.409870 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:44:59.748830 containerd[1769]: time="2025-12-12T18:44:59.748612572Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:44:59.750796 containerd[1769]: time="2025-12-12T18:44:59.750679479Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 18:44:59.750796 containerd[1769]: time="2025-12-12T18:44:59.750767964Z" level=info msg="stop 
pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 18:44:59.752078 kubelet[3062]: E1212 18:44:59.751007 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:44:59.752078 kubelet[3062]: E1212 18:44:59.751053 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:44:59.752078 kubelet[3062]: E1212 18:44:59.751295 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8zv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pls24_calico-system(541d8bd6-57ea-4711-86e4-5819a7795d8f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 18:44:59.752252 containerd[1769]: time="2025-12-12T18:44:59.751340343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 18:45:00.080184 containerd[1769]: time="2025-12-12T18:45:00.079877152Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:45:00.082565 containerd[1769]: time="2025-12-12T18:45:00.082512270Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 18:45:00.082660 containerd[1769]: time="2025-12-12T18:45:00.082609784Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 18:45:00.082808 kubelet[3062]: E1212 18:45:00.082763 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:45:00.082852 kubelet[3062]: E1212 18:45:00.082817 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:45:00.083106 kubelet[3062]: E1212 18:45:00.083054 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wccvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-pcfdn_calico-system(77f6fa4e-0cea-4d83-8d55-8707dfa4d505): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 18:45:00.083220 containerd[1769]: time="2025-12-12T18:45:00.083194214Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 18:45:00.084908 kubelet[3062]: E1212 18:45:00.084877 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:45:00.419633 containerd[1769]: time="2025-12-12T18:45:00.419556188Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:45:00.424583 containerd[1769]: time="2025-12-12T18:45:00.424497566Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 18:45:00.424714 containerd[1769]: time="2025-12-12T18:45:00.424565209Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 18:45:00.424843 kubelet[3062]: E1212 18:45:00.424787 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:45:00.425163 kubelet[3062]: E1212 18:45:00.424851 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:45:00.425163 kubelet[3062]: E1212 18:45:00.424990 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8zv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pls24_calico-system(541d8bd6-57ea-4711-86e4-5819a7795d8f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 18:45:00.426244 kubelet[3062]: E1212 18:45:00.426190 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:45:03.068396 containerd[1769]: time="2025-12-12T18:45:03.068353128Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 18:45:03.414998 containerd[1769]: 
time="2025-12-12T18:45:03.414945872Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:45:03.417668 containerd[1769]: time="2025-12-12T18:45:03.417586922Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 18:45:03.417668 containerd[1769]: time="2025-12-12T18:45:03.417646330Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 18:45:03.417942 kubelet[3062]: E1212 18:45:03.417866 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:45:03.417942 kubelet[3062]: E1212 18:45:03.417927 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:45:03.418378 kubelet[3062]: E1212 18:45:03.418154 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ldgzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6678f769fb-bpdwv_calico-system(0d411828-ce18-4a86-9fa9-09a812dd1345): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 18:45:03.419400 kubelet[3062]: E1212 18:45:03.419310 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:45:04.070268 containerd[1769]: time="2025-12-12T18:45:04.070188370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:45:04.407038 containerd[1769]: time="2025-12-12T18:45:04.406875536Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:45:04.409335 containerd[1769]: time="2025-12-12T18:45:04.409281911Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:45:04.409428 containerd[1769]: time="2025-12-12T18:45:04.409370951Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:45:04.409564 kubelet[3062]: E1212 18:45:04.409508 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:45:04.409606 kubelet[3062]: E1212 18:45:04.409570 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:45:04.409736 kubelet[3062]: 
E1212 18:45:04.409704 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nch2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-85cb55bbfc-m7vzj_calico-apiserver(20d32882-7b09-425d-879e-55aa1742ac4a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:45:04.410881 kubelet[3062]: E1212 18:45:04.410849 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:45:11.068089 kubelet[3062]: E1212 18:45:11.068019 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not 
found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:45:11.068731 kubelet[3062]: E1212 18:45:11.068693 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:45:12.068790 kubelet[3062]: E1212 18:45:12.068713 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:45:13.069680 kubelet[3062]: E1212 18:45:13.069592 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:45:15.109916 systemd[1]: Started sshd@9-10.0.8.97:22-167.71.75.147:58608.service - OpenSSH per-connection server daemon (167.71.75.147:58608). Dec 12 18:45:15.183709 sshd[5507]: Connection closed by 167.71.75.147 port 58608 Dec 12 18:45:15.185138 systemd[1]: sshd@9-10.0.8.97:22-167.71.75.147:58608.service: Deactivated successfully. 
Dec 12 18:45:17.068403 kubelet[3062]: E1212 18:45:17.068355 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:45:19.068216 kubelet[3062]: E1212 18:45:19.068146 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:45:23.068651 kubelet[3062]: E1212 18:45:23.068603 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:45:24.067977 kubelet[3062]: E1212 18:45:24.067830 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:45:25.068153 kubelet[3062]: E1212 18:45:25.067731 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:45:26.068213 kubelet[3062]: E1212 18:45:26.068147 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:45:29.067254 kubelet[3062]: E1212 18:45:29.067210 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:45:32.068638 kubelet[3062]: E1212 18:45:32.068601 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:45:34.072780 kubelet[3062]: E1212 18:45:34.072407 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:45:35.069063 
kubelet[3062]: E1212 18:45:35.069024 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:45:36.069684 kubelet[3062]: E1212 18:45:36.069623 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:45:38.068863 kubelet[3062]: E1212 18:45:38.068572 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:45:42.072328 kubelet[3062]: E1212 18:45:42.072271 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:45:43.067835 kubelet[3062]: E1212 18:45:43.067781 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:45:45.068367 containerd[1769]: time="2025-12-12T18:45:45.068279687Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 18:45:45.401657 containerd[1769]: time="2025-12-12T18:45:45.401582149Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:45:45.404050 containerd[1769]: time="2025-12-12T18:45:45.403829338Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 18:45:45.404050 containerd[1769]: time="2025-12-12T18:45:45.403908934Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 18:45:45.404180 kubelet[3062]: E1212 18:45:45.404099 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:45:45.404180 kubelet[3062]: E1212 18:45:45.404153 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:45:45.404437 kubelet[3062]: E1212 18:45:45.404281 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8b05f8ca7479467c89608bbc825ffb49,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sjpc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5854486f4f-qw8k4_calico-system(c1fa9bf4-cb97-4583-82c6-dc2a04ecc620): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 18:45:45.406384 containerd[1769]: time="2025-12-12T18:45:45.406360583Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 18:45:45.753330 containerd[1769]: time="2025-12-12T18:45:45.753215703Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:45:45.755787 containerd[1769]: time="2025-12-12T18:45:45.755734506Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 18:45:45.755885 containerd[1769]: time="2025-12-12T18:45:45.755809648Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 18:45:45.756006 kubelet[3062]: E1212 18:45:45.755965 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:45:45.756054 kubelet[3062]: E1212 18:45:45.756020 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:45:45.757020 kubelet[3062]: E1212 18:45:45.756944 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sjpc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5854486f4f-qw8k4_calico-system(c1fa9bf4-cb97-4583-82c6-dc2a04ecc620): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 18:45:45.758166 kubelet[3062]: E1212 18:45:45.758119 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:45:49.070246 containerd[1769]: time="2025-12-12T18:45:49.070204424Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:45:49.409940 containerd[1769]: time="2025-12-12T18:45:49.409882891Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:45:49.411843 containerd[1769]: time="2025-12-12T18:45:49.411782005Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:45:49.411843 containerd[1769]: time="2025-12-12T18:45:49.411820367Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:45:49.412125 kubelet[3062]: E1212 18:45:49.412055 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:45:49.412806 kubelet[3062]: E1212 18:45:49.412134 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:45:49.412806 kubelet[3062]: E1212 18:45:49.412676 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-278rl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-85cb55bbfc-bnqwj_calico-apiserver(590a3c9e-ec4d-4eb7-8a94-169742d9929c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:45:49.412926 containerd[1769]: time="2025-12-12T18:45:49.412463412Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 18:45:49.414693 kubelet[3062]: E1212 18:45:49.414635 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:45:49.746745 containerd[1769]: time="2025-12-12T18:45:49.746362757Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:45:49.748566 containerd[1769]: time="2025-12-12T18:45:49.748528367Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 18:45:49.748622 containerd[1769]: time="2025-12-12T18:45:49.748558983Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 18:45:49.748844 kubelet[3062]: E1212 18:45:49.748759 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:45:49.748844 kubelet[3062]: E1212 18:45:49.748815 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:45:49.748977 kubelet[3062]: E1212 18:45:49.748942 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8zv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pls24_calico-system(541d8bd6-57ea-4711-86e4-5819a7795d8f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 18:45:49.750896 containerd[1769]: time="2025-12-12T18:45:49.750862301Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 18:45:50.074453 containerd[1769]: time="2025-12-12T18:45:50.074336757Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:45:50.076595 containerd[1769]: time="2025-12-12T18:45:50.076536126Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 18:45:50.076681 containerd[1769]: time="2025-12-12T18:45:50.076618286Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 18:45:50.076749 kubelet[3062]: E1212 18:45:50.076700 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:45:50.076749 kubelet[3062]: E1212 18:45:50.076744 3062 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:45:50.076979 kubelet[3062]: E1212 18:45:50.076938 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8zv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pls24_calico-system(541d8bd6-57ea-4711-86e4-5819a7795d8f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 18:45:50.077079 containerd[1769]: time="2025-12-12T18:45:50.076994477Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 18:45:50.078301 kubelet[3062]: E1212 18:45:50.078268 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:45:50.409384 containerd[1769]: time="2025-12-12T18:45:50.409332460Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:45:50.411440 containerd[1769]: time="2025-12-12T18:45:50.411389168Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 18:45:50.411529 containerd[1769]: time="2025-12-12T18:45:50.411481211Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 18:45:50.411685 kubelet[3062]: E1212 18:45:50.411644 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:45:50.411731 kubelet[3062]: E1212 18:45:50.411699 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:45:50.412083 kubelet[3062]: E1212 18:45:50.411874 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wccvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-pcfdn_calico-system(77f6fa4e-0cea-4d83-8d55-8707dfa4d505): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 18:45:50.413353 kubelet[3062]: E1212 18:45:50.413323 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:45:50.757566 update_engine[1754]: I20251212 18:45:50.757408 1754 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Dec 12 18:45:50.757566 update_engine[1754]: I20251212 18:45:50.757468 1754 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Dec 12 18:45:50.758716 update_engine[1754]: I20251212 18:45:50.758675 1754 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Dec 12 18:45:50.759141 update_engine[1754]: I20251212 18:45:50.759120 1754 omaha_request_params.cc:62] Current group set to stable Dec 12 18:45:50.759254 update_engine[1754]: I20251212 18:45:50.759238 1754 update_attempter.cc:499] Already updated boot flags. Skipping. Dec 12 18:45:50.759254 update_engine[1754]: I20251212 18:45:50.759249 1754 update_attempter.cc:643] Scheduling an action processor start. 
Dec 12 18:45:50.759296 update_engine[1754]: I20251212 18:45:50.759270 1754 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 12 18:45:50.759315 update_engine[1754]: I20251212 18:45:50.759308 1754 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Dec 12 18:45:50.759384 update_engine[1754]: I20251212 18:45:50.759366 1754 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 12 18:45:50.759384 update_engine[1754]: I20251212 18:45:50.759380 1754 omaha_request_action.cc:272] Request: Dec 12 18:45:50.759384 update_engine[1754]: [Omaha request XML body stripped from the captured log] Dec 12 18:45:50.759571 update_engine[1754]: I20251212 18:45:50.759387 1754 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 18:45:50.760091 locksmithd[1800]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Dec 12 18:45:50.760336 update_engine[1754]: I20251212 18:45:50.760313 1754 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 18:45:50.760873 update_engine[1754]: I20251212 18:45:50.760852 1754 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 12 18:45:50.767078 update_engine[1754]: E20251212 18:45:50.766659 1754 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 18:45:50.767078 update_engine[1754]: I20251212 18:45:50.766742 1754 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Dec 12 18:45:53.854463 systemd[1]: Started sshd@10-10.0.8.97:22-147.75.109.163:38974.service - OpenSSH per-connection server daemon (147.75.109.163:38974).
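
The update_engine block above is unrelated to the image pulls: the Omaha request is being posted to the literal host "disabled" (consistent with SERVER=disabled in /etc/flatcar/update.conf, the usual way to turn update checks off), so curl's DNS lookup fails and the fetcher schedules retries roughly ten seconds apart. A minimal sketch of that fetch-and-retry loop, with the placeholder host and the spacing taken from the log:

    // omahaprobe.go — reproduce the fetch/retry loop in the update_engine
    // lines above. The URL host is the literal "disabled", exactly what
    // the engine reports posting to; the DNS failure is the point.
    package main

    import (
    	"fmt"
    	"net/http"
    	"time"
    )

    func main() {
    	const url = "http://disabled/" // placeholder host, as in the log
    	for attempt := 1; attempt <= 3; attempt++ {
    		resp, err := http.Get(url)
    		if err != nil {
    			fmt.Printf("no HTTP response, retry %d: %v\n", attempt, err)
    			time.Sleep(10 * time.Second) // log shows ~10 s between retries
    			continue
    		}
    		resp.Body.Close()
    		fmt.Println("status:", resp.Status)
    		return
    	}
    }

The retries at 18:45:50, 18:46:00, and 18:46:10 below are this loop ticking; locksmithd just reflects the check in progress.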
Dec 12 18:45:54.068499 containerd[1769]: time="2025-12-12T18:45:54.068454006Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 18:45:54.401484 containerd[1769]: time="2025-12-12T18:45:54.401343941Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:45:54.403246 containerd[1769]: time="2025-12-12T18:45:54.403190057Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 18:45:54.403342 containerd[1769]: time="2025-12-12T18:45:54.403190596Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 18:45:54.403496 kubelet[3062]: E1212 18:45:54.403455 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:45:54.404097 kubelet[3062]: E1212 18:45:54.403517 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:45:54.404097 kubelet[3062]: E1212 18:45:54.403758 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ldgzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6678f769fb-bpdwv_calico-system(0d411828-ce18-4a86-9fa9-09a812dd1345): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 18:45:54.404219 containerd[1769]: time="2025-12-12T18:45:54.403801047Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:45:54.405142 kubelet[3062]: E1212 18:45:54.405117 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:45:54.737709 containerd[1769]: time="2025-12-12T18:45:54.737477555Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:45:54.739424 containerd[1769]: time="2025-12-12T18:45:54.739327469Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:45:54.739424 containerd[1769]: time="2025-12-12T18:45:54.739414432Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:45:54.739587 kubelet[3062]: E1212 18:45:54.739547 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:45:54.739628 kubelet[3062]: E1212 18:45:54.739602 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:45:54.740229 kubelet[3062]: E1212 18:45:54.740184 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nch2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-85cb55bbfc-m7vzj_calico-apiserver(20d32882-7b09-425d-879e-55aa1742ac4a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:45:54.741743 kubelet[3062]: E1212 18:45:54.741699 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:45:54.816595 sshd[5582]: Accepted publickey for core from 147.75.109.163 port 38974 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:45:54.817898 sshd-session[5582]: pam_unix(sshd:session): session opened for user 
core(uid=500) by core(uid=0) Dec 12 18:45:54.822544 systemd-logind[1749]: New session 10 of user core. Dec 12 18:45:54.843424 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 12 18:45:55.550033 sshd[5585]: Connection closed by 147.75.109.163 port 38974 Dec 12 18:45:55.550421 sshd-session[5582]: pam_unix(sshd:session): session closed for user core Dec 12 18:45:55.554133 systemd[1]: sshd@10-10.0.8.97:22-147.75.109.163:38974.service: Deactivated successfully. Dec 12 18:45:55.555778 systemd[1]: session-10.scope: Deactivated successfully. Dec 12 18:45:55.556439 systemd-logind[1749]: Session 10 logged out. Waiting for processes to exit. Dec 12 18:45:55.557305 systemd-logind[1749]: Removed session 10. Dec 12 18:45:57.070084 kubelet[3062]: E1212 18:45:57.069610 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:46:00.716895 systemd[1]: Started sshd@11-10.0.8.97:22-147.75.109.163:38976.service - OpenSSH per-connection server daemon (147.75.109.163:38976). Dec 12 18:46:00.761403 update_engine[1754]: I20251212 18:46:00.761327 1754 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 18:46:00.761746 update_engine[1754]: I20251212 18:46:00.761428 1754 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 18:46:00.761770 update_engine[1754]: I20251212 18:46:00.761754 1754 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 12 18:46:00.768011 update_engine[1754]: E20251212 18:46:00.767957 1754 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 18:46:00.768104 update_engine[1754]: I20251212 18:46:00.768040 1754 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Dec 12 18:46:01.068143 kubelet[3062]: E1212 18:46:01.068014 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:46:01.674733 sshd[5616]: Accepted publickey for core from 147.75.109.163 port 38976 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:46:01.676152 sshd-session[5616]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:46:01.680352 systemd-logind[1749]: New session 11 of user core. Dec 12 18:46:01.697321 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 12 18:46:02.392977 sshd[5619]: Connection closed by 147.75.109.163 port 38976 Dec 12 18:46:02.393301 sshd-session[5616]: pam_unix(sshd:session): session closed for user core Dec 12 18:46:02.396537 systemd[1]: sshd@11-10.0.8.97:22-147.75.109.163:38976.service: Deactivated successfully. Dec 12 18:46:02.398116 systemd[1]: session-11.scope: Deactivated successfully. Dec 12 18:46:02.398710 systemd-logind[1749]: Session 11 logged out. Waiting for processes to exit. Dec 12 18:46:02.399620 systemd-logind[1749]: Removed session 11. 
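
All of the 404s above come from the same place: the tag v3.30.4 simply does not exist under ghcr.io/flatcar/calico/*. That can be confirmed outside the kubelet loop with a direct OCI distribution query. A sketch, assuming GHCR's standard anonymous Docker Registry v2 token flow for public images (the token endpoint, the "token" JSON field, and the repository name here are that convention's shape, not taken from this log):

    // tagcheck.go — ask ghcr.io directly whether a manifest tag exists.
    package main

    import (
    	"encoding/json"
    	"fmt"
    	"net/http"
    	"os"
    )

    func main() {
    	repo, tag := "flatcar/calico/apiserver", "v3.30.4" // one failing ref above
    	// 1. Anonymous pull token (Docker Registry v2 token flow); for a
    	//    private or nonexistent repository GHCR may deny this instead.
    	tr, err := http.Get("https://ghcr.io/token?scope=repository:" + repo + ":pull")
    	if err != nil {
    		fmt.Fprintln(os.Stderr, err)
    		os.Exit(1)
    	}
    	defer tr.Body.Close()
    	var tok struct {
    		Token string `json:"token"`
    	}
    	if err := json.NewDecoder(tr.Body).Decode(&tok); err != nil {
    		fmt.Fprintln(os.Stderr, err)
    		os.Exit(1)
    	}
    	// 2. HEAD the manifest; 200 means the tag exists, 404 matches the log.
    	req, _ := http.NewRequest(http.MethodHead,
    		"https://ghcr.io/v2/"+repo+"/manifests/"+tag, nil)
    	req.Header.Set("Authorization", "Bearer "+tok.Token)
    	req.Header.Set("Accept", "application/vnd.oci.image.index.v1+json")
    	resp, err := http.DefaultClient.Do(req)
    	if err != nil {
    		fmt.Fprintln(os.Stderr, err)
    		os.Exit(1)
    	}
    	resp.Body.Close()
    	fmt.Println(resp.Status) // expect "404 Not Found" for this tag
    }

A 404 here means the fix lives on the publishing side — push or re-tag the images, or point the pod specs at a tag that exists — not on the node.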
Dec 12 18:46:04.070533 kubelet[3062]: E1212 18:46:04.070419 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:46:04.070944 kubelet[3062]: E1212 18:46:04.070650 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:46:07.067841 kubelet[3062]: E1212 18:46:07.067777 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:46:07.566161 systemd[1]: Started sshd@12-10.0.8.97:22-147.75.109.163:51544.service - OpenSSH per-connection server daemon (147.75.109.163:51544). Dec 12 18:46:07.567153 systemd[1]: Started sshd@13-10.0.8.97:22-167.71.75.147:36010.service - OpenSSH per-connection server daemon (167.71.75.147:36010). 
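
The SSH traffic interleaved with these errors splits into two kinds: short publickey-authenticated sessions for user core from 147.75.109.163, each lasting about a second, and unauthenticated probes from 167.71.75.147 — the connection sshd@13 just accepted above closes preauth below with an invalid user. A sketch that separates the two in a journal dump, keying on the sshd wording visible here:

    // sshdscan.go — split sshd events into accepted logins and probes.
    package main

    import (
    	"bufio"
    	"fmt"
    	"os"
    	"strings"
    )

    func main() {
    	var logins, probes int
    	sc := bufio.NewScanner(os.Stdin)
    	sc.Buffer(make([]byte, 0, 1<<20), 1<<20)
    	for sc.Scan() {
    		line := sc.Text()
    		switch {
    		case strings.Contains(line, "Accepted publickey for"):
    			logins++
    		case strings.Contains(line, "Invalid user"):
    			// one "Invalid user ... from <addr>" line per probe
    			probes++
    		}
    	}
    	fmt.Printf("accepted logins: %d, preauth probes: %d\n", logins, probes)
    }

Against this window it would report the session-10 through session-14 logins and the single admin probe from 167.71.75.147.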
Dec 12 18:46:08.069977 kubelet[3062]: E1212 18:46:08.069919 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:46:08.524962 sshd[5640]: Accepted publickey for core from 147.75.109.163 port 51544 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:46:08.528626 sshd-session[5640]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:46:08.534367 systemd-logind[1749]: New session 12 of user core. Dec 12 18:46:08.545516 systemd[1]: Started session-12.scope - Session 12 of User core. Dec 12 18:46:08.545884 sshd[5641]: Invalid user admin from 167.71.75.147 port 36010 Dec 12 18:46:08.842325 sshd[5641]: Connection closed by invalid user admin 167.71.75.147 port 36010 [preauth] Dec 12 18:46:08.844368 systemd[1]: sshd@13-10.0.8.97:22-167.71.75.147:36010.service: Deactivated successfully. Dec 12 18:46:09.242086 sshd[5646]: Connection closed by 147.75.109.163 port 51544 Dec 12 18:46:09.242510 sshd-session[5640]: pam_unix(sshd:session): session closed for user core Dec 12 18:46:09.245508 systemd[1]: sshd@12-10.0.8.97:22-147.75.109.163:51544.service: Deactivated successfully. Dec 12 18:46:09.247433 systemd[1]: session-12.scope: Deactivated successfully. Dec 12 18:46:09.248772 systemd-logind[1749]: Session 12 logged out. Waiting for processes to exit. Dec 12 18:46:09.249745 systemd-logind[1749]: Removed session 12. Dec 12 18:46:09.425733 systemd[1]: Started sshd@14-10.0.8.97:22-147.75.109.163:51546.service - OpenSSH per-connection server daemon (147.75.109.163:51546). Dec 12 18:46:10.068575 kubelet[3062]: E1212 18:46:10.067295 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:46:10.461188 sshd[5666]: Accepted publickey for core from 147.75.109.163 port 51546 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:46:10.462412 sshd-session[5666]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:46:10.466927 systemd-logind[1749]: New session 13 of user core. 
Dec 12 18:46:10.487306 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 12 18:46:10.756560 update_engine[1754]: I20251212 18:46:10.756392 1754 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 18:46:10.756560 update_engine[1754]: I20251212 18:46:10.756489 1754 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 18:46:10.756904 update_engine[1754]: I20251212 18:46:10.756834 1754 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 12 18:46:10.764263 update_engine[1754]: E20251212 18:46:10.764200 1754 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 18:46:10.764403 update_engine[1754]: I20251212 18:46:10.764300 1754 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Dec 12 18:46:11.261536 sshd[5669]: Connection closed by 147.75.109.163 port 51546 Dec 12 18:46:11.262119 sshd-session[5666]: pam_unix(sshd:session): session closed for user core Dec 12 18:46:11.265755 systemd[1]: sshd@14-10.0.8.97:22-147.75.109.163:51546.service: Deactivated successfully. Dec 12 18:46:11.267444 systemd[1]: session-13.scope: Deactivated successfully. Dec 12 18:46:11.268050 systemd-logind[1749]: Session 13 logged out. Waiting for processes to exit. Dec 12 18:46:11.268991 systemd-logind[1749]: Removed session 13. Dec 12 18:46:11.429233 systemd[1]: Started sshd@15-10.0.8.97:22-147.75.109.163:51562.service - OpenSSH per-connection server daemon (147.75.109.163:51562). Dec 12 18:46:12.412753 sshd[5688]: Accepted publickey for core from 147.75.109.163 port 51562 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:46:12.414206 sshd-session[5688]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:46:12.418975 systemd-logind[1749]: New session 14 of user core. Dec 12 18:46:12.433280 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 12 18:46:13.142919 sshd[5691]: Connection closed by 147.75.109.163 port 51562 Dec 12 18:46:13.143564 sshd-session[5688]: pam_unix(sshd:session): session closed for user core Dec 12 18:46:13.146938 systemd[1]: sshd@15-10.0.8.97:22-147.75.109.163:51562.service: Deactivated successfully. Dec 12 18:46:13.148630 systemd[1]: session-14.scope: Deactivated successfully. Dec 12 18:46:13.149252 systemd-logind[1749]: Session 14 logged out. Waiting for processes to exit. Dec 12 18:46:13.150129 systemd-logind[1749]: Removed session 14. 
Dec 12 18:46:15.067607 kubelet[3062]: E1212 18:46:15.067538 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:46:17.070383 kubelet[3062]: E1212 18:46:17.070320 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:46:18.329320 systemd[1]: Started sshd@16-10.0.8.97:22-147.75.109.163:37396.service - OpenSSH per-connection server daemon (147.75.109.163:37396). Dec 12 18:46:19.067359 kubelet[3062]: E1212 18:46:19.067316 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:46:19.370838 sshd[5737]: Accepted publickey for core from 147.75.109.163 port 37396 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:46:19.372093 sshd-session[5737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:46:19.376312 systemd-logind[1749]: New session 15 of user core. Dec 12 18:46:19.391899 systemd[1]: Started session-15.scope - Session 15 of User core. 
Dec 12 18:46:20.069715 kubelet[3062]: E1212 18:46:20.069652 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:46:20.146274 sshd[5742]: Connection closed by 147.75.109.163 port 37396 Dec 12 18:46:20.146693 sshd-session[5737]: pam_unix(sshd:session): session closed for user core Dec 12 18:46:20.149983 systemd[1]: sshd@16-10.0.8.97:22-147.75.109.163:37396.service: Deactivated successfully. Dec 12 18:46:20.151778 systemd[1]: session-15.scope: Deactivated successfully. Dec 12 18:46:20.153580 systemd-logind[1749]: Session 15 logged out. Waiting for processes to exit. Dec 12 18:46:20.154552 systemd-logind[1749]: Removed session 15. Dec 12 18:46:20.316619 systemd[1]: Started sshd@17-10.0.8.97:22-147.75.109.163:37410.service - OpenSSH per-connection server daemon (147.75.109.163:37410). Dec 12 18:46:20.757150 update_engine[1754]: I20251212 18:46:20.756739 1754 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 18:46:20.757150 update_engine[1754]: I20251212 18:46:20.756839 1754 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 18:46:20.757528 update_engine[1754]: I20251212 18:46:20.757191 1754 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 12 18:46:20.764535 update_engine[1754]: E20251212 18:46:20.764479 1754 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 18:46:20.764664 update_engine[1754]: I20251212 18:46:20.764565 1754 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 12 18:46:20.764664 update_engine[1754]: I20251212 18:46:20.764573 1754 omaha_request_action.cc:617] Omaha request response: Dec 12 18:46:20.764664 update_engine[1754]: E20251212 18:46:20.764647 1754 omaha_request_action.cc:636] Omaha request network transfer failed. Dec 12 18:46:20.764724 update_engine[1754]: I20251212 18:46:20.764664 1754 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Dec 12 18:46:20.764724 update_engine[1754]: I20251212 18:46:20.764669 1754 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 12 18:46:20.764724 update_engine[1754]: I20251212 18:46:20.764673 1754 update_attempter.cc:306] Processing Done. Dec 12 18:46:20.764724 update_engine[1754]: E20251212 18:46:20.764688 1754 update_attempter.cc:619] Update failed. Dec 12 18:46:20.764724 update_engine[1754]: I20251212 18:46:20.764691 1754 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Dec 12 18:46:20.764724 update_engine[1754]: I20251212 18:46:20.764696 1754 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Dec 12 18:46:20.764724 update_engine[1754]: I20251212 18:46:20.764700 1754 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Dec 12 18:46:20.764847 update_engine[1754]: I20251212 18:46:20.764763 1754 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 12 18:46:20.764847 update_engine[1754]: I20251212 18:46:20.764783 1754 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 12 18:46:20.764847 update_engine[1754]: I20251212 18:46:20.764787 1754 omaha_request_action.cc:272] Request: Dec 12 18:46:20.764847 update_engine[1754]: [Omaha request XML body stripped in capture] Dec 12 18:46:20.764847 update_engine[1754]: I20251212 18:46:20.764793 1754 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 18:46:20.764847 update_engine[1754]: I20251212 18:46:20.764811 1754 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 18:46:20.765087 update_engine[1754]: I20251212 18:46:20.765048 1754 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 12 18:46:20.765114 locksmithd[1800]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Dec 12 18:46:20.772199 update_engine[1754]: E20251212 18:46:20.772146 1754 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 18:46:20.772303 update_engine[1754]: I20251212 18:46:20.772233 1754 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 12 18:46:20.772303 update_engine[1754]: I20251212 18:46:20.772240 1754 omaha_request_action.cc:617] Omaha request response: Dec 12 18:46:20.772303 update_engine[1754]: I20251212 18:46:20.772248 1754 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 12 18:46:20.772303 update_engine[1754]: I20251212 18:46:20.772251 1754 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 12 18:46:20.772303 update_engine[1754]: I20251212 18:46:20.772256 1754 update_attempter.cc:306] Processing Done. Dec 12 18:46:20.772303 update_engine[1754]: I20251212 18:46:20.772271 1754 update_attempter.cc:310] Error event sent.
Dec 12 18:46:20.772303 update_engine[1754]: I20251212 18:46:20.772279 1754 update_check_scheduler.cc:74] Next update check in 47m10s Dec 12 18:46:20.772694 locksmithd[1800]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Dec 12 18:46:21.068118 kubelet[3062]: E1212 18:46:21.067976 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:46:21.068760 kubelet[3062]: E1212 18:46:21.068707 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:46:21.306527 sshd[5761]: Accepted publickey for core from 147.75.109.163 port 37410 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:46:21.307902 sshd-session[5761]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:46:21.312863 systemd-logind[1749]: New session 16 of user core. Dec 12 18:46:21.326334 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 12 18:46:22.063277 sshd[5764]: Connection closed by 147.75.109.163 port 37410 Dec 12 18:46:22.063015 sshd-session[5761]: pam_unix(sshd:session): session closed for user core Dec 12 18:46:22.066279 systemd[1]: sshd@17-10.0.8.97:22-147.75.109.163:37410.service: Deactivated successfully. Dec 12 18:46:22.067925 systemd[1]: session-16.scope: Deactivated successfully. Dec 12 18:46:22.068547 systemd-logind[1749]: Session 16 logged out. Waiting for processes to exit. Dec 12 18:46:22.069580 systemd-logind[1749]: Removed session 16. Dec 12 18:46:22.234597 systemd[1]: Started sshd@18-10.0.8.97:22-147.75.109.163:37418.service - OpenSSH per-connection server daemon (147.75.109.163:37418). Dec 12 18:46:23.220607 sshd[5779]: Accepted publickey for core from 147.75.109.163 port 37418 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:46:23.221994 sshd-session[5779]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:46:23.225921 systemd-logind[1749]: New session 17 of user core. 
Dec 12 18:46:23.234308 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 12 18:46:24.242087 sshd[5782]: Connection closed by 147.75.109.163 port 37418 Dec 12 18:46:24.241956 sshd-session[5779]: pam_unix(sshd:session): session closed for user core Dec 12 18:46:24.245259 systemd[1]: sshd@18-10.0.8.97:22-147.75.109.163:37418.service: Deactivated successfully. Dec 12 18:46:24.246856 systemd[1]: session-17.scope: Deactivated successfully. Dec 12 18:46:24.247635 systemd-logind[1749]: Session 17 logged out. Waiting for processes to exit. Dec 12 18:46:24.248954 systemd-logind[1749]: Removed session 17. Dec 12 18:46:24.412787 systemd[1]: Started sshd@19-10.0.8.97:22-147.75.109.163:52416.service - OpenSSH per-connection server daemon (147.75.109.163:52416). Dec 12 18:46:25.382941 sshd[5805]: Accepted publickey for core from 147.75.109.163 port 52416 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:46:25.385319 sshd-session[5805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:46:25.389247 systemd-logind[1749]: New session 18 of user core. Dec 12 18:46:25.401299 systemd[1]: Started session-18.scope - Session 18 of User core. Dec 12 18:46:26.202471 sshd[5808]: Connection closed by 147.75.109.163 port 52416 Dec 12 18:46:26.202826 sshd-session[5805]: pam_unix(sshd:session): session closed for user core Dec 12 18:46:26.206951 systemd[1]: sshd@19-10.0.8.97:22-147.75.109.163:52416.service: Deactivated successfully. Dec 12 18:46:26.208637 systemd[1]: session-18.scope: Deactivated successfully. Dec 12 18:46:26.209916 systemd-logind[1749]: Session 18 logged out. Waiting for processes to exit. Dec 12 18:46:26.211085 systemd-logind[1749]: Removed session 18. Dec 12 18:46:26.374142 systemd[1]: Started sshd@20-10.0.8.97:22-147.75.109.163:52424.service - OpenSSH per-connection server daemon (147.75.109.163:52424). Dec 12 18:46:27.351946 sshd[5823]: Accepted publickey for core from 147.75.109.163 port 52424 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:46:27.353463 sshd-session[5823]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:46:27.358198 systemd-logind[1749]: New session 19 of user core. Dec 12 18:46:27.372357 systemd[1]: Started session-19.scope - Session 19 of User core. Dec 12 18:46:28.071338 sshd[5826]: Connection closed by 147.75.109.163 port 52424 Dec 12 18:46:28.071695 sshd-session[5823]: pam_unix(sshd:session): session closed for user core Dec 12 18:46:28.075211 systemd[1]: sshd@20-10.0.8.97:22-147.75.109.163:52424.service: Deactivated successfully. Dec 12 18:46:28.076981 systemd[1]: session-19.scope: Deactivated successfully. Dec 12 18:46:28.077611 systemd-logind[1749]: Session 19 logged out. Waiting for processes to exit. Dec 12 18:46:28.078431 systemd-logind[1749]: Removed session 19. 
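The update_engine burst at 18:46:20 above (ending in "Next update check in 47m10s" and locksmithd dropping back to UPDATE_STATUS_IDLE) is the signature of a host whose update server is set to the literal string "disabled": curl cannot resolve that hostname, the Omaha check fails after three retries, the error event it then tries to report also fails to send, and the whole cycle is rescheduled. On Flatcar this is the documented effect of SERVER=disabled in /etc/flatcar/update.conf, though that file is not shown in this log. A rough Python sketch of the loop, with illustrative timings rather than update_engine's actual code:

```python
# Sketch of the update_engine cycle logged at 18:46:20, assuming updates are
# switched off via SERVER=disabled in /etc/flatcar/update.conf (an assumption
# consistent with "Posting an Omaha request to disabled"). The hostname
# "disabled" never resolves, the check retries, an error event is reported,
# and the next check is scheduled with jitter.
import random
import socket

OMAHA_HOST = "disabled"   # from "Posting an Omaha request to disabled"
MAX_RETRIES = 3           # the log shows "retry 2" and "retry 3" before failing

def omaha_check() -> bool:
    """Return True if the Omaha host resolves; retry a few times if not."""
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            socket.getaddrinfo(OMAHA_HOST, 443)
            return True
        except socket.gaierror:
            print(f"Unable to resolve host: {OMAHA_HOST} (retry {attempt})")
    return False

if __name__ == "__main__":
    if not omaha_check():
        print("Omaha request network transfer failed; error event sent")
        # update_engine then reschedules, cf. "Next update check in 47m10s"
        # and locksmithd returning to UPDATE_STATUS_IDLE.
        secs = 45 * 60 + random.randint(0, 5 * 60)
        print(f"Next update check in {secs // 60}m{secs % 60}s")
```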
Dec 12 18:46:29.069342 kubelet[3062]: E1212 18:46:29.069283 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:46:30.068332 kubelet[3062]: E1212 18:46:30.068236 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:46:32.072698 kubelet[3062]: E1212 18:46:32.072639 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:46:33.246905 systemd[1]: Started sshd@21-10.0.8.97:22-147.75.109.163:33354.service - OpenSSH per-connection server daemon (147.75.109.163:33354). 
Dec 12 18:46:34.070178 kubelet[3062]: E1212 18:46:34.070122 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:46:34.070646 kubelet[3062]: E1212 18:46:34.070388 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:46:34.224088 sshd[5845]: Accepted publickey for core from 147.75.109.163 port 33354 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:46:34.225756 sshd-session[5845]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:46:34.231603 systemd-logind[1749]: New session 20 of user core. Dec 12 18:46:34.243403 systemd[1]: Started session-20.scope - Session 20 of User core. Dec 12 18:46:34.949372 sshd[5848]: Connection closed by 147.75.109.163 port 33354 Dec 12 18:46:34.949691 sshd-session[5845]: pam_unix(sshd:session): session closed for user core Dec 12 18:46:34.953086 systemd[1]: sshd@21-10.0.8.97:22-147.75.109.163:33354.service: Deactivated successfully. Dec 12 18:46:34.954700 systemd[1]: session-20.scope: Deactivated successfully. Dec 12 18:46:34.955360 systemd-logind[1749]: Session 20 logged out. Waiting for processes to exit. Dec 12 18:46:34.956115 systemd-logind[1749]: Removed session 20. Dec 12 18:46:35.068034 kubelet[3062]: E1212 18:46:35.067991 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:46:40.129483 systemd[1]: Started sshd@22-10.0.8.97:22-147.75.109.163:33364.service - OpenSSH per-connection server daemon (147.75.109.163:33364). 
Dec 12 18:46:41.067658 kubelet[3062]: E1212 18:46:41.067598 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:46:41.068148 kubelet[3062]: E1212 18:46:41.067844 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:46:41.178420 sshd[5865]: Accepted publickey for core from 147.75.109.163 port 33364 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:46:41.179563 sshd-session[5865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:46:41.184100 systemd-logind[1749]: New session 21 of user core. Dec 12 18:46:41.192291 systemd[1]: Started session-21.scope - Session 21 of User core. Dec 12 18:46:41.952248 sshd[5868]: Connection closed by 147.75.109.163 port 33364 Dec 12 18:46:41.952666 sshd-session[5865]: pam_unix(sshd:session): session closed for user core Dec 12 18:46:41.955959 systemd[1]: sshd@22-10.0.8.97:22-147.75.109.163:33364.service: Deactivated successfully. Dec 12 18:46:41.957650 systemd[1]: session-21.scope: Deactivated successfully. Dec 12 18:46:41.959439 systemd-logind[1749]: Session 21 logged out. Waiting for processes to exit. Dec 12 18:46:41.960445 systemd-logind[1749]: Removed session 21. 
Dec 12 18:46:45.069316 kubelet[3062]: E1212 18:46:45.069272 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:46:45.072784 kubelet[3062]: E1212 18:46:45.072738 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:46:47.127573 systemd[1]: Started sshd@23-10.0.8.97:22-147.75.109.163:34224.service - OpenSSH per-connection server daemon (147.75.109.163:34224). Dec 12 18:46:48.068255 kubelet[3062]: E1212 18:46:48.068188 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:46:48.115107 sshd[5915]: Accepted publickey for core from 147.75.109.163 port 34224 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:46:48.116584 sshd-session[5915]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:46:48.125485 systemd-logind[1749]: New session 22 of user core. Dec 12 18:46:48.138313 systemd[1]: Started session-22.scope - Session 22 of User core. Dec 12 18:46:48.849862 sshd[5918]: Connection closed by 147.75.109.163 port 34224 Dec 12 18:46:48.850291 sshd-session[5915]: pam_unix(sshd:session): session closed for user core Dec 12 18:46:48.853348 systemd[1]: sshd@23-10.0.8.97:22-147.75.109.163:34224.service: Deactivated successfully. Dec 12 18:46:48.855240 systemd[1]: session-22.scope: Deactivated successfully. Dec 12 18:46:48.856564 systemd-logind[1749]: Session 22 logged out. Waiting for processes to exit. Dec 12 18:46:48.857918 systemd-logind[1749]: Removed session 22. 
Dec 12 18:46:49.068247 kubelet[3062]: E1212 18:46:49.068197 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:46:54.020742 systemd[1]: Started sshd@24-10.0.8.97:22-147.75.109.163:42230.service - OpenSSH per-connection server daemon (147.75.109.163:42230). Dec 12 18:46:54.069831 kubelet[3062]: E1212 18:46:54.069785 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:46:55.001710 sshd[5943]: Accepted publickey for core from 147.75.109.163 port 42230 ssh2: RSA SHA256:zFX4bNqXIKDKZnJ173gM+qIyMrs2hUKPmae9E2loakI Dec 12 18:46:55.004187 sshd-session[5943]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:46:55.009185 systemd-logind[1749]: New session 23 of user core. Dec 12 18:46:55.019412 systemd[1]: Started session-23.scope - Session 23 of User core. Dec 12 18:46:55.068443 kubelet[3062]: E1212 18:46:55.068398 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:46:55.719856 sshd[5946]: Connection closed by 147.75.109.163 port 42230 Dec 12 18:46:55.720263 sshd-session[5943]: pam_unix(sshd:session): session closed for user core Dec 12 18:46:55.723339 systemd[1]: sshd@24-10.0.8.97:22-147.75.109.163:42230.service: Deactivated successfully. Dec 12 18:46:55.725020 systemd[1]: session-23.scope: Deactivated successfully. Dec 12 18:46:55.726249 systemd-logind[1749]: Session 23 logged out. Waiting for processes to exit. Dec 12 18:46:55.727087 systemd-logind[1749]: Removed session 23. 
Dec 12 18:46:57.067837 kubelet[3062]: E1212 18:46:57.067486 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:46:57.938121 systemd[1]: Started sshd@25-10.0.8.97:22-167.71.75.147:48528.service - OpenSSH per-connection server daemon (167.71.75.147:48528). Dec 12 18:46:58.069344 kubelet[3062]: E1212 18:46:58.069201 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:46:58.930287 sshd[5963]: Invalid user admin from 167.71.75.147 port 48528 Dec 12 18:46:59.058961 sshd[5963]: Connection closed by invalid user admin 167.71.75.147 port 48528 [preauth] Dec 12 18:46:59.061021 systemd[1]: sshd@25-10.0.8.97:22-167.71.75.147:48528.service: Deactivated successfully. 
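Alongside the legitimate core sessions from 147.75.109.163, this is the second pre-auth probe for a nonexistent admin account from 167.71.75.147 (the first was at 18:46:08). A throwaway sketch for tallying such probes per source address from a captured journal; the regex is tied to the exact "Invalid user NAME from IP port N" wording sshd emits here:

```python
# Tally sshd "Invalid user" probes per source IP from a saved journal dump.
# Sketch only: matches the exact form seen in this log; pipe the log in
# on stdin.
import re
import sys
from collections import Counter

INVALID = re.compile(r"sshd\[\d+\]: Invalid user (\S+) from (\S+) port \d+")

def tally(lines) -> Counter:
    hits = Counter()
    for line in lines:
        # findall copes with multiple log entries fused onto one line.
        for user, ip in INVALID.findall(line):
            hits[(ip, user)] += 1
    return hits

if __name__ == "__main__":
    for (ip, user), count in tally(sys.stdin).most_common():
        print(f"{count:4d}  {ip}  user={user}")
```

Fed this section, it would report two probes from 167.71.75.147 for user admin (ports 36010 and 48528).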
Dec 12 18:46:59.067901 kubelet[3062]: E1212 18:46:59.067869 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c" Dec 12 18:47:01.067376 kubelet[3062]: E1212 18:47:01.067332 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345" Dec 12 18:47:06.069162 kubelet[3062]: E1212 18:47:06.069116 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f" Dec 12 18:47:08.068099 kubelet[3062]: E1212 18:47:08.068037 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505" Dec 12 18:47:10.068636 containerd[1769]: time="2025-12-12T18:47:10.068593708Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 18:47:10.407397 containerd[1769]: time="2025-12-12T18:47:10.407326679Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:47:10.409424 containerd[1769]: time="2025-12-12T18:47:10.409344602Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 18:47:10.409535 containerd[1769]: time="2025-12-12T18:47:10.409443949Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 18:47:10.409661 kubelet[3062]: E1212 18:47:10.409609 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:47:10.410006 kubelet[3062]: E1212 18:47:10.409668 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:47:10.410006 kubelet[3062]: E1212 18:47:10.409782 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8b05f8ca7479467c89608bbc825ffb49,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sjpc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5854486f4f-qw8k4_calico-system(c1fa9bf4-cb97-4583-82c6-dc2a04ecc620): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 18:47:10.411699 containerd[1769]: time="2025-12-12T18:47:10.411674119Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 18:47:10.751752 containerd[1769]: time="2025-12-12T18:47:10.751608567Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:47:10.757612 containerd[1769]: time="2025-12-12T18:47:10.757537300Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 18:47:10.757750 containerd[1769]: time="2025-12-12T18:47:10.757617889Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 18:47:10.757843 kubelet[3062]: E1212 18:47:10.757793 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:47:10.757883 kubelet[3062]: E1212 18:47:10.757852 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:47:10.758037 kubelet[3062]: E1212 18:47:10.758001 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sjpc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5854486f4f-qw8k4_calico-system(c1fa9bf4-cb97-4583-82c6-dc2a04ecc620): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to 
resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 18:47:10.759330 kubelet[3062]: E1212 18:47:10.759278 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620" Dec 12 18:47:11.068354 kubelet[3062]: E1212 18:47:11.068219 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a" Dec 12 18:47:11.068517 containerd[1769]: time="2025-12-12T18:47:11.068463990Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:47:11.408612 containerd[1769]: time="2025-12-12T18:47:11.408528949Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:47:11.410366 containerd[1769]: time="2025-12-12T18:47:11.410290770Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:47:11.410465 containerd[1769]: time="2025-12-12T18:47:11.410329857Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:47:11.410576 kubelet[3062]: E1212 18:47:11.410534 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:47:11.410829 kubelet[3062]: E1212 18:47:11.410587 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:47:11.410829 kubelet[3062]: E1212 18:47:11.410735 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-278rl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-85cb55bbfc-bnqwj_calico-apiserver(590a3c9e-ec4d-4eb7-8a94-169742d9929c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 12 18:47:11.411975 kubelet[3062]: E1212 18:47:11.411928 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c"
Dec 12 18:47:16.068114 containerd[1769]: time="2025-12-12T18:47:16.068038846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\""
Dec 12 18:47:16.402713 containerd[1769]: time="2025-12-12T18:47:16.402640980Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 18:47:16.404901 containerd[1769]: time="2025-12-12T18:47:16.404745873Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found"
Dec 12 18:47:16.404901 containerd[1769]: time="2025-12-12T18:47:16.404847443Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85"
Dec 12 18:47:16.405064 kubelet[3062]: E1212 18:47:16.404988 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 18:47:16.405064 kubelet[3062]: E1212 18:47:16.405046 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 18:47:16.405396 kubelet[3062]: E1212 18:47:16.405209 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ldgzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6678f769fb-bpdwv_calico-system(0d411828-ce18-4a86-9fa9-09a812dd1345): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError"
Dec 12 18:47:16.406462 kubelet[3062]: E1212 18:47:16.406406 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345"
Dec 12 18:47:17.068408 containerd[1769]: time="2025-12-12T18:47:17.068353154Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\""
Dec 12 18:47:17.408371 containerd[1769]: time="2025-12-12T18:47:17.408315013Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 18:47:17.410941 containerd[1769]: time="2025-12-12T18:47:17.410874864Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found"
Dec 12 18:47:17.411027 containerd[1769]: time="2025-12-12T18:47:17.410972984Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69"
Dec 12 18:47:17.411161 kubelet[3062]: E1212 18:47:17.411122 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Dec 12 18:47:17.411464 kubelet[3062]: E1212 18:47:17.411171 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Dec 12 18:47:17.411464 kubelet[3062]: E1212 18:47:17.411291 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8zv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pls24_calico-system(541d8bd6-57ea-4711-86e4-5819a7795d8f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError"
Dec 12 18:47:17.413995 containerd[1769]: time="2025-12-12T18:47:17.413948758Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\""
Dec 12 18:47:17.907463 containerd[1769]: time="2025-12-12T18:47:17.907376574Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 18:47:17.909463 containerd[1769]: time="2025-12-12T18:47:17.909404679Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found"
Dec 12 18:47:17.909708 containerd[1769]: time="2025-12-12T18:47:17.909457211Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93"
Dec 12 18:47:17.909737 kubelet[3062]: E1212 18:47:17.909679 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 12 18:47:17.909789 kubelet[3062]: E1212 18:47:17.909739 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 12 18:47:17.909961 kubelet[3062]: E1212 18:47:17.909865 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8zv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-pls24_calico-system(541d8bd6-57ea-4711-86e4-5819a7795d8f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Dec 12 18:47:17.911410 kubelet[3062]: E1212 18:47:17.911346 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f"
Dec 12 18:47:20.087702 systemd[1]: cri-containerd-430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f.scope: Deactivated successfully.
Dec 12 18:47:20.088097 systemd[1]: cri-containerd-430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f.scope: Consumed 27.674s CPU time, 96.5M memory peak.
Dec 12 18:47:20.088988 containerd[1769]: time="2025-12-12T18:47:20.088943084Z" level=info msg="received container exit event container_id:\"430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f\" id:\"430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f\" pid:3401 exit_status:1 exited_at:{seconds:1765565240 nanos:88603421}"
Dec 12 18:47:20.108829 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f-rootfs.mount: Deactivated successfully.
Dec 12 18:47:20.522623 kubelet[3062]: I1212 18:47:20.522570 3062 scope.go:117] "RemoveContainer" containerID="430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f"
Dec 12 18:47:20.523953 containerd[1769]: time="2025-12-12T18:47:20.523898519Z" level=info msg="CreateContainer within sandbox \"d88cfc52bbb7af53cadd834591290a279fd9cda8988bba39be4ecfd66865b315\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Dec 12 18:47:20.537900 containerd[1769]: time="2025-12-12T18:47:20.537363521Z" level=info msg="Container bd07bba335293ed8f419ccc0e703ff6925969b9877c5a36be2ec3aade25c404e: CDI devices from CRI Config.CDIDevices: []"
Dec 12 18:47:20.544286 kubelet[3062]: E1212 18:47:20.544177 3062 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.8.97:37418->10.0.8.99:2379: read: connection timed out"
Dec 12 18:47:20.547101 systemd[1]: cri-containerd-d41abe8e3d62df0622cbbb1f097455ca2efdb02722843f0a22cbf0fc5e1b74a0.scope: Deactivated successfully.
Dec 12 18:47:20.548018 systemd[1]: cri-containerd-d41abe8e3d62df0622cbbb1f097455ca2efdb02722843f0a22cbf0fc5e1b74a0.scope: Consumed 1.514s CPU time, 27M memory peak.
Dec 12 18:47:20.548643 containerd[1769]: time="2025-12-12T18:47:20.548597474Z" level=info msg="received container exit event container_id:\"d41abe8e3d62df0622cbbb1f097455ca2efdb02722843f0a22cbf0fc5e1b74a0\" id:\"d41abe8e3d62df0622cbbb1f097455ca2efdb02722843f0a22cbf0fc5e1b74a0\" pid:2868 exit_status:1 exited_at:{seconds:1765565240 nanos:548263723}"
Dec 12 18:47:20.548695 containerd[1769]: time="2025-12-12T18:47:20.548666006Z" level=info msg="CreateContainer within sandbox \"d88cfc52bbb7af53cadd834591290a279fd9cda8988bba39be4ecfd66865b315\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"bd07bba335293ed8f419ccc0e703ff6925969b9877c5a36be2ec3aade25c404e\""
Dec 12 18:47:20.549461 containerd[1769]: time="2025-12-12T18:47:20.549423870Z" level=info msg="StartContainer for \"bd07bba335293ed8f419ccc0e703ff6925969b9877c5a36be2ec3aade25c404e\""
Dec 12 18:47:20.550569 containerd[1769]: time="2025-12-12T18:47:20.550464381Z" level=info msg="connecting to shim bd07bba335293ed8f419ccc0e703ff6925969b9877c5a36be2ec3aade25c404e" address="unix:///run/containerd/s/c86b23421add47bb414fca27a4abc8e978423e43b123f698ce17ab599e53e090" protocol=ttrpc version=3
Dec 12 18:47:20.571468 systemd[1]: Started cri-containerd-bd07bba335293ed8f419ccc0e703ff6925969b9877c5a36be2ec3aade25c404e.scope - libcontainer container bd07bba335293ed8f419ccc0e703ff6925969b9877c5a36be2ec3aade25c404e.
Dec 12 18:47:20.576461 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d41abe8e3d62df0622cbbb1f097455ca2efdb02722843f0a22cbf0fc5e1b74a0-rootfs.mount: Deactivated successfully.
Dec 12 18:47:20.601318 containerd[1769]: time="2025-12-12T18:47:20.601253293Z" level=info msg="StartContainer for \"bd07bba335293ed8f419ccc0e703ff6925969b9877c5a36be2ec3aade25c404e\" returns successfully"
Dec 12 18:47:20.854683 systemd[1]: cri-containerd-2c4189e9da9cee3fe89ff8680b728d7063486d27f00175ca6babfb134472f80f.scope: Deactivated successfully.
Dec 12 18:47:20.854965 systemd[1]: cri-containerd-2c4189e9da9cee3fe89ff8680b728d7063486d27f00175ca6babfb134472f80f.scope: Consumed 4.095s CPU time, 57.5M memory peak.
Dec 12 18:47:20.856587 containerd[1769]: time="2025-12-12T18:47:20.856554631Z" level=info msg="received container exit event container_id:\"2c4189e9da9cee3fe89ff8680b728d7063486d27f00175ca6babfb134472f80f\" id:\"2c4189e9da9cee3fe89ff8680b728d7063486d27f00175ca6babfb134472f80f\" pid:2907 exit_status:1 exited_at:{seconds:1765565240 nanos:856250540}"
Dec 12 18:47:21.068181 containerd[1769]: time="2025-12-12T18:47:21.068146566Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\""
Dec 12 18:47:21.108904 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2c4189e9da9cee3fe89ff8680b728d7063486d27f00175ca6babfb134472f80f-rootfs.mount: Deactivated successfully.
Dec 12 18:47:21.388768 containerd[1769]: time="2025-12-12T18:47:21.388698309Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 18:47:21.390686 containerd[1769]: time="2025-12-12T18:47:21.390651806Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found"
Dec 12 18:47:21.390799 containerd[1769]: time="2025-12-12T18:47:21.390742923Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77"
Dec 12 18:47:21.390983 kubelet[3062]: E1212 18:47:21.390931 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 12 18:47:21.391057 kubelet[3062]: E1212 18:47:21.390992 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 12 18:47:21.391200 kubelet[3062]: E1212 18:47:21.391156 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wccvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-pcfdn_calico-system(77f6fa4e-0cea-4d83-8d55-8707dfa4d505): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError"
Dec 12 18:47:21.392419 kubelet[3062]: E1212 18:47:21.392363 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pcfdn" podUID="77f6fa4e-0cea-4d83-8d55-8707dfa4d505"
Dec 12 18:47:21.503330 kubelet[3062]: E1212 18:47:21.503177 3062 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.8.97:37232->10.0.8.99:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-apiserver-85cb55bbfc-m7vzj.18808c11d680d374 calico-apiserver 1755 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-apiserver,Name:calico-apiserver-85cb55bbfc-m7vzj,UID:20d32882-7b09-425d-879e-55aa1742ac4a,APIVersion:v1,ResourceVersion:809,FieldPath:spec.containers{calico-apiserver},},Reason:BackOff,Message:Back-off pulling image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4459-2-2-4-78a5f49b53,},FirstTimestamp:2025-12-12 18:44:22 +0000 UTC,LastTimestamp:2025-12-12 18:47:11.068174088 +0000 UTC m=+209.087910966,Count:12,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-4-78a5f49b53,}"
Dec 12 18:47:21.526447 kubelet[3062]: I1212 18:47:21.526405 3062 scope.go:117] "RemoveContainer" containerID="d41abe8e3d62df0622cbbb1f097455ca2efdb02722843f0a22cbf0fc5e1b74a0"
Dec 12 18:47:21.527643 kubelet[3062]: I1212 18:47:21.527613 3062 scope.go:117] "RemoveContainer" containerID="2c4189e9da9cee3fe89ff8680b728d7063486d27f00175ca6babfb134472f80f"
Dec 12 18:47:21.528126 containerd[1769]: time="2025-12-12T18:47:21.528095448Z" level=info msg="CreateContainer within sandbox \"d92c4afe7398a41dfbcc1b1ea3288e2d206124a5d9cdf673b07ca58429055ccc\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Dec 12 18:47:21.529552 containerd[1769]: time="2025-12-12T18:47:21.529513941Z" level=info msg="CreateContainer within sandbox \"47cd697f9a67fcf2b5ccc2a807412d37bd6a8b42be5db24e7b93e1af080476d1\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Dec 12 18:47:21.540772 containerd[1769]: time="2025-12-12T18:47:21.540717681Z" level=info msg="Container b2bc174b6324049d013dd44116c546fe1a75153592cb1be23d64d4fdd2567da9: CDI devices from CRI Config.CDIDevices: []"
Dec 12 18:47:21.544872 containerd[1769]: time="2025-12-12T18:47:21.544844681Z" level=info msg="Container 0649900dcf34e68806803207419e107f18daf281ff0fbfec51abeff7887c55af: CDI devices from CRI Config.CDIDevices: []"
Dec 12 18:47:21.558059 containerd[1769]: time="2025-12-12T18:47:21.558013259Z" level=info msg="CreateContainer within sandbox \"47cd697f9a67fcf2b5ccc2a807412d37bd6a8b42be5db24e7b93e1af080476d1\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"0649900dcf34e68806803207419e107f18daf281ff0fbfec51abeff7887c55af\""
Dec 12 18:47:21.558524 containerd[1769]: time="2025-12-12T18:47:21.558498843Z" level=info msg="CreateContainer within sandbox \"d92c4afe7398a41dfbcc1b1ea3288e2d206124a5d9cdf673b07ca58429055ccc\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"b2bc174b6324049d013dd44116c546fe1a75153592cb1be23d64d4fdd2567da9\""
Dec 12 18:47:21.558655 containerd[1769]: time="2025-12-12T18:47:21.558635330Z" level=info msg="StartContainer for \"0649900dcf34e68806803207419e107f18daf281ff0fbfec51abeff7887c55af\""
Dec 12 18:47:21.558820 containerd[1769]: time="2025-12-12T18:47:21.558785020Z" level=info msg="StartContainer for \"b2bc174b6324049d013dd44116c546fe1a75153592cb1be23d64d4fdd2567da9\""
Dec 12 18:47:21.559617 containerd[1769]: time="2025-12-12T18:47:21.559599306Z" level=info msg="connecting to shim b2bc174b6324049d013dd44116c546fe1a75153592cb1be23d64d4fdd2567da9" address="unix:///run/containerd/s/37491b236897be3031d0b90696f2993637d9994c2df23cdb737d9c0438c95c15" protocol=ttrpc version=3
Dec 12 18:47:21.560639 containerd[1769]: time="2025-12-12T18:47:21.560605978Z" level=info msg="connecting to shim 0649900dcf34e68806803207419e107f18daf281ff0fbfec51abeff7887c55af" address="unix:///run/containerd/s/ef1750b6067fb2455b26d21baeaf22ae1d4b13334aeec4411dbb8de576ae18be" protocol=ttrpc version=3
Dec 12 18:47:21.581446 systemd[1]: Started cri-containerd-0649900dcf34e68806803207419e107f18daf281ff0fbfec51abeff7887c55af.scope - libcontainer container 0649900dcf34e68806803207419e107f18daf281ff0fbfec51abeff7887c55af.
Dec 12 18:47:21.582450 systemd[1]: Started cri-containerd-b2bc174b6324049d013dd44116c546fe1a75153592cb1be23d64d4fdd2567da9.scope - libcontainer container b2bc174b6324049d013dd44116c546fe1a75153592cb1be23d64d4fdd2567da9.
Dec 12 18:47:21.632309 containerd[1769]: time="2025-12-12T18:47:21.631794924Z" level=info msg="StartContainer for \"0649900dcf34e68806803207419e107f18daf281ff0fbfec51abeff7887c55af\" returns successfully"
Dec 12 18:47:21.633142 containerd[1769]: time="2025-12-12T18:47:21.632602714Z" level=info msg="StartContainer for \"b2bc174b6324049d013dd44116c546fe1a75153592cb1be23d64d4fdd2567da9\" returns successfully"
Dec 12 18:47:22.068054 containerd[1769]: time="2025-12-12T18:47:22.067860186Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Dec 12 18:47:22.409050 containerd[1769]: time="2025-12-12T18:47:22.408897005Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 18:47:22.410803 containerd[1769]: time="2025-12-12T18:47:22.410745759Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Dec 12 18:47:22.410874 containerd[1769]: time="2025-12-12T18:47:22.410822805Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Dec 12 18:47:22.411005 kubelet[3062]: E1212 18:47:22.410966 3062 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 18:47:22.411045 kubelet[3062]: E1212 18:47:22.411013 3062 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 18:47:22.411257 kubelet[3062]: E1212 18:47:22.411168 3062 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nch2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-85cb55bbfc-m7vzj_calico-apiserver(20d32882-7b09-425d-879e-55aa1742ac4a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 12 18:47:22.412491 kubelet[3062]: E1212 18:47:22.412455 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-m7vzj" podUID="20d32882-7b09-425d-879e-55aa1742ac4a"
Dec 12 18:47:23.068132 kubelet[3062]: E1212 18:47:23.068076 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5854486f4f-qw8k4" podUID="c1fa9bf4-cb97-4583-82c6-dc2a04ecc620"
Dec 12 18:47:26.067597 kubelet[3062]: E1212 18:47:26.067522 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-85cb55bbfc-bnqwj" podUID="590a3c9e-ec4d-4eb7-8a94-169742d9929c"
Dec 12 18:47:26.431342 kubelet[3062]: I1212 18:47:26.431283 3062 status_manager.go:890] "Failed to get status for pod" podUID="eef14882-8b6b-4161-a7c8-d2d8cb94c6ea" pod="tigera-operator/tigera-operator-7dcd859c48-xdz8b" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.8.97:37332->10.0.8.99:2379: read: connection timed out"
Dec 12 18:47:28.068691 kubelet[3062]: E1212 18:47:28.068167 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6678f769fb-bpdwv" podUID="0d411828-ce18-4a86-9fa9-09a812dd1345"
Dec 12 18:47:30.068134 kubelet[3062]: E1212 18:47:30.068044 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-pls24" podUID="541d8bd6-57ea-4711-86e4-5819a7795d8f"
Dec 12 18:47:30.544637 kubelet[3062]: E1212 18:47:30.544552 3062 controller.go:195] "Failed to update lease" err="Put \"https://10.0.8.97:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-4-78a5f49b53?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 12 18:47:31.757919 systemd[1]: cri-containerd-bd07bba335293ed8f419ccc0e703ff6925969b9877c5a36be2ec3aade25c404e.scope: Deactivated successfully.
Dec 12 18:47:31.759641 containerd[1769]: time="2025-12-12T18:47:31.759510224Z" level=info msg="received container exit event container_id:\"bd07bba335293ed8f419ccc0e703ff6925969b9877c5a36be2ec3aade25c404e\" id:\"bd07bba335293ed8f419ccc0e703ff6925969b9877c5a36be2ec3aade25c404e\" pid:6038 exit_status:1 exited_at:{seconds:1765565251 nanos:759255474}"
Dec 12 18:47:31.778056 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bd07bba335293ed8f419ccc0e703ff6925969b9877c5a36be2ec3aade25c404e-rootfs.mount: Deactivated successfully.
Dec 12 18:47:32.556551 kubelet[3062]: I1212 18:47:32.556508 3062 scope.go:117] "RemoveContainer" containerID="430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f"
Dec 12 18:47:32.556968 kubelet[3062]: I1212 18:47:32.556755 3062 scope.go:117] "RemoveContainer" containerID="bd07bba335293ed8f419ccc0e703ff6925969b9877c5a36be2ec3aade25c404e"
Dec 12 18:47:32.556968 kubelet[3062]: E1212 18:47:32.556919 3062 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-xdz8b_tigera-operator(eef14882-8b6b-4161-a7c8-d2d8cb94c6ea)\"" pod="tigera-operator/tigera-operator-7dcd859c48-xdz8b" podUID="eef14882-8b6b-4161-a7c8-d2d8cb94c6ea"
Dec 12 18:47:32.557745 containerd[1769]: time="2025-12-12T18:47:32.557710497Z" level=info msg="RemoveContainer for \"430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f\""
Dec 12 18:47:32.563988 containerd[1769]: time="2025-12-12T18:47:32.563948213Z" level=info msg="RemoveContainer for \"430b1e0a319c9dae63bff3f99661248bd2cc48d7b5d310158ff293428330051f\" returns successfully"