Dec 16 13:15:52.855551 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Dec 12 15:21:28 -00 2025
Dec 16 13:15:52.855656 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 16 13:15:52.855692 kernel: BIOS-provided physical RAM map:
Dec 16 13:15:52.855706 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Dec 16 13:15:52.855719 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
Dec 16 13:15:52.855732 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Dec 16 13:15:52.855747 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
Dec 16 13:15:52.855760 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Dec 16 13:15:52.855772 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
Dec 16 13:15:52.855788 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Dec 16 13:15:52.855801 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e73efff] usable
Dec 16 13:15:52.855813 kernel: BIOS-e820: [mem 0x000000007e73f000-0x000000007e7fffff] reserved
Dec 16 13:15:52.855826 kernel: BIOS-e820: [mem 0x000000007e800000-0x000000007ea70fff] usable
Dec 16 13:15:52.855838 kernel: BIOS-e820: [mem 0x000000007ea71000-0x000000007eb84fff] reserved
Dec 16 13:15:52.855854 kernel: BIOS-e820: [mem 0x000000007eb85000-0x000000007f6ecfff] usable
Dec 16 13:15:52.855871 kernel: BIOS-e820: [mem 0x000000007f6ed000-0x000000007f96cfff] reserved
Dec 16 13:15:52.855884 kernel: BIOS-e820: [mem 0x000000007f96d000-0x000000007f97efff] ACPI data
Dec 16 13:15:52.855897 kernel: BIOS-e820: [mem 0x000000007f97f000-0x000000007f9fefff] ACPI NVS
Dec 16 13:15:52.855910 kernel: BIOS-e820: [mem 0x000000007f9ff000-0x000000007fe4efff] usable
Dec 16 13:15:52.855923 kernel: BIOS-e820: [mem 0x000000007fe4f000-0x000000007fe52fff] reserved
Dec 16 13:15:52.855936 kernel: BIOS-e820: [mem 0x000000007fe53000-0x000000007fe54fff] ACPI NVS
Dec 16 13:15:52.855949 kernel: BIOS-e820: [mem 0x000000007fe55000-0x000000007febbfff] usable
Dec 16 13:15:52.855962 kernel: BIOS-e820: [mem 0x000000007febc000-0x000000007ff3ffff] reserved
Dec 16 13:15:52.855974 kernel: BIOS-e820: [mem 0x000000007ff40000-0x000000007fffffff] ACPI NVS
Dec 16 13:15:52.855988 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Dec 16 13:15:52.856004 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 16 13:15:52.856017 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Dec 16 13:15:52.856030 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000047fffffff] usable
Dec 16 13:15:52.856043 kernel: NX (Execute Disable) protection: active
Dec 16 13:15:52.856056 kernel: APIC: Static calls initialized
Dec 16 13:15:52.856069 kernel: e820: update [mem 0x7dd4e018-0x7dd57a57] usable ==> usable
Dec 16 13:15:52.856083 kernel: e820: update [mem 0x7dd26018-0x7dd4d457] usable ==> usable
Dec 16 13:15:52.856096 kernel: extended physical RAM map:
Dec 16 13:15:52.856110 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Dec 16 13:15:52.856123 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
Dec 16 13:15:52.856136 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Dec 16 13:15:52.856153 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
Dec 16 13:15:52.856166 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Dec 16 13:15:52.856179 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
Dec 16 13:15:52.856192 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Dec 16 13:15:52.856212 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007dd26017] usable
Dec 16 13:15:52.856226 kernel: reserve setup_data: [mem 0x000000007dd26018-0x000000007dd4d457] usable
Dec 16 13:15:52.856239 kernel: reserve setup_data: [mem 0x000000007dd4d458-0x000000007dd4e017] usable
Dec 16 13:15:52.856256 kernel: reserve setup_data: [mem 0x000000007dd4e018-0x000000007dd57a57] usable
Dec 16 13:15:52.856270 kernel: reserve setup_data: [mem 0x000000007dd57a58-0x000000007e73efff] usable
Dec 16 13:15:52.856284 kernel: reserve setup_data: [mem 0x000000007e73f000-0x000000007e7fffff] reserved
Dec 16 13:15:52.856298 kernel: reserve setup_data: [mem 0x000000007e800000-0x000000007ea70fff] usable
Dec 16 13:15:52.856311 kernel: reserve setup_data: [mem 0x000000007ea71000-0x000000007eb84fff] reserved
Dec 16 13:15:52.856325 kernel: reserve setup_data: [mem 0x000000007eb85000-0x000000007f6ecfff] usable
Dec 16 13:15:52.856339 kernel: reserve setup_data: [mem 0x000000007f6ed000-0x000000007f96cfff] reserved
Dec 16 13:15:52.856353 kernel: reserve setup_data: [mem 0x000000007f96d000-0x000000007f97efff] ACPI data
Dec 16 13:15:52.856366 kernel: reserve setup_data: [mem 0x000000007f97f000-0x000000007f9fefff] ACPI NVS
Dec 16 13:15:52.856383 kernel: reserve setup_data: [mem 0x000000007f9ff000-0x000000007fe4efff] usable
Dec 16 13:15:52.856397 kernel: reserve setup_data: [mem 0x000000007fe4f000-0x000000007fe52fff] reserved
Dec 16 13:15:52.856410 kernel: reserve setup_data: [mem 0x000000007fe53000-0x000000007fe54fff] ACPI NVS
Dec 16 13:15:52.856424 kernel: reserve setup_data: [mem 0x000000007fe55000-0x000000007febbfff] usable
Dec 16 13:15:52.856438 kernel: reserve setup_data: [mem 0x000000007febc000-0x000000007ff3ffff] reserved
Dec 16 13:15:52.856467 kernel: reserve setup_data: [mem 0x000000007ff40000-0x000000007fffffff] ACPI NVS
Dec 16 13:15:52.856481 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Dec 16 13:15:52.856495 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 16 13:15:52.856508 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Dec 16 13:15:52.856522 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000047fffffff] usable
Dec 16 13:15:52.856536 kernel: efi: EFI v2.7 by EDK II
Dec 16 13:15:52.856554 kernel: efi: SMBIOS=0x7f772000 ACPI=0x7f97e000 ACPI 2.0=0x7f97e014 MEMATTR=0x7e282018 RNG=0x7f972018
Dec 16 13:15:52.856568 kernel: random: crng init done
Dec 16 13:15:52.856582 kernel: efi: Remove mem152: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Dec 16 13:15:52.856596 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Dec 16 13:15:52.856609 kernel: secureboot: Secure boot disabled
Dec 16 13:15:52.856623 kernel: SMBIOS 2.8 present.
Dec 16 13:15:52.856637 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
Dec 16 13:15:52.856651 kernel: DMI: Memory slots populated: 1/1
Dec 16 13:15:52.856665 kernel: Hypervisor detected: KVM
Dec 16 13:15:52.856679 kernel: last_pfn = 0x7febc max_arch_pfn = 0x10000000000
Dec 16 13:15:52.856692 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 16 13:15:52.856707 kernel: kvm-clock: using sched offset of 7448009930 cycles
Dec 16 13:15:52.856725 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 16 13:15:52.856739 kernel: tsc: Detected 2294.608 MHz processor
Dec 16 13:15:52.856754 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 16 13:15:52.856769 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 16 13:15:52.856783 kernel: last_pfn = 0x480000 max_arch_pfn = 0x10000000000
Dec 16 13:15:52.856798 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Dec 16 13:15:52.856813 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Dec 16 13:15:52.856827 kernel: last_pfn = 0x7febc max_arch_pfn = 0x10000000000
Dec 16 13:15:52.856841 kernel: Using GB pages for direct mapping
Dec 16 13:15:52.856859 kernel: ACPI: Early table checksum verification disabled
Dec 16 13:15:52.856873 kernel: ACPI: RSDP 0x000000007F97E014 000024 (v02 BOCHS )
Dec 16 13:15:52.856887 kernel: ACPI: XSDT 0x000000007F97D0E8 00004C (v01 BOCHS BXPC 00000001 01000013)
Dec 16 13:15:52.856902 kernel: ACPI: FACP 0x000000007F977000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 13:15:52.856916 kernel: ACPI: DSDT 0x000000007F978000 004441 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 13:15:52.856930 kernel: ACPI: FACS 0x000000007F9DD000 000040
Dec 16 13:15:52.856944 kernel: ACPI: APIC 0x000000007F976000 0000B0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 13:15:52.856959 kernel: ACPI: MCFG 0x000000007F975000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 13:15:52.856973 kernel: ACPI: WAET 0x000000007F974000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 13:15:52.856991 kernel: ACPI: BGRT 0x000000007F973000 000038 (v01 INTEL EDK2 00000002 01000013)
Dec 16 13:15:52.857005 kernel: ACPI: Reserving FACP table memory at [mem 0x7f977000-0x7f9770f3]
Dec 16 13:15:52.857019 kernel: ACPI: Reserving DSDT table memory at [mem 0x7f978000-0x7f97c440]
Dec 16 13:15:52.857033 kernel: ACPI: Reserving FACS table memory at [mem 0x7f9dd000-0x7f9dd03f]
Dec 16 13:15:52.857047 kernel: ACPI: Reserving APIC table memory at [mem 0x7f976000-0x7f9760af]
Dec 16 13:15:52.857065 kernel: ACPI: Reserving MCFG table memory at [mem 0x7f975000-0x7f97503b]
Dec 16 13:15:52.857079 kernel: ACPI: Reserving WAET table memory at [mem 0x7f974000-0x7f974027]
Dec 16 13:15:52.857096 kernel: ACPI: Reserving BGRT table memory at [mem 0x7f973000-0x7f973037]
Dec 16 13:15:52.857111 kernel: No NUMA configuration found
Dec 16 13:15:52.857129 kernel: Faking a node at [mem 0x0000000000000000-0x000000047fffffff]
Dec 16 13:15:52.857143 kernel: NODE_DATA(0) allocated [mem 0x47fff6dc0-0x47fffdfff]
Dec 16 13:15:52.857160 kernel: Zone ranges:
Dec 16 13:15:52.857178 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Dec 16 13:15:52.857196 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Dec 16 13:15:52.857216 kernel: Normal [mem 0x0000000100000000-0x000000047fffffff]
Dec 16 13:15:52.857236 kernel: Device empty
Dec 16 13:15:52.857251 kernel: Movable zone start for each node
Dec 16 13:15:52.857264 kernel: Early memory node ranges
Dec 16 13:15:52.857282 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Dec 16 13:15:52.857296 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
Dec 16 13:15:52.857310 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
Dec 16 13:15:52.857325 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
Dec 16 13:15:52.857339 kernel: node 0: [mem 0x0000000000900000-0x000000007e73efff]
Dec 16 13:15:52.857352 kernel: node 0: [mem 0x000000007e800000-0x000000007ea70fff]
Dec 16 13:15:52.857367 kernel: node 0: [mem 0x000000007eb85000-0x000000007f6ecfff]
Dec 16 13:15:52.857397 kernel: node 0: [mem 0x000000007f9ff000-0x000000007fe4efff]
Dec 16 13:15:52.857413 kernel: node 0: [mem 0x000000007fe55000-0x000000007febbfff]
Dec 16 13:15:52.857428 kernel: node 0: [mem 0x0000000100000000-0x000000047fffffff]
Dec 16 13:15:52.857454 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000047fffffff]
Dec 16 13:15:52.857470 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 16 13:15:52.857496 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Dec 16 13:15:52.857513 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
Dec 16 13:15:52.857536 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 16 13:15:52.857551 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
Dec 16 13:15:52.857567 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Dec 16 13:15:52.857583 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges
Dec 16 13:15:52.857602 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Dec 16 13:15:52.857617 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
Dec 16 13:15:52.857633 kernel: On node 0, zone Normal: 324 pages in unavailable ranges
Dec 16 13:15:52.857649 kernel: ACPI: PM-Timer IO Port: 0x608
Dec 16 13:15:52.857665 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 16 13:15:52.857680 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 16 13:15:52.857696 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 16 13:15:52.857711 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 16 13:15:52.857727 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 16 13:15:52.857746 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 16 13:15:52.857761 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 16 13:15:52.857777 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 16 13:15:52.857793 kernel: TSC deadline timer available
Dec 16 13:15:52.857809 kernel: CPU topo: Max. logical packages: 8
Dec 16 13:15:52.857824 kernel: CPU topo: Max. logical dies: 8
Dec 16 13:15:52.857840 kernel: CPU topo: Max. dies per package: 1
Dec 16 13:15:52.857855 kernel: CPU topo: Max. threads per core: 1
Dec 16 13:15:52.857871 kernel: CPU topo: Num. cores per package: 1
Dec 16 13:15:52.857889 kernel: CPU topo: Num. threads per package: 1
Dec 16 13:15:52.857905 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 16 13:15:52.857920 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 16 13:15:52.857936 kernel: kvm-guest: KVM setup pv remote TLB flush
Dec 16 13:15:52.857951 kernel: kvm-guest: setup PV sched yield
Dec 16 13:15:52.857967 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Dec 16 13:15:52.857983 kernel: Booting paravirtualized kernel on KVM
Dec 16 13:15:52.857998 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 16 13:15:52.858015 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 16 13:15:52.858033 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Dec 16 13:15:52.858049 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Dec 16 13:15:52.858065 kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7
Dec 16 13:15:52.858080 kernel: kvm-guest: PV spinlocks enabled
Dec 16 13:15:52.858095 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Dec 16 13:15:52.858113 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 16 13:15:52.858129 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Dec 16 13:15:52.858145 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 16 13:15:52.858164 kernel: Fallback order for Node 0: 0
Dec 16 13:15:52.858180 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4192374
Dec 16 13:15:52.858195 kernel: Policy zone: Normal
Dec 16 13:15:52.858211 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 16 13:15:52.858227 kernel: software IO TLB: area num 8.
Dec 16 13:15:52.858242 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 16 13:15:52.858258 kernel: ftrace: allocating 40103 entries in 157 pages
Dec 16 13:15:52.858274 kernel: ftrace: allocated 157 pages with 5 groups
Dec 16 13:15:52.858290 kernel: Dynamic Preempt: voluntary
Dec 16 13:15:52.858308 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 16 13:15:52.858325 kernel: rcu: RCU event tracing is enabled.
Dec 16 13:15:52.858342 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=8.
Dec 16 13:15:52.858358 kernel: Trampoline variant of Tasks RCU enabled.
Dec 16 13:15:52.858374 kernel: Rude variant of Tasks RCU enabled.
Dec 16 13:15:52.858390 kernel: Tracing variant of Tasks RCU enabled.
Dec 16 13:15:52.858405 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 16 13:15:52.858421 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 16 13:15:52.858437 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 16 13:15:52.858468 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 16 13:15:52.858483 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 16 13:15:52.858499 kernel: NR_IRQS: 33024, nr_irqs: 488, preallocated irqs: 16
Dec 16 13:15:52.858515 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 16 13:15:52.858530 kernel: Console: colour dummy device 80x25
Dec 16 13:15:52.858546 kernel: printk: legacy console [tty0] enabled
Dec 16 13:15:52.858561 kernel: printk: legacy console [ttyS0] enabled
Dec 16 13:15:52.858577 kernel: ACPI: Core revision 20240827
Dec 16 13:15:52.858593 kernel: APIC: Switch to symmetric I/O mode setup
Dec 16 13:15:52.858612 kernel: x2apic enabled
Dec 16 13:15:52.858697 kernel: APIC: Switched APIC routing to: physical x2apic
Dec 16 13:15:52.858713 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Dec 16 13:15:52.858729 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Dec 16 13:15:52.858745 kernel: kvm-guest: setup PV IPIs
Dec 16 13:15:52.858761 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns
Dec 16 13:15:52.858777 kernel: Calibrating delay loop (skipped) preset value.. 4589.21 BogoMIPS (lpj=2294608)
Dec 16 13:15:52.858793 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 16 13:15:52.858808 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Dec 16 13:15:52.858828 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Dec 16 13:15:52.858843 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 16 13:15:52.858858 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Dec 16 13:15:52.858873 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Dec 16 13:15:52.858889 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Dec 16 13:15:52.858905 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 16 13:15:52.858920 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 16 13:15:52.858935 kernel: TAA: Mitigation: Clear CPU buffers
Dec 16 13:15:52.858950 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Dec 16 13:15:52.858965 kernel: active return thunk: its_return_thunk
Dec 16 13:15:52.858980 kernel: ITS: Mitigation: Aligned branch/return thunks
Dec 16 13:15:52.858998 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 16 13:15:52.859014 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 16 13:15:52.859029 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 16 13:15:52.859043 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Dec 16 13:15:52.859059 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Dec 16 13:15:52.859074 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Dec 16 13:15:52.859089 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Dec 16 13:15:52.859104 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Dec 16 13:15:52.859119 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Dec 16 13:15:52.859134 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Dec 16 13:15:52.859149 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Dec 16 13:15:52.859167 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Dec 16 13:15:52.859182 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Dec 16 13:15:52.859198 kernel: Freeing SMP alternatives memory: 32K
Dec 16 13:15:52.859213 kernel: pid_max: default: 32768 minimum: 301
Dec 16 13:15:52.859228 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 16 13:15:52.859243 kernel: landlock: Up and running.
Dec 16 13:15:52.859258 kernel: SELinux: Initializing.
Dec 16 13:15:52.859273 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 16 13:15:52.859288 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 16 13:15:52.859303 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6)
Dec 16 13:15:52.859319 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver.
Dec 16 13:15:52.859337 kernel: ... version: 2
Dec 16 13:15:52.859354 kernel: ... bit width: 48
Dec 16 13:15:52.859369 kernel: ... generic registers: 8
Dec 16 13:15:52.859385 kernel: ... value mask: 0000ffffffffffff
Dec 16 13:15:52.859401 kernel: ... max period: 00007fffffffffff
Dec 16 13:15:52.859416 kernel: ... fixed-purpose events: 3
Dec 16 13:15:52.859431 kernel: ... event mask: 00000007000000ff
Dec 16 13:15:52.859461 kernel: signal: max sigframe size: 3632
Dec 16 13:15:52.859477 kernel: rcu: Hierarchical SRCU implementation.
Dec 16 13:15:52.859493 kernel: rcu: Max phase no-delay instances is 400.
Dec 16 13:15:52.859512 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 16 13:15:52.859528 kernel: smp: Bringing up secondary CPUs ...
Dec 16 13:15:52.859544 kernel: smpboot: x86: Booting SMP configuration:
Dec 16 13:15:52.859560 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7
Dec 16 13:15:52.859583 kernel: smp: Brought up 1 node, 8 CPUs
Dec 16 13:15:52.859614 kernel: smpboot: Total of 8 processors activated (36713.72 BogoMIPS)
Dec 16 13:15:52.859631 kernel: Memory: 16308696K/16769496K available (14336K kernel code, 2444K rwdata, 26064K rodata, 46188K init, 2572K bss, 453244K reserved, 0K cma-reserved)
Dec 16 13:15:52.859654 kernel: devtmpfs: initialized
Dec 16 13:15:52.859670 kernel: x86/mm: Memory block size: 128MB
Dec 16 13:15:52.859697 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
Dec 16 13:15:52.859719 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
Dec 16 13:15:52.859743 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
Dec 16 13:15:52.859759 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7f97f000-0x7f9fefff] (524288 bytes)
Dec 16 13:15:52.859775 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fe53000-0x7fe54fff] (8192 bytes)
Dec 16 13:15:52.859797 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff40000-0x7fffffff] (786432 bytes)
Dec 16 13:15:52.859814 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 16 13:15:52.859830 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Dec 16 13:15:52.859855 kernel: pinctrl core: initialized pinctrl subsystem
Dec 16 13:15:52.859872 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 16 13:15:52.859900 kernel: audit: initializing netlink subsys (disabled)
Dec 16 13:15:52.859924 kernel: audit: type=2000 audit(1765890949.450:1): state=initialized audit_enabled=0 res=1
Dec 16 13:15:52.859939 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 16 13:15:52.859953 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 16 13:15:52.859968 kernel: cpuidle: using governor menu
Dec 16 13:15:52.859983 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 16 13:15:52.859997 kernel: dca service started, version 1.12.1
Dec 16 13:15:52.860024 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Dec 16 13:15:52.860039 kernel: PCI: Using configuration type 1 for base access
Dec 16 13:15:52.860054 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 16 13:15:52.860068 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 16 13:15:52.860083 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 16 13:15:52.860098 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 16 13:15:52.860112 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 16 13:15:52.860127 kernel: ACPI: Added _OSI(Module Device)
Dec 16 13:15:52.860141 kernel: ACPI: Added _OSI(Processor Device)
Dec 16 13:15:52.860158 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 16 13:15:52.860173 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 16 13:15:52.860187 kernel: ACPI: Interpreter enabled
Dec 16 13:15:52.860201 kernel: ACPI: PM: (supports S0 S3 S5)
Dec 16 13:15:52.860216 kernel: ACPI: Using IOAPIC for interrupt routing
Dec 16 13:15:52.860231 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 16 13:15:52.860245 kernel: PCI: Using E820 reservations for host bridge windows
Dec 16 13:15:52.860260 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Dec 16 13:15:52.860274 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 16 13:15:52.860524 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 16 13:15:52.860672 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Dec 16 13:15:52.860806 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Dec 16 13:15:52.860825 kernel: PCI host bridge to bus 0000:00
Dec 16 13:15:52.860958 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Dec 16 13:15:52.861080 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Dec 16 13:15:52.861198 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 16 13:15:52.861322 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Dec 16 13:15:52.861439 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Dec 16 13:15:52.861648 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window]
Dec 16 13:15:52.861772 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 16 13:15:52.861926 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Dec 16 13:15:52.862081 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 16 13:15:52.862227 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref]
Dec 16 13:15:52.862365 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref]
Dec 16 13:15:52.862519 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff]
Dec 16 13:15:52.862657 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Dec 16 13:15:52.862792 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 16 13:15:52.862939 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.863076 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff]
Dec 16 13:15:52.863215 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 16 13:15:52.863350 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff]
Dec 16 13:15:52.863498 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff]
Dec 16 13:15:52.863650 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Dec 16 13:15:52.863795 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.863922 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff]
Dec 16 13:15:52.864052 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 16 13:15:52.864177 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff]
Dec 16 13:15:52.864300 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref]
Dec 16 13:15:52.864436 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.864597 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff]
Dec 16 13:15:52.864735 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 16 13:15:52.864862 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff]
Dec 16 13:15:52.864986 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref]
Dec 16 13:15:52.865129 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.865256 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff]
Dec 16 13:15:52.865381 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Dec 16 13:15:52.865518 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff]
Dec 16 13:15:52.865642 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref]
Dec 16 13:15:52.865774 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.865899 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff]
Dec 16 13:15:52.866028 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Dec 16 13:15:52.866152 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff]
Dec 16 13:15:52.866275 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref]
Dec 16 13:15:52.866405 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.866547 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff]
Dec 16 13:15:52.866671 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Dec 16 13:15:52.866796 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff]
Dec 16 13:15:52.866927 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref]
Dec 16 13:15:52.867058 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.867182 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff]
Dec 16 13:15:52.867321 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Dec 16 13:15:52.867457 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff]
Dec 16 13:15:52.867584 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref]
Dec 16 13:15:52.867737 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.867856 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff]
Dec 16 13:15:52.867972 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Dec 16 13:15:52.868088 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff]
Dec 16 13:15:52.868202 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref]
Dec 16 13:15:52.868325 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.868459 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff]
Dec 16 13:15:52.868591 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Dec 16 13:15:52.868708 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff]
Dec 16 13:15:52.868824 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref]
Dec 16 13:15:52.868957 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.869099 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff]
Dec 16 13:15:52.869218 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Dec 16 13:15:52.869334 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff]
Dec 16 13:15:52.869466 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref]
Dec 16 13:15:52.869591 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.869710 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff]
Dec 16 13:15:52.869827 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Dec 16 13:15:52.869941 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff]
Dec 16 13:15:52.870058 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref]
Dec 16 13:15:52.870187 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.870310 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff]
Dec 16 13:15:52.870432 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Dec 16 13:15:52.870564 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff]
Dec 16 13:15:52.870680 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref]
Dec 16 13:15:52.870803 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.870925 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff]
Dec 16 13:15:52.871042 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Dec 16 13:15:52.871159 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff]
Dec 16 13:15:52.871275 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref]
Dec 16 13:15:52.871398 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.871528 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff]
Dec 16 13:15:52.871659 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Dec 16 13:15:52.871782 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff]
Dec 16 13:15:52.871897 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref]
Dec 16 13:15:52.872020 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.872131 kernel: pci 0000:00:03.6: BAR 0 [mem 0x8438f000-0x8438ffff]
Dec 16 13:15:52.872241 kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Dec 16 13:15:52.872349 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff]
Dec 16 13:15:52.872468 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref]
Dec 16 13:15:52.872588 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.872700 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff]
Dec 16 13:15:52.872811 kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Dec 16 13:15:52.872920 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff]
Dec 16 13:15:52.873028 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref]
Dec 16 13:15:52.873145 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.873256 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff]
Dec 16 13:15:52.873369 kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Dec 16 13:15:52.873491 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff]
Dec 16 13:15:52.873603 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref]
Dec 16 13:15:52.873720 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.873830 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff]
Dec 16 13:15:52.873939 kernel: pci 0000:00:04.1: PCI bridge to [bus 13]
Dec 16 13:15:52.874050 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff]
Dec 16 13:15:52.874162 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref]
Dec 16 13:15:52.874280 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.874391 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff]
Dec 16 13:15:52.874513 kernel: pci 0000:00:04.2: PCI bridge to [bus 14]
Dec 16 13:15:52.874623 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff]
Dec 16 13:15:52.874732 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref]
Dec 16 13:15:52.874852 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.874966 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff]
Dec 16 13:15:52.875075 kernel: pci 0000:00:04.3: PCI bridge to [bus 15]
Dec 16 13:15:52.875184 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff]
Dec 16 13:15:52.875293 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref]
Dec 16 13:15:52.875413 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.875538 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff]
Dec 16 13:15:52.875663 kernel: pci 0000:00:04.4: PCI bridge to [bus 16]
Dec 16 13:15:52.875774 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff]
Dec 16 13:15:52.875888 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref]
Dec 16 13:15:52.876004 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.876115 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff]
Dec 16 13:15:52.876222 kernel: pci 0000:00:04.5: PCI bridge to [bus 17]
Dec 16 13:15:52.876325 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff]
Dec 16 13:15:52.876429 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref]
Dec 16 13:15:52.876561 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.876672 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff]
Dec 16 13:15:52.876776 kernel: pci 0000:00:04.6: PCI bridge to [bus 18]
Dec 16 13:15:52.876878 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff]
Dec 16 13:15:52.876981 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref]
Dec 16 13:15:52.877094 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.877199 kernel: pci 0000:00:04.7: BAR 0 [mem 0x84386000-0x84386fff]
Dec 16 13:15:52.877302 kernel: pci 0000:00:04.7: PCI bridge to [bus 19]
Dec 16 13:15:52.877406 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff]
Dec 16 13:15:52.877540 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref]
Dec 16 13:15:52.877650 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.877754 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff]
Dec 16 13:15:52.877863 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a]
Dec 16 13:15:52.877966 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff]
Dec 16 13:15:52.878071 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref]
Dec 16 13:15:52.878182 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.878286 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff]
Dec 16 13:15:52.878390 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b]
Dec 16 13:15:52.878505 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff]
Dec 16 13:15:52.878612 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref]
Dec 16 13:15:52.878723 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.878826 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff]
Dec 16 13:15:52.878930 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c]
Dec 16 13:15:52.879032 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff]
Dec 16 13:15:52.879135 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref]
Dec 16 13:15:52.879244 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.879353 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff]
Dec 16 13:15:52.879468 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d]
Dec 16 13:15:52.879572 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff]
Dec 16 13:15:52.879689 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref]
Dec 16 13:15:52.879799 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:15:52.879895 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff]
Dec 16 13:15:52.879989 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e]
Dec 16 13:15:52.880087 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff]
Dec 16 13:15:52.880183 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref]
Dec 16 13:15:52.880286 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Dec 16 13:15:52.880381 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Dec 16 13:15:52.880494 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Dec 16 13:15:52.880591 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f]
Dec 16 13:15:52.880684 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff]
Dec 16 13:15:52.880786 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Dec 16 13:15:52.880881 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f]
Dec 16 13:15:52.880984 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge
Dec 16 13:15:52.881083 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit]
Dec 16 13:15:52.881180 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 16 13:15:52.881276 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff]
Dec 16 13:15:52.881373 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff]
Dec 16 13:15:52.881481 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Dec 16 13:15:52.881579 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 16 13:15:52.881688 kernel: pci_bus 0000:02: extended config space not accessible
Dec 16 13:15:52.881702 kernel: acpiphp: Slot [1] registered
Dec 16 13:15:52.881713 kernel: acpiphp: Slot [0] registered
Dec 16 13:15:52.881727 kernel: acpiphp: Slot [2] registered
Dec 16 13:15:52.881738 kernel: acpiphp: Slot [3] registered
Dec 16 13:15:52.881748 kernel: acpiphp: Slot [4] registered
Dec 16 13:15:52.881761 kernel: acpiphp: Slot [5] registered
Dec 16 13:15:52.881771 kernel: acpiphp: Slot [6] registered
Dec 16 13:15:52.881782 kernel: acpiphp: Slot [7] registered
Dec 16 13:15:52.881793 kernel: acpiphp: Slot [8] registered
Dec 16 13:15:52.881803 kernel: acpiphp: Slot [9] registered
Dec 16 13:15:52.881814 kernel: acpiphp: Slot [10] registered
Dec 16 13:15:52.881824 kernel: acpiphp: Slot [11] registered
Dec 16 13:15:52.881835 kernel: acpiphp: Slot [12] registered
Dec 16 13:15:52.881845 kernel: acpiphp: Slot [13] registered
Dec 16 13:15:52.881856 kernel: acpiphp: Slot [14] registered
Dec 16 13:15:52.881869 kernel: acpiphp: Slot [15] registered
Dec 16 13:15:52.881880 kernel: acpiphp: Slot [16] registered
Dec 16 13:15:52.881890 kernel: acpiphp: Slot [17] registered
Dec 16 13:15:52.881901 kernel: acpiphp: Slot [18] registered
Dec 16 13:15:52.881912 kernel: acpiphp: Slot [19] registered
Dec 16 13:15:52.881922 kernel: acpiphp: Slot [20] registered
Dec 16 13:15:52.881933 kernel: acpiphp: Slot [21] registered
Dec 16 13:15:52.881943 kernel: acpiphp: Slot [22] registered
Dec 16 13:15:52.881954 kernel: acpiphp: Slot [23] registered
Dec 16 13:15:52.881966 kernel: acpiphp: Slot [24] registered
Dec 16 13:15:52.881977 kernel: acpiphp: Slot [25] registered
Dec 16 13:15:52.881988 kernel: acpiphp: Slot [26] registered
Dec 16 13:15:52.881999 kernel: acpiphp: Slot [27] registered
Dec 16 13:15:52.882009 kernel: acpiphp: Slot [28] registered
Dec 16 13:15:52.882019 kernel: acpiphp: Slot [29] registered
Dec 16 13:15:52.882030 kernel: acpiphp: Slot [30] registered
Dec 16 13:15:52.882040 kernel: acpiphp: Slot [31] registered
Dec 16 13:15:52.882147 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 16 13:15:52.882253 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f]
Dec 16 13:15:52.882365 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 16 13:15:52.882384 kernel: acpiphp: Slot [0-2] registered
Dec 16 13:15:52.882505 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 16 13:15:52.882606 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff]
Dec 16 13:15:52.882706 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref]
Dec 16 13:15:52.882803 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Dec 16 13:15:52.882901 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 16 13:15:52.882919 kernel: acpiphp: Slot [0-3] registered
Dec 16 13:15:52.883021 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Dec 16 13:15:52.883121 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff]
Dec 16 13:15:52.883220 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref]
Dec 16 13:15:52.883318 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 16 13:15:52.883332 kernel: acpiphp: Slot [0-4] registered
Dec 16 13:15:52.883438 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 13:15:52.883559 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref]
Dec 16 13:15:52.883666 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Dec 16 13:15:52.883681 kernel: acpiphp: Slot [0-5] registered
Dec 16 13:15:52.883789 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 13:15:52.883908 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff]
Dec 16 13:15:52.884042 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref]
Dec 16 13:15:52.884165 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Dec 16 13:15:52.884183 kernel: acpiphp: Slot [0-6] registered
Dec 16 13:15:52.884274 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Dec 16 13:15:52.884287 kernel: acpiphp: Slot [0-7] registered
Dec 16 13:15:52.884376 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Dec 16 13:15:52.884389 kernel: acpiphp: Slot [0-8] registered
Dec 16 13:15:52.884488 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Dec 16 13:15:52.884501 kernel: acpiphp: Slot [0-9] registered
Dec 16 13:15:52.884590 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Dec 16 13:15:52.884606 kernel: acpiphp: Slot [0-10] registered
Dec 16 13:15:52.884695 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Dec 16 13:15:52.884708 kernel: acpiphp: Slot [0-11] registered
Dec 16 13:15:52.884796 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Dec 16 13:15:52.884810 kernel: acpiphp: Slot [0-12] registered
Dec 16 13:15:52.884898 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Dec 16 13:15:52.884911 kernel: acpiphp: Slot [0-13] registered
Dec 16 13:15:52.885003 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Dec 16 13:15:52.885016 kernel: acpiphp: Slot [0-14] registered
Dec 16 13:15:52.885104 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Dec 16 13:15:52.885117 kernel: acpiphp: Slot [0-15] registered
Dec 16 13:15:52.885206 kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Dec 16 13:15:52.885219 kernel: acpiphp: Slot [0-16] registered
Dec 16 13:15:52.885309 kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Dec 16 13:15:52.885322 kernel: acpiphp: Slot [0-17] registered
Dec 16 13:15:52.885414 kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Dec 16 13:15:52.885427 kernel: acpiphp: Slot [0-18] registered
Dec 16 13:15:52.885526 kernel: pci 0000:00:04.1: PCI bridge to [bus 13]
Dec 16 13:15:52.885540 kernel: acpiphp: Slot [0-19] registered
Dec 16 13:15:52.885628 kernel: pci 0000:00:04.2: PCI bridge to [bus 14]
Dec 16 13:15:52.885641 kernel: acpiphp: Slot [0-20] registered
Dec 16 13:15:52.885740 kernel: pci 0000:00:04.3: PCI bridge to [bus 15]
Dec 16 13:15:52.885754 kernel: acpiphp: Slot [0-21] registered
Dec 16 13:15:52.885852 kernel: pci 0000:00:04.4: PCI bridge to [bus 16]
Dec 16 13:15:52.885866 kernel: acpiphp: Slot [0-22] registered
Dec 16 13:15:52.885954 kernel: pci 0000:00:04.5: PCI bridge to [bus 17]
Dec 16 13:15:52.885967 kernel: acpiphp: Slot [0-23] registered
Dec 16 13:15:52.886056 kernel: pci 0000:00:04.6: PCI bridge to [bus 18]
Dec 16 13:15:52.886069 kernel: acpiphp: Slot [0-24] registered
Dec 16 13:15:52.886157 kernel: pci 0000:00:04.7: PCI bridge to [bus 19]
Dec 16 13:15:52.886170 kernel: acpiphp: Slot [0-25] registered
Dec 16 13:15:52.886260 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a]
Dec 16 13:15:52.886273 kernel: acpiphp: Slot [0-26] registered
Dec 16 13:15:52.886361 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b]
Dec 16 13:15:52.886374 kernel: acpiphp: Slot [0-27] registered
Dec 16 13:15:52.886469 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c]
Dec 16 13:15:52.886483 kernel: acpiphp: Slot [0-28] registered
Dec 16 13:15:52.886571 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d]
Dec 16 13:15:52.886585 kernel: acpiphp: Slot [0-29] registered
Dec 16 13:15:52.886675 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e]
Dec 16 13:15:52.886688 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 16 13:15:52.886699 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 16 13:15:52.886709 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 16 13:15:52.886719 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 16 13:15:52.886729 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Dec 16 13:15:52.886739 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Dec 16 13:15:52.886750 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Dec 16 13:15:52.886763 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Dec 16 13:15:52.886773 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Dec 16 13:15:52.886783 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Dec 16 13:15:52.886793 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Dec 16 13:15:52.886803 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Dec 16 13:15:52.886813 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Dec 16 13:15:52.886823 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Dec 16 13:15:52.886833 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Dec 16 13:15:52.886843 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Dec 16 13:15:52.886856 kernel: iommu: Default domain type: Translated
Dec 16 13:15:52.886866 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 16 13:15:52.886876 kernel: efivars: Registered efivars operations
Dec 16 13:15:52.886886 kernel: PCI: Using ACPI for IRQ routing
Dec 16 13:15:52.886896 kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 16 13:15:52.886907 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff]
Dec 16 13:15:52.886916 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff]
Dec 16 13:15:52.886926 kernel: e820: reserve RAM buffer [mem 0x7dd26018-0x7fffffff]
Dec 16 13:15:52.886936 kernel: e820: reserve RAM buffer [mem 0x7dd4e018-0x7fffffff]
Dec 16 13:15:52.886946 kernel: e820: reserve RAM buffer [mem 0x7e73f000-0x7fffffff]
Dec 16 13:15:52.886958 kernel: e820: reserve RAM buffer [mem 0x7ea71000-0x7fffffff]
Dec 16 13:15:52.886968 kernel: e820: reserve RAM buffer [mem 0x7f6ed000-0x7fffffff]
Dec 16 13:15:52.886978 kernel: e820: reserve RAM buffer [mem 0x7fe4f000-0x7fffffff]
Dec 16 13:15:52.886988 kernel: e820: reserve RAM buffer [mem 0x7febc000-0x7fffffff]
Dec 16 13:15:52.887079 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Dec 16 13:15:52.887167 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Dec 16 13:15:52.887256 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 16 13:15:52.887268 kernel: vgaarb: loaded
Dec 16 13:15:52.887281 kernel: clocksource: Switched to clocksource kvm-clock
Dec 16 13:15:52.887291 kernel: VFS: Disk quotas dquot_6.6.0
Dec 16 13:15:52.887302 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 16 13:15:52.887312 kernel: pnp: PnP ACPI init
Dec 16 13:15:52.887408 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved
Dec 16 13:15:52.887422 kernel: pnp: PnP ACPI: found 5 devices
Dec 16 13:15:52.887433 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 16 13:15:52.887456 kernel: NET: Registered PF_INET protocol family
Dec 16 13:15:52.887470 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 16 13:15:52.887480 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Dec 16 13:15:52.887491 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 16 13:15:52.887501 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 16 13:15:52.887511 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
Dec 16 13:15:52.887521 kernel: TCP: Hash tables configured (established 131072 bind 65536)
Dec 16 13:15:52.887531 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 16 13:15:52.887541 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 16 13:15:52.887552 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 16 13:15:52.887564 kernel: NET: Registered PF_XDP protocol family
Dec 16 13:15:52.887670 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window
Dec 16 13:15:52.887762 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 16 13:15:52.887856 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 16 13:15:52.887948 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 16 13:15:52.888037 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 16 13:15:52.888123 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 16 13:15:52.888209 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 16 13:15:52.888298 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 16 13:15:52.888384 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Dec 16 13:15:52.888478 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Dec 16 13:15:52.888565 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Dec 16 13:15:52.888650 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Dec 16 13:15:52.888736 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Dec 16 13:15:52.888821 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Dec 16 13:15:52.888968 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Dec 16 13:15:52.889062 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Dec 16 13:15:52.889149 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Dec 16 13:15:52.889235 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000
Dec 16 13:15:52.889324 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000
Dec 16 13:15:52.889414 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000
Dec 16 13:15:52.889527 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Dec 16 13:15:52.889634 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Dec 16 13:15:52.889744 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Dec 16 13:15:52.889848 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Dec 16 13:15:52.889955 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Dec 16 13:15:52.890060 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000
Dec 16 13:15:52.890161 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000
Dec 16 13:15:52.890266 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Dec 16 13:15:52.890371 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Dec 16 13:15:52.890481 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned
Dec 16 13:15:52.890591 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned
Dec 16 13:15:52.890694 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned
Dec 16 13:15:52.890795 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned
Dec 16 13:15:52.890890 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned
Dec 16 13:15:52.891002 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned
Dec 16 13:15:52.891112 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned
Dec 16 13:15:52.891199 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned
Dec 16 13:15:52.891286 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned
Dec 16 13:15:52.891377 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned
Dec 16 13:15:52.891484 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned
Dec 16 13:15:52.891574 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned
Dec 16 13:15:52.891674 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned
Dec 16 13:15:52.891760 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.891845 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.891932 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.892017 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.892102 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.892183 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.892264 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.892345 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.892427 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.892517 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.892603 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.892685 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.892772 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.892854 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.892935 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.893016 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.893098 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.893180 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.893263 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.893345 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.893430 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.893521 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.893610 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.893697 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.893784 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.893866 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.893947 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.894032 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.894112 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.894193 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.894274 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned
Dec 16 13:15:52.894356 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned
Dec 16 13:15:52.894436 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned
Dec 16 13:15:52.894527 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned
Dec 16 13:15:52.894609 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned
Dec 16 13:15:52.894694 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned
Dec 16 13:15:52.894775 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff]: assigned
Dec 16 13:15:52.894856 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned
Dec 16 13:15:52.894937 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned
Dec 16 13:15:52.895019 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned
Dec 16 13:15:52.895099 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned
Dec 16 13:15:52.895180 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned
Dec 16 13:15:52.895261 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned
Dec 16 13:15:52.895346 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.895431 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.895525 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.895623 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.895720 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.895829 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.895919 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.896008 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.896101 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.896181 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.896256 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.896332 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.896408 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.896492 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.896568 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.896643 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.896718 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.896797 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.896873 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.896985 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.897061 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.897137 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.897222 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.897299 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.897382 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.897470 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.897546 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.897622 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.897698 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space
Dec 16 13:15:52.897773 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign
Dec 16 13:15:52.897886 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 16 13:15:52.897988 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff]
Dec 16 13:15:52.898067 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff]
Dec 16 13:15:52.898150 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Dec 16 13:15:52.898228 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 16 13:15:52.898303 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff]
Dec 16 13:15:52.898379 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff]
Dec 16 13:15:52.898462 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref]
Dec 16 13:15:52.898545 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned
Dec 16 13:15:52.898636 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 16 13:15:52.898714 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff]
Dec 16 13:15:52.898789 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref]
Dec 16 13:15:52.898869 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 16 13:15:52.898946 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff]
Dec 16 13:15:52.899022 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref]
Dec 16
13:15:52.899096 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Dec 16 13:15:52.899172 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Dec 16 13:15:52.899246 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Dec 16 13:15:52.899322 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Dec 16 13:15:52.899397 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Dec 16 13:15:52.899481 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Dec 16 13:15:52.899556 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Dec 16 13:15:52.899645 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Dec 16 13:15:52.899721 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Dec 16 13:15:52.899800 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Dec 16 13:15:52.899876 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Dec 16 13:15:52.899952 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Dec 16 13:15:52.900026 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Dec 16 13:15:52.900100 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Dec 16 13:15:52.900173 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Dec 16 13:15:52.900245 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Dec 16 13:15:52.900316 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Dec 16 13:15:52.900389 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Dec 16 13:15:52.900469 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Dec 16 13:15:52.900542 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Dec 16 13:15:52.900614 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Dec 16 13:15:52.900687 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Dec 16 13:15:52.900760 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Dec 16 13:15:52.900835 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Dec 16 13:15:52.900909 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Dec 16 13:15:52.900980 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Dec 16 13:15:52.901054 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Dec 16 13:15:52.901128 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Dec 16 13:15:52.901200 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Dec 16 13:15:52.901273 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Dec 16 13:15:52.901346 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Dec 16 13:15:52.901419 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Dec 16 13:15:52.901498 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Dec 16 13:15:52.901575 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Dec 16 13:15:52.901647 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Dec 16 13:15:52.901720 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Dec 16 13:15:52.901793 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Dec 16 13:15:52.901865 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Dec 16 13:15:52.901939 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Dec 16 
13:15:52.902013 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Dec 16 13:15:52.902088 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Dec 16 13:15:52.902163 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Dec 16 13:15:52.902236 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Dec 16 13:15:52.902310 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Dec 16 13:15:52.902382 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Dec 16 13:15:52.902462 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Dec 16 13:15:52.902535 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Dec 16 13:15:52.902611 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Dec 16 13:15:52.902687 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Dec 16 13:15:52.902760 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Dec 16 13:15:52.902833 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Dec 16 13:15:52.902906 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Dec 16 13:15:52.902978 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Dec 16 13:15:52.903051 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Dec 16 13:15:52.903124 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Dec 16 13:15:52.903202 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Dec 16 13:15:52.903275 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Dec 16 13:15:52.903348 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Dec 16 13:15:52.903420 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Dec 16 13:15:52.903512 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Dec 16 13:15:52.903602 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff] Dec 16 13:15:52.903677 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Dec 16 13:15:52.903749 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Dec 16 13:15:52.903827 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Dec 16 13:15:52.903899 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Dec 16 13:15:52.903972 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Dec 16 13:15:52.904045 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Dec 16 13:15:52.904119 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Dec 16 13:15:52.904191 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Dec 16 13:15:52.904262 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Dec 16 13:15:52.904331 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Dec 16 13:15:52.904405 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Dec 16 13:15:52.904484 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Dec 16 13:15:52.904554 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Dec 16 13:15:52.904624 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Dec 16 13:15:52.904696 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Dec 16 13:15:52.904767 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Dec 16 13:15:52.904836 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Dec 16 13:15:52.904909 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Dec 16 
13:15:52.904981 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Dec 16 13:15:52.905051 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Dec 16 13:15:52.905121 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Dec 16 13:15:52.905193 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Dec 16 13:15:52.905264 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Dec 16 13:15:52.905334 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Dec 16 13:15:52.905406 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Dec 16 13:15:52.905483 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Dec 16 13:15:52.905554 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Dec 16 13:15:52.905623 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Dec 16 13:15:52.905693 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Dec 16 13:15:52.905762 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Dec 16 13:15:52.905832 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Dec 16 13:15:52.905899 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Dec 16 13:15:52.905962 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Dec 16 13:15:52.906025 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Dec 16 13:15:52.906087 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Dec 16 13:15:52.906148 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Dec 16 13:15:52.906221 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Dec 16 13:15:52.906287 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Dec 16 13:15:52.906355 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:15:52.906427 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Dec 16 13:15:52.906514 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Dec 16 13:15:52.906583 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:15:52.906654 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Dec 16 13:15:52.906721 kernel: pci_bus 0000:03: resource 2 [mem 0x380800000000-0x380fffffffff 64bit pref] Dec 16 13:15:52.906792 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Dec 16 13:15:52.906861 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Dec 16 13:15:52.906932 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Dec 16 13:15:52.906999 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Dec 16 13:15:52.907073 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Dec 16 13:15:52.907139 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Dec 16 13:15:52.907211 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Dec 16 13:15:52.907279 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Dec 16 13:15:52.907349 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Dec 16 13:15:52.907416 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Dec 16 13:15:52.907493 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Dec 16 13:15:52.907560 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Dec 16 13:15:52.907644 kernel: pci_bus 0000:0a: 
resource 1 [mem 0x83000000-0x831fffff] Dec 16 13:15:52.907710 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Dec 16 13:15:52.907782 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Dec 16 13:15:52.907848 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit pref] Dec 16 13:15:52.907917 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Dec 16 13:15:52.907984 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Dec 16 13:15:52.908052 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Dec 16 13:15:52.908119 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Dec 16 13:15:52.908186 kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Dec 16 13:15:52.908250 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Dec 16 13:15:52.908322 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Dec 16 13:15:52.908385 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Dec 16 13:15:52.908458 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Dec 16 13:15:52.908525 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Dec 16 13:15:52.908592 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Dec 16 13:15:52.908656 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Dec 16 13:15:52.908723 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Dec 16 13:15:52.908786 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Dec 16 13:15:52.908848 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Dec 16 13:15:52.908916 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Dec 16 13:15:52.908980 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Dec 16 13:15:52.909042 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Dec 16 13:15:52.909110 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Dec 16 13:15:52.909173 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Dec 16 13:15:52.909235 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Dec 16 13:15:52.909300 kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Dec 16 13:15:52.909367 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Dec 16 13:15:52.909429 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Dec 16 13:15:52.909502 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Dec 16 13:15:52.909566 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] Dec 16 13:15:52.909628 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Dec 16 13:15:52.909696 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Dec 16 13:15:52.909760 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Dec 16 13:15:52.909825 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Dec 16 13:15:52.909897 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Dec 16 13:15:52.909960 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Dec 16 13:15:52.910077 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Dec 16 13:15:52.910149 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Dec 16 13:15:52.910213 kernel: pci_bus 0000:19: resource 1 [mem 
0x81200000-0x813fffff] Dec 16 13:15:52.910275 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Dec 16 13:15:52.910345 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Dec 16 13:15:52.910408 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Dec 16 13:15:52.910480 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Dec 16 13:15:52.910547 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Dec 16 13:15:52.910610 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Dec 16 13:15:52.910673 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Dec 16 13:15:52.910744 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Dec 16 13:15:52.910807 kernel: pci_bus 0000:1c: resource 1 [mem 0x80c00000-0x80dfffff] Dec 16 13:15:52.910870 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Dec 16 13:15:52.910938 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Dec 16 13:15:52.911002 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Dec 16 13:15:52.911065 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit pref] Dec 16 13:15:52.911134 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Dec 16 13:15:52.911200 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Dec 16 13:15:52.911287 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Dec 16 13:15:52.911297 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Dec 16 13:15:52.911305 kernel: PCI: CLS 0 bytes, default 64 Dec 16 13:15:52.911313 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Dec 16 13:15:52.911321 kernel: software IO TLB: mapped [mem 0x0000000077e7e000-0x000000007be7e000] (64MB) Dec 16 13:15:52.911329 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Dec 16 13:15:52.911338 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Dec 16 13:15:52.911348 kernel: Initialise system trusted keyrings Dec 16 13:15:52.911356 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Dec 16 13:15:52.911364 kernel: Key type asymmetric registered Dec 16 13:15:52.911371 kernel: Asymmetric key parser 'x509' registered Dec 16 13:15:52.911379 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Dec 16 13:15:52.911387 kernel: io scheduler mq-deadline registered Dec 16 13:15:52.911395 kernel: io scheduler kyber registered Dec 16 13:15:52.911403 kernel: io scheduler bfq registered Dec 16 13:15:52.911497 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Dec 16 13:15:52.911572 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Dec 16 13:15:52.911655 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Dec 16 13:15:52.911725 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Dec 16 13:15:52.911795 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Dec 16 13:15:52.911864 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Dec 16 13:15:52.911934 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Dec 16 13:15:52.912006 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Dec 16 13:15:52.912076 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Dec 16 13:15:52.912144 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Dec 16 13:15:52.912213 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Dec 16 13:15:52.912281 kernel: pcieport 
0000:00:02.5: AER: enabled with IRQ 29 Dec 16 13:15:52.912350 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Dec 16 13:15:52.912420 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Dec 16 13:15:52.912503 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Dec 16 13:15:52.912573 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Dec 16 13:15:52.912583 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Dec 16 13:15:52.912650 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Dec 16 13:15:52.912717 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Dec 16 13:15:52.912790 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Dec 16 13:15:52.912858 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Dec 16 13:15:52.912942 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Dec 16 13:15:52.913016 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Dec 16 13:15:52.913086 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Dec 16 13:15:52.913156 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Dec 16 13:15:52.913225 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Dec 16 13:15:52.913292 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Dec 16 13:15:52.913361 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Dec 16 13:15:52.913431 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Dec 16 13:15:52.913509 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Dec 16 13:15:52.913580 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Dec 16 13:15:52.913647 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Dec 16 13:15:52.913715 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39 Dec 16 13:15:52.913725 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Dec 16 13:15:52.913791 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Dec 16 13:15:52.913860 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Dec 16 13:15:52.913928 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Dec 16 13:15:52.913996 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Dec 16 13:15:52.914069 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Dec 16 13:15:52.914136 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Dec 16 13:15:52.914207 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Dec 16 13:15:52.914276 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Dec 16 13:15:52.914344 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Dec 16 13:15:52.914412 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Dec 16 13:15:52.914487 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Dec 16 13:15:52.914555 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Dec 16 13:15:52.914623 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Dec 16 13:15:52.914693 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Dec 16 13:15:52.914762 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Dec 16 13:15:52.914829 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Dec 16 13:15:52.914839 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Dec 16 13:15:52.914906 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Dec 16 13:15:52.914974 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 48 Dec 16 13:15:52.915042 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Dec 16 13:15:52.915109 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Dec 16 13:15:52.915180 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Dec 16 13:15:52.915247 kernel: 
pcieport 0000:00:05.2: AER: enabled with IRQ 50 Dec 16 13:15:52.915315 kernel: pcieport 0000:00:05.3: PME: Signaling with IRQ 51 Dec 16 13:15:52.915383 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Dec 16 13:15:52.915476 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Dec 16 13:15:52.915558 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Dec 16 13:15:52.915569 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Dec 16 13:15:52.915578 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 16 13:15:52.915589 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Dec 16 13:15:52.915608 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Dec 16 13:15:52.915616 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Dec 16 13:15:52.915624 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Dec 16 13:15:52.915697 kernel: rtc_cmos 00:03: RTC can wake from S4 Dec 16 13:15:52.915709 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Dec 16 13:15:52.915770 kernel: rtc_cmos 00:03: registered as rtc0 Dec 16 13:15:52.915833 kernel: rtc_cmos 00:03: setting system clock to 2025-12-16T13:15:52 UTC (1765890952) Dec 16 13:15:52.915898 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Dec 16 13:15:52.915907 kernel: intel_pstate: CPU model not supported Dec 16 13:15:52.915915 kernel: efifb: probing for efifb Dec 16 13:15:52.915923 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Dec 16 13:15:52.915931 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Dec 16 13:15:52.915939 kernel: efifb: scrolling: redraw Dec 16 13:15:52.915946 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Dec 16 13:15:52.915954 kernel: Console: switching to colour frame buffer device 160x50 Dec 16 13:15:52.915962 kernel: fb0: EFI VGA frame buffer device Dec 16 13:15:52.915972 kernel: pstore: Using crash dump compression: deflate Dec 16 13:15:52.915980 kernel: pstore: Registered efi_pstore as persistent store backend Dec 16 13:15:52.915988 kernel: NET: Registered PF_INET6 protocol family Dec 16 13:15:52.915995 kernel: Segment Routing with IPv6 Dec 16 13:15:52.916003 kernel: In-situ OAM (IOAM) with IPv6 Dec 16 13:15:52.916011 kernel: NET: Registered PF_PACKET protocol family Dec 16 13:15:52.916018 kernel: Key type dns_resolver registered Dec 16 13:15:52.916026 kernel: IPI shorthand broadcast: enabled Dec 16 13:15:52.916034 kernel: sched_clock: Marking stable (4040004316, 163947109)->(4445068786, -241117361) Dec 16 13:15:52.916044 kernel: registered taskstats version 1 Dec 16 13:15:52.916053 kernel: Loading compiled-in X.509 certificates Dec 16 13:15:52.916060 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 0d0c78e6590cb40d27f1cef749ef9f2f3425f38d' Dec 16 13:15:52.916068 kernel: Demotion targets for Node 0: null Dec 16 13:15:52.916076 kernel: Key type .fscrypt registered Dec 16 13:15:52.916083 kernel: Key type fscrypt-provisioning registered Dec 16 13:15:52.916092 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 16 13:15:52.916099 kernel: ima: Allocated hash algorithm: sha1 Dec 16 13:15:52.916107 kernel: ima: No architecture policies found Dec 16 13:15:52.916116 kernel: clk: Disabling unused clocks Dec 16 13:15:52.916124 kernel: Warning: unable to open an initial console. 
Dec 16 13:15:52.916132 kernel: Freeing unused kernel image (initmem) memory: 46188K Dec 16 13:15:52.916140 kernel: Write protecting the kernel read-only data: 40960k Dec 16 13:15:52.916148 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Dec 16 13:15:52.916155 kernel: Run /init as init process Dec 16 13:15:52.916163 kernel: with arguments: Dec 16 13:15:52.916171 kernel: /init Dec 16 13:15:52.916179 kernel: with environment: Dec 16 13:15:52.916186 kernel: HOME=/ Dec 16 13:15:52.916195 kernel: TERM=linux Dec 16 13:15:52.916204 systemd[1]: Successfully made /usr/ read-only. Dec 16 13:15:52.916216 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 13:15:52.916225 systemd[1]: Detected virtualization kvm. Dec 16 13:15:52.916233 systemd[1]: Detected architecture x86-64. Dec 16 13:15:52.916241 systemd[1]: Running in initrd. Dec 16 13:15:52.916248 systemd[1]: No hostname configured, using default hostname. Dec 16 13:15:52.916258 systemd[1]: Hostname set to <localhost>. Dec 16 13:15:52.916267 systemd[1]: Initializing machine ID from VM UUID. Dec 16 13:15:52.916284 systemd[1]: Queued start job for default target initrd.target. Dec 16 13:15:52.916294 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 13:15:52.916303 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 13:15:52.916311 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 16 13:15:52.916320 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 13:15:52.916328 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 16 13:15:52.916337 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 16 13:15:52.916348 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Dec 16 13:15:52.916356 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Dec 16 13:15:52.916364 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 13:15:52.916373 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 13:15:52.916381 systemd[1]: Reached target paths.target - Path Units. Dec 16 13:15:52.916389 systemd[1]: Reached target slices.target - Slice Units. Dec 16 13:15:52.916397 systemd[1]: Reached target swap.target - Swaps. Dec 16 13:15:52.916406 systemd[1]: Reached target timers.target - Timer Units. Dec 16 13:15:52.916415 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 13:15:52.916424 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 13:15:52.916432 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 16 13:15:52.916441 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Dec 16 13:15:52.916459 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Dec 16 13:15:52.916467 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 13:15:52.916476 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 13:15:52.916484 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 13:15:52.916492 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 16 13:15:52.916503 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 13:15:52.916511 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 16 13:15:52.916520 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Dec 16 13:15:52.916528 systemd[1]: Starting systemd-fsck-usr.service... Dec 16 13:15:52.916536 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 13:15:52.916544 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 13:15:52.916552 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:15:52.916561 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 13:15:52.916571 systemd[1]: Finished systemd-fsck-usr.service. Dec 16 13:15:52.916580 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 13:15:52.916590 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Dec 16 13:15:52.916598 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 13:15:52.916607 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 13:15:52.916639 systemd-journald[276]: Collecting audit messages is disabled. Dec 16 13:15:52.916661 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:15:52.916671 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 16 13:15:52.916680 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 13:15:52.916688 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 16 13:15:52.916697 kernel: Bridge firewalling registered Dec 16 13:15:52.916705 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 13:15:52.916713 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 13:15:52.916721 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 13:15:52.916730 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Dec 16 13:15:52.916741 systemd-journald[276]: Journal started Dec 16 13:15:52.916761 systemd-journald[276]: Runtime Journal (/run/log/journal/9f950044b44b49b0887ecb5f731386bb) is 8M, max 319.5M, 311.5M free. Dec 16 13:15:52.857584 systemd-modules-load[277]: Inserted module 'overlay' Dec 16 13:15:52.929056 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 13:15:52.897376 systemd-modules-load[277]: Inserted module 'br_netfilter' Dec 16 13:15:52.930587 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 13:15:52.934679 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Dec 16 13:15:52.937919 systemd-tmpfiles[313]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Dec 16 13:15:52.941097 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 13:15:52.942068 dracut-cmdline[309]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022 Dec 16 13:15:52.942975 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 13:15:52.978270 systemd-resolved[338]: Positive Trust Anchors: Dec 16 13:15:52.978285 systemd-resolved[338]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 13:15:52.978322 systemd-resolved[338]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 13:15:52.980790 systemd-resolved[338]: Defaulting to hostname 'linux'. Dec 16 13:15:52.981651 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 13:15:52.982216 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 13:15:53.039507 kernel: SCSI subsystem initialized Dec 16 13:15:53.051486 kernel: Loading iSCSI transport class v2.0-870. Dec 16 13:15:53.063497 kernel: iscsi: registered transport (tcp) Dec 16 13:15:53.113011 kernel: iscsi: registered transport (qla4xxx) Dec 16 13:15:53.113109 kernel: QLogic iSCSI HBA Driver Dec 16 13:15:53.132963 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 13:15:53.152758 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 13:15:53.155722 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 13:15:53.207945 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Dec 16 13:15:53.209724 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Dec 16 13:15:53.275501 kernel: raid6: avx512x4 gen() 34003 MB/s Dec 16 13:15:53.293486 kernel: raid6: avx512x2 gen() 40434 MB/s Dec 16 13:15:53.310484 kernel: raid6: avx512x1 gen() 44077 MB/s Dec 16 13:15:53.328495 kernel: raid6: avx2x4 gen() 33723 MB/s Dec 16 13:15:53.345493 kernel: raid6: avx2x2 gen() 32147 MB/s Dec 16 13:15:53.363615 kernel: raid6: avx2x1 gen() 26358 MB/s Dec 16 13:15:53.363687 kernel: raid6: using algorithm avx512x1 gen() 44077 MB/s Dec 16 13:15:53.382560 kernel: raid6: .... 
xor() 24730 MB/s, rmw enabled Dec 16 13:15:53.382628 kernel: raid6: using avx512x2 recovery algorithm Dec 16 13:15:53.403475 kernel: xor: automatically using best checksumming function avx Dec 16 13:15:53.548497 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 16 13:15:53.560400 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 16 13:15:53.564686 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 13:15:53.598936 systemd-udevd[536]: Using default interface naming scheme 'v255'. Dec 16 13:15:53.606146 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 13:15:53.608706 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Dec 16 13:15:53.634957 dracut-pre-trigger[544]: rd.md=0: removing MD RAID activation Dec 16 13:15:53.675851 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 13:15:53.680748 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 13:15:53.802202 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 13:15:53.807207 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 16 13:15:53.864482 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues Dec 16 13:15:53.893470 kernel: cryptd: max_cpu_qlen set to 1000 Dec 16 13:15:53.898460 kernel: ACPI: bus type USB registered Dec 16 13:15:53.898511 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Dec 16 13:15:53.916577 kernel: usbcore: registered new interface driver usbfs Dec 16 13:15:53.916602 kernel: usbcore: registered new interface driver hub Dec 16 13:15:53.916615 kernel: usbcore: registered new device driver usb Dec 16 13:15:53.917458 kernel: libata version 3.00 loaded. Dec 16 13:15:53.929431 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Dec 16 13:15:53.929507 kernel: GPT:17805311 != 104857599 Dec 16 13:15:53.929521 kernel: GPT:Alternate GPT header not at the end of the disk. Dec 16 13:15:53.930123 kernel: GPT:17805311 != 104857599 Dec 16 13:15:53.931476 kernel: GPT: Use GNU Parted to correct GPT errors. Dec 16 13:15:53.932004 kernel: ahci 0000:00:1f.2: version 3.0 Dec 16 13:15:53.932183 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 13:15:53.932459 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Dec 16 13:15:53.933107 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 13:15:53.933278 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Dec 16 13:15:53.964685 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Dec 16 13:15:53.964847 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Dec 16 13:15:53.964941 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Dec 16 13:15:53.965027 kernel: scsi host0: ahci Dec 16 13:15:53.965132 kernel: scsi host1: ahci Dec 16 13:15:53.965214 kernel: scsi host2: ahci Dec 16 13:15:53.965293 kernel: scsi host3: ahci Dec 16 13:15:53.965371 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller Dec 16 13:15:53.965478 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1 Dec 16 13:15:53.965566 kernel: uhci_hcd 0000:02:01.0: detected 2 ports Dec 16 13:15:53.965652 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000 Dec 16 13:15:53.965740 kernel: scsi host4: ahci Dec 16 13:15:53.965873 kernel: AES CTR mode by8 optimization enabled Dec 16 13:15:53.965884 kernel: hub 1-0:1.0: USB hub found Dec 16 13:15:53.965992 kernel: scsi host5: ahci Dec 16 13:15:53.966078 kernel: hub 1-0:1.0: 2 ports detected Dec 16 13:15:53.966172 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 67 lpm-pol 1 Dec 16 13:15:53.966185 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 67 lpm-pol 1 Dec 16 13:15:53.966195 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 67 lpm-pol 1 Dec 16 13:15:53.966204 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 67 lpm-pol 1 Dec 16 13:15:53.966214 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 67 lpm-pol 1 Dec 16 13:15:53.966223 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 67 lpm-pol 1 Dec 16 13:15:53.966233 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Dec 16 13:15:53.939921 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:15:53.965857 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:15:53.966981 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 16 13:15:53.983948 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 13:15:53.984038 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:15:53.985514 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:15:54.003156 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:15:54.023222 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Dec 16 13:15:54.035243 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Dec 16 13:15:54.042611 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 16 13:15:54.049378 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Dec 16 13:15:54.049798 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Dec 16 13:15:54.051240 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 16 13:15:54.088069 disk-uuid[757]: Primary Header is updated. Dec 16 13:15:54.088069 disk-uuid[757]: Secondary Entries is updated. Dec 16 13:15:54.088069 disk-uuid[757]: Secondary Header is updated. 
Dec 16 13:15:54.098487 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 13:15:54.167474 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Dec 16 13:15:54.269501 kernel: ata2: SATA link down (SStatus 0 SControl 300) Dec 16 13:15:54.275488 kernel: ata6: SATA link down (SStatus 0 SControl 300) Dec 16 13:15:54.278487 kernel: ata3: SATA link down (SStatus 0 SControl 300) Dec 16 13:15:54.278514 kernel: ata5: SATA link down (SStatus 0 SControl 300) Dec 16 13:15:54.280489 kernel: ata1: SATA link down (SStatus 0 SControl 300) Dec 16 13:15:54.283504 kernel: ata4: SATA link down (SStatus 0 SControl 300) Dec 16 13:15:54.301502 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 16 13:15:54.303316 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 13:15:54.304489 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 13:15:54.305144 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 13:15:54.307456 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 16 13:15:54.354199 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 16 13:15:54.362485 kernel: hid: raw HID events driver (C) Jiri Kosina Dec 16 13:15:54.377305 kernel: usbcore: registered new interface driver usbhid Dec 16 13:15:54.377386 kernel: usbhid: USB HID core driver Dec 16 13:15:54.387959 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Dec 16 13:15:54.388011 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0 Dec 16 13:15:55.112508 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 13:15:55.112599 disk-uuid[758]: The operation has completed successfully. Dec 16 13:15:55.176721 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 16 13:15:55.176850 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 16 13:15:55.214533 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Dec 16 13:15:55.250824 sh[784]: Success Dec 16 13:15:55.289183 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 16 13:15:55.289295 kernel: device-mapper: uevent: version 1.0.3 Dec 16 13:15:55.291846 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 16 13:15:55.312522 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Dec 16 13:15:55.458298 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Dec 16 13:15:55.464579 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Dec 16 13:15:55.498035 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Dec 16 13:15:55.533581 kernel: BTRFS: device fsid a6ae7f96-a076-4d3c-81ed-46dd341492f8 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (796) Dec 16 13:15:55.541372 kernel: BTRFS info (device dm-0): first mount of filesystem a6ae7f96-a076-4d3c-81ed-46dd341492f8 Dec 16 13:15:55.541498 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Dec 16 13:15:55.577539 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 16 13:15:55.577608 kernel: BTRFS info (device dm-0): enabling free space tree Dec 16 13:15:55.592437 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. 
Dec 16 13:15:55.594526 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 16 13:15:55.596081 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 16 13:15:55.597900 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 16 13:15:55.602178 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Dec 16 13:15:55.688499 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (831) Dec 16 13:15:55.697889 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:15:55.697997 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 16 13:15:55.709745 kernel: BTRFS info (device vda6): turning on async discard Dec 16 13:15:55.709853 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 13:15:55.720548 kernel: BTRFS info (device vda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:15:55.722376 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 16 13:15:55.724607 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Dec 16 13:15:55.803148 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 13:15:55.806279 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 13:15:55.860028 systemd-networkd[967]: lo: Link UP Dec 16 13:15:55.860037 systemd-networkd[967]: lo: Gained carrier Dec 16 13:15:55.862793 systemd-networkd[967]: Enumeration completed Dec 16 13:15:55.863026 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 13:15:55.863111 systemd-networkd[967]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:15:55.863116 systemd-networkd[967]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 13:15:55.863482 systemd-networkd[967]: eth0: Link UP Dec 16 13:15:55.864179 systemd-networkd[967]: eth0: Gained carrier Dec 16 13:15:55.864191 systemd-networkd[967]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:15:55.865645 systemd[1]: Reached target network.target - Network. Dec 16 13:15:55.901576 systemd-networkd[967]: eth0: DHCPv4 address 10.0.23.154/25, gateway 10.0.23.129 acquired from 10.0.23.129 Dec 16 13:15:55.922406 ignition[906]: Ignition 2.22.0 Dec 16 13:15:55.922436 ignition[906]: Stage: fetch-offline Dec 16 13:15:55.922547 ignition[906]: no configs at "/usr/lib/ignition/base.d" Dec 16 13:15:55.924803 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 13:15:55.922568 ignition[906]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:15:55.927540 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Dec 16 13:15:55.922759 ignition[906]: parsed url from cmdline: "" Dec 16 13:15:55.922768 ignition[906]: no config URL provided Dec 16 13:15:55.922779 ignition[906]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 13:15:55.922795 ignition[906]: no config at "/usr/lib/ignition/user.ign" Dec 16 13:15:55.922806 ignition[906]: failed to fetch config: resource requires networking Dec 16 13:15:55.923159 ignition[906]: Ignition finished successfully Dec 16 13:15:55.967046 ignition[991]: Ignition 2.22.0 Dec 16 13:15:55.967061 ignition[991]: Stage: fetch Dec 16 13:15:55.967214 ignition[991]: no configs at "/usr/lib/ignition/base.d" Dec 16 13:15:55.967225 ignition[991]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:15:55.967319 ignition[991]: parsed url from cmdline: "" Dec 16 13:15:55.967323 ignition[991]: no config URL provided Dec 16 13:15:55.967337 ignition[991]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 13:15:55.967345 ignition[991]: no config at "/usr/lib/ignition/user.ign" Dec 16 13:15:55.967474 ignition[991]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Dec 16 13:15:55.967853 ignition[991]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Dec 16 13:15:55.967951 ignition[991]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Dec 16 13:15:56.295087 ignition[991]: GET result: OK Dec 16 13:15:56.295280 ignition[991]: parsing config with SHA512: dc15c0d94d52d4502d1fed89bfaa28e43864e77fe561eb97eb89b86485edd9e018188352f3ed8609f363c131102f53c3a3f1609c2f90ef334214aab637abab19 Dec 16 13:15:56.305035 unknown[991]: fetched base config from "system" Dec 16 13:15:56.305052 unknown[991]: fetched base config from "system" Dec 16 13:15:56.305683 ignition[991]: fetch: fetch complete Dec 16 13:15:56.305061 unknown[991]: fetched user config from "openstack" Dec 16 13:15:56.305705 ignition[991]: fetch: fetch passed Dec 16 13:15:56.305766 ignition[991]: Ignition finished successfully Dec 16 13:15:56.311830 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 16 13:15:56.317265 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 16 13:15:56.389618 ignition[1002]: Ignition 2.22.0 Dec 16 13:15:56.389642 ignition[1002]: Stage: kargs Dec 16 13:15:56.389947 ignition[1002]: no configs at "/usr/lib/ignition/base.d" Dec 16 13:15:56.389966 ignition[1002]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:15:56.393314 ignition[1002]: kargs: kargs passed Dec 16 13:15:56.393629 ignition[1002]: Ignition finished successfully Dec 16 13:15:56.396732 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Dec 16 13:15:56.402293 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 16 13:15:56.476241 ignition[1014]: Ignition 2.22.0 Dec 16 13:15:56.476267 ignition[1014]: Stage: disks Dec 16 13:15:56.476561 ignition[1014]: no configs at "/usr/lib/ignition/base.d" Dec 16 13:15:56.476579 ignition[1014]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:15:56.480384 ignition[1014]: disks: disks passed Dec 16 13:15:56.480552 ignition[1014]: Ignition finished successfully Dec 16 13:15:56.483616 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 16 13:15:56.485814 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 16 13:15:56.486911 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. 
Dec 16 13:15:56.488914 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 13:15:56.490793 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 13:15:56.492967 systemd[1]: Reached target basic.target - Basic System. Dec 16 13:15:56.497390 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Dec 16 13:15:56.576140 systemd-fsck[1027]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Dec 16 13:15:56.580626 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 16 13:15:56.582795 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 16 13:15:56.804489 kernel: EXT4-fs (vda9): mounted filesystem e48ca59c-1206-4abd-b121-5e9b35e49852 r/w with ordered data mode. Quota mode: none. Dec 16 13:15:56.805643 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 16 13:15:56.806665 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 16 13:15:56.811997 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 13:15:56.814756 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 16 13:15:56.815670 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Dec 16 13:15:56.836195 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Dec 16 13:15:56.838082 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 16 13:15:56.838183 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 13:15:56.842040 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 16 13:15:56.844884 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Dec 16 13:15:56.860487 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1035) Dec 16 13:15:56.864669 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:15:56.864725 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 16 13:15:56.877935 kernel: BTRFS info (device vda6): turning on async discard Dec 16 13:15:56.878060 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 13:15:56.881894 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 16 13:15:56.931508 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:15:56.952170 initrd-setup-root[1067]: cut: /sysroot/etc/passwd: No such file or directory Dec 16 13:15:56.958834 initrd-setup-root[1074]: cut: /sysroot/etc/group: No such file or directory Dec 16 13:15:56.963318 initrd-setup-root[1081]: cut: /sysroot/etc/shadow: No such file or directory Dec 16 13:15:56.967352 initrd-setup-root[1088]: cut: /sysroot/etc/gshadow: No such file or directory Dec 16 13:15:57.138718 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 16 13:15:57.141883 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 16 13:15:57.144421 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 16 13:15:57.182795 systemd[1]: sysroot-oem.mount: Deactivated successfully. Dec 16 13:15:57.186899 kernel: BTRFS info (device vda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:15:57.228687 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
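The cut(1) errors from initrd-setup-root above are expected on a first boot: the script extracts account fields out of /sysroot/etc/passwd, group, shadow and gshadow, and those files do not exist yet. A hedged Python equivalent of that extraction (the field-1 interpretation is an assumption about what the cut invocation selects):

    # Sketch: pull login names (field 1 of the colon-separated passwd format);
    # on first boot the file is absent, matching 'No such file or directory'.
    def login_names(passwd_path: str = "/sysroot/etc/passwd") -> list[str]:
        try:
            with open(passwd_path) as f:
                return [line.split(":", 1)[0] for line in f if line.strip()]
        except FileNotFoundError:
            return []  # first boot: nothing to extract yet

    print(login_names())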
Dec 16 13:15:57.248677 ignition[1156]: INFO : Ignition 2.22.0 Dec 16 13:15:57.248677 ignition[1156]: INFO : Stage: mount Dec 16 13:15:57.252216 ignition[1156]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 13:15:57.252216 ignition[1156]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:15:57.252216 ignition[1156]: INFO : mount: mount passed Dec 16 13:15:57.252216 ignition[1156]: INFO : Ignition finished successfully Dec 16 13:15:57.253182 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 16 13:15:57.274686 systemd-networkd[967]: eth0: Gained IPv6LL Dec 16 13:15:57.996569 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:16:00.010541 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:16:04.042569 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:16:04.058070 coreos-metadata[1037]: Dec 16 13:16:04.057 WARN failed to locate config-drive, using the metadata service API instead Dec 16 13:16:04.077350 coreos-metadata[1037]: Dec 16 13:16:04.077 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 16 13:16:04.236218 coreos-metadata[1037]: Dec 16 13:16:04.236 INFO Fetch successful Dec 16 13:16:04.236938 coreos-metadata[1037]: Dec 16 13:16:04.236 INFO wrote hostname ci-4459-2-2-0-839c7337fa to /sysroot/etc/hostname Dec 16 13:16:04.238971 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Dec 16 13:16:04.239259 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Dec 16 13:16:04.242822 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 16 13:16:04.288474 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 13:16:04.322479 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1177) Dec 16 13:16:04.326570 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:16:04.326637 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 16 13:16:04.338828 kernel: BTRFS info (device vda6): turning on async discard Dec 16 13:16:04.338902 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 13:16:04.343344 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
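coreos-metadata above waits for a config drive labeled config-2, gives up with a WARN, and falls back to the EC2-style metadata API to fetch the hostname it then writes into /sysroot/etc/hostname. A simplified Python model of that fallback path (the device path and URL are from the log; the polling interval and timeout are assumptions, and the config-drive branch itself is not modeled):

    import os
    import time
    import urllib.request

    CONFIG_DRIVE = "/dev/disk/by-label/config-2"
    HOSTNAME_URL = "http://169.254.169.254/latest/meta-data/hostname"

    def config_drive_present(timeout_s: float = 8.0) -> bool:
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            if os.path.exists(CONFIG_DRIVE):
                return True
            time.sleep(2.0)  # 'config drive ... not found. Waiting...'
        return False

    if __name__ == "__main__":
        if not config_drive_present():
            print("WARN failed to locate config-drive, using the metadata service API instead")
            with urllib.request.urlopen(HOSTNAME_URL, timeout=10) as resp:
                hostname = resp.read().decode().strip()
            # 'wrote hostname ... to /sysroot/etc/hostname' in the log
            with open("/sysroot/etc/hostname", "w") as f:
                f.write(hostname + "\n")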
Dec 16 13:16:04.392801 ignition[1195]: INFO : Ignition 2.22.0 Dec 16 13:16:04.392801 ignition[1195]: INFO : Stage: files Dec 16 13:16:04.394462 ignition[1195]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 13:16:04.394462 ignition[1195]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:16:04.394462 ignition[1195]: DEBUG : files: compiled without relabeling support, skipping Dec 16 13:16:04.396102 ignition[1195]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 16 13:16:04.396102 ignition[1195]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 16 13:16:04.403140 ignition[1195]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 16 13:16:04.403756 ignition[1195]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 16 13:16:04.404514 unknown[1195]: wrote ssh authorized keys file for user: core Dec 16 13:16:04.405229 ignition[1195]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 16 13:16:04.413054 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Dec 16 13:16:04.415389 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Dec 16 13:16:04.483662 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 16 13:16:04.708308 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Dec 16 13:16:04.709712 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 16 13:16:04.709712 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 16 13:16:04.709712 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 16 13:16:04.711435 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 16 13:16:04.711435 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 13:16:04.711435 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 13:16:04.711435 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 13:16:04.711435 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 13:16:04.714805 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 13:16:04.715215 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 13:16:04.715215 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Dec 16 13:16:04.716854 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Dec 16 13:16:04.716854 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Dec 16 13:16:04.718165 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Dec 16 13:16:05.018053 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 16 13:16:05.615978 ignition[1195]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Dec 16 13:16:05.615978 ignition[1195]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 16 13:16:05.619230 ignition[1195]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 13:16:05.625023 ignition[1195]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 13:16:05.625023 ignition[1195]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 16 13:16:05.626251 ignition[1195]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Dec 16 13:16:05.626251 ignition[1195]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Dec 16 13:16:05.626251 ignition[1195]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 16 13:16:05.626251 ignition[1195]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 16 13:16:05.626251 ignition[1195]: INFO : files: files passed Dec 16 13:16:05.626251 ignition[1195]: INFO : Ignition finished successfully Dec 16 13:16:05.628659 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 16 13:16:05.631282 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 16 13:16:05.633789 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 16 13:16:05.660509 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 16 13:16:05.660760 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 16 13:16:05.674500 initrd-setup-root-after-ignition[1231]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 13:16:05.674500 initrd-setup-root-after-ignition[1231]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 16 13:16:05.677867 initrd-setup-root-after-ignition[1235]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 13:16:05.676987 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 13:16:05.679800 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 16 13:16:05.682901 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 16 13:16:05.772279 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 16 13:16:05.772470 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
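The files stage above is driven by the config fetched earlier; the op log shows the operations it encodes (remote downloads, local files, a sysext symlink, a unit install plus preset). The actual source config is not part of this log, but its shape can be reconstructed, here as a Python dict mirroring Ignition's JSON structure (the version, contents and unit text are assumptions):

    # Hypothetical reconstruction of the config shape; not the real user_data.
    ignition_config_sketch = {
        "ignition": {"version": "3.4.0"},
        "storage": {
            "files": [
                {"path": "/opt/helm-v3.17.0-linux-amd64.tar.gz",
                 "contents": {"source": "https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz"}},
                {"path": "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw",
                 "contents": {"source": "https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw"}},
                # plus the inline files logged above: /home/core/install.sh,
                # nginx.yaml, nfs-pod.yaml, nfs-pvc.yaml, /etc/flatcar/update.conf
            ],
            "links": [
                {"path": "/etc/extensions/kubernetes.raw",
                 "target": "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"},
            ],
        },
        "systemd": {
            # op(b)..op(d): write the unit, then set its preset to enabled
            "units": [{"name": "prepare-helm.service", "enabled": True,
                       "contents": "..."}],  # unit text elided; not in the log
        },
    }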
Dec 16 13:16:05.774985 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 16 13:16:05.776991 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 16 13:16:05.779110 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 16 13:16:05.780212 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 16 13:16:05.804030 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 13:16:05.806731 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 16 13:16:05.841186 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 16 13:16:05.842248 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 13:16:05.844459 systemd[1]: Stopped target timers.target - Timer Units. Dec 16 13:16:05.846483 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 16 13:16:05.846653 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 13:16:05.849592 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 16 13:16:05.851629 systemd[1]: Stopped target basic.target - Basic System. Dec 16 13:16:05.853441 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 16 13:16:05.855303 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 13:16:05.857226 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 16 13:16:05.859105 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 16 13:16:05.861100 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 16 13:16:05.863053 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 13:16:05.865009 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 16 13:16:05.867198 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 16 13:16:05.869466 systemd[1]: Stopped target swap.target - Swaps. Dec 16 13:16:05.871679 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 16 13:16:05.871903 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 16 13:16:05.875020 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 16 13:16:05.877159 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 13:16:05.878886 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 16 13:16:05.879101 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 13:16:05.880786 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 16 13:16:05.880984 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 16 13:16:05.883985 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 16 13:16:05.884192 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 13:16:05.886150 systemd[1]: ignition-files.service: Deactivated successfully. Dec 16 13:16:05.886333 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 16 13:16:05.889644 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 16 13:16:05.892296 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 16 13:16:05.894102 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
Dec 16 13:16:05.894326 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 13:16:05.895964 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 16 13:16:05.896107 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 13:16:05.902367 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 16 13:16:05.917716 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 16 13:16:05.942288 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 16 13:16:05.957394 ignition[1257]: INFO : Ignition 2.22.0 Dec 16 13:16:05.957394 ignition[1257]: INFO : Stage: umount Dec 16 13:16:05.958845 ignition[1257]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 13:16:05.958845 ignition[1257]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:16:05.959741 ignition[1257]: INFO : umount: umount passed Dec 16 13:16:05.959741 ignition[1257]: INFO : Ignition finished successfully Dec 16 13:16:05.961087 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 16 13:16:05.961226 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 16 13:16:05.962621 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 16 13:16:05.962733 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 16 13:16:05.963705 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 16 13:16:05.963758 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 16 13:16:05.964555 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 16 13:16:05.964602 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 16 13:16:05.965460 systemd[1]: Stopped target network.target - Network. Dec 16 13:16:05.966410 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 16 13:16:05.966506 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 13:16:05.967397 systemd[1]: Stopped target paths.target - Path Units. Dec 16 13:16:05.968204 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 16 13:16:05.972532 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 13:16:05.973008 systemd[1]: Stopped target slices.target - Slice Units. Dec 16 13:16:05.973817 systemd[1]: Stopped target sockets.target - Socket Units. Dec 16 13:16:05.974732 systemd[1]: iscsid.socket: Deactivated successfully. Dec 16 13:16:05.974775 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 13:16:05.975697 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 16 13:16:05.975732 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 13:16:05.976516 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 16 13:16:05.976570 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 16 13:16:05.977372 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 16 13:16:05.977408 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 16 13:16:05.978326 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Dec 16 13:16:05.979269 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 16 13:16:05.981230 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 16 13:16:05.981362 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. 
Dec 16 13:16:05.982469 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 16 13:16:05.982524 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 16 13:16:05.988056 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 16 13:16:05.988205 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 16 13:16:05.991613 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Dec 16 13:16:05.991946 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 16 13:16:05.991992 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 13:16:05.994307 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Dec 16 13:16:06.001575 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 16 13:16:06.001706 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 16 13:16:06.003956 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Dec 16 13:16:06.004141 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 16 13:16:06.005070 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 16 13:16:06.005117 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 16 13:16:06.006896 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 16 13:16:06.007582 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 16 13:16:06.007637 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 13:16:06.008665 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 16 13:16:06.008717 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 16 13:16:06.010600 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 16 13:16:06.010644 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 16 13:16:06.011528 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 13:16:06.013264 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 16 13:16:06.041217 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 16 13:16:06.041514 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 13:16:06.044196 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 16 13:16:06.044286 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 16 13:16:06.045695 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 16 13:16:06.045751 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 13:16:06.046587 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 16 13:16:06.046637 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 16 13:16:06.047941 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 16 13:16:06.047983 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 16 13:16:06.049304 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 16 13:16:06.049349 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 13:16:06.051496 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... 
Dec 16 13:16:06.052508 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 16 13:16:06.052579 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 13:16:06.053704 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 16 13:16:06.053760 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 13:16:06.054739 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Dec 16 13:16:06.054789 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 13:16:06.056058 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Dec 16 13:16:06.056113 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 13:16:06.057065 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 13:16:06.057116 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:16:06.059151 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 16 13:16:06.059277 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 16 13:16:06.064175 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 16 13:16:06.064323 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 16 13:16:06.065691 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 16 13:16:06.067738 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 16 13:16:06.085323 systemd[1]: Switching root. Dec 16 13:16:06.131584 systemd-journald[276]: Journal stopped Dec 16 13:16:07.484102 systemd-journald[276]: Received SIGTERM from PID 1 (systemd). Dec 16 13:16:07.484182 kernel: SELinux: policy capability network_peer_controls=1 Dec 16 13:16:07.484209 kernel: SELinux: policy capability open_perms=1 Dec 16 13:16:07.484224 kernel: SELinux: policy capability extended_socket_class=1 Dec 16 13:16:07.484234 kernel: SELinux: policy capability always_check_network=0 Dec 16 13:16:07.484253 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 16 13:16:07.484264 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 16 13:16:07.484274 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 16 13:16:07.484288 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 16 13:16:07.484298 kernel: SELinux: policy capability userspace_initial_context=0 Dec 16 13:16:07.484308 kernel: audit: type=1403 audit(1765890966.330:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Dec 16 13:16:07.484324 systemd[1]: Successfully loaded SELinux policy in 93.558ms. Dec 16 13:16:07.484345 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.123ms. Dec 16 13:16:07.484356 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 13:16:07.484369 systemd[1]: Detected virtualization kvm. Dec 16 13:16:07.484379 systemd[1]: Detected architecture x86-64. Dec 16 13:16:07.484389 systemd[1]: Detected first boot. Dec 16 13:16:07.484402 systemd[1]: Hostname set to <ci-4459-2-2-0-839c7337fa>. Dec 16 13:16:07.484413 systemd[1]: Initializing machine ID from VM UUID.
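On this first boot systemd derives the machine ID from the VM's UUID rather than generating a random one. A sketch of that derivation under the assumption that the UUID comes from the DMI product_uuid exposed by the hypervisor (the exact source systemd consults can differ by platform; this is not systemd's code):

    def machine_id_from_vm_uuid(path: str = "/sys/class/dmi/id/product_uuid") -> str:
        # Normalize a dashed UUID into the 32-hex machine-id form; the path
        # and normalization are illustrative assumptions.
        with open(path) as f:
            uuid = f.read().strip()
        machine_id = uuid.replace("-", "").lower()
        if len(machine_id) != 32:
            raise ValueError(f"unexpected UUID format: {uuid!r}")
        return machine_id

    print(machine_id_from_vm_uuid())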
Dec 16 13:16:07.484426 zram_generator::config[1307]: No configuration found. Dec 16 13:16:07.484438 kernel: Guest personality initialized and is inactive Dec 16 13:16:07.484458 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Dec 16 13:16:07.484469 kernel: Initialized host personality Dec 16 13:16:07.484479 kernel: NET: Registered PF_VSOCK protocol family Dec 16 13:16:07.484488 systemd[1]: Populated /etc with preset unit settings. Dec 16 13:16:07.484499 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Dec 16 13:16:07.484509 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 16 13:16:07.484520 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 16 13:16:07.484531 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 16 13:16:07.484541 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 16 13:16:07.484554 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 16 13:16:07.484564 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 16 13:16:07.484575 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 16 13:16:07.484585 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 16 13:16:07.484596 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Dec 16 13:16:07.484606 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 16 13:16:07.484617 systemd[1]: Created slice user.slice - User and Session Slice. Dec 16 13:16:07.484627 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 13:16:07.484638 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 13:16:07.484650 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 16 13:16:07.484661 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 16 13:16:07.484671 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 16 13:16:07.484682 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 13:16:07.484693 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Dec 16 13:16:07.484703 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 13:16:07.484715 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 13:16:07.484725 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 16 13:16:07.484736 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 16 13:16:07.484747 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 16 13:16:07.484757 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 16 13:16:07.484768 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 13:16:07.484778 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 13:16:07.484789 systemd[1]: Reached target slices.target - Slice Units. Dec 16 13:16:07.484799 systemd[1]: Reached target swap.target - Swaps. Dec 16 13:16:07.484811 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. 
Dec 16 13:16:07.484821 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 16 13:16:07.484832 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 16 13:16:07.484841 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 13:16:07.484852 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 13:16:07.484861 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 13:16:07.484872 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 16 13:16:07.484882 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 16 13:16:07.484892 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 16 13:16:07.484904 systemd[1]: Mounting media.mount - External Media Directory... Dec 16 13:16:07.484914 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:16:07.484924 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 16 13:16:07.484935 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 16 13:16:07.484947 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 16 13:16:07.484958 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 16 13:16:07.484969 systemd[1]: Reached target machines.target - Containers. Dec 16 13:16:07.484979 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 16 13:16:07.484989 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 13:16:07.485001 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 13:16:07.485011 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 16 13:16:07.485022 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 13:16:07.485032 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 13:16:07.485042 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 13:16:07.485053 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 16 13:16:07.485063 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 13:16:07.485074 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 16 13:16:07.485086 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 16 13:16:07.485097 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 16 13:16:07.485107 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 16 13:16:07.485117 systemd[1]: Stopped systemd-fsck-usr.service. Dec 16 13:16:07.485128 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 13:16:07.485140 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 13:16:07.485151 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
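The modprobe@*.service units queued above are instances of a single template whose instance name becomes the module to load (configfs, dm_mod, drm, efi_pstore, fuse, loop). A rough Python model of that expansion, assuming modprobe(8) is on PATH and the template's usual `modprobe -abq %i` invocation:

    import subprocess

    def start_modprobe_instance(module: str) -> None:
        # roughly what modprobe@.service runs for instance <module>
        subprocess.run(["modprobe", "-abq", module], check=False)

    for mod in ("configfs", "dm_mod", "drm", "efi_pstore", "fuse", "loop"):
        start_modprobe_instance(mod)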
Dec 16 13:16:07.485163 kernel: loop: module loaded Dec 16 13:16:07.485173 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 13:16:07.485183 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Dec 16 13:16:07.485197 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 16 13:16:07.485208 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 13:16:07.485220 systemd[1]: verity-setup.service: Deactivated successfully. Dec 16 13:16:07.485231 systemd[1]: Stopped verity-setup.service. Dec 16 13:16:07.485241 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:16:07.485251 kernel: fuse: init (API version 7.41) Dec 16 13:16:07.485261 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 16 13:16:07.485272 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 16 13:16:07.485282 systemd[1]: Mounted media.mount - External Media Directory. Dec 16 13:16:07.485293 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 16 13:16:07.485304 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 16 13:16:07.485314 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 16 13:16:07.485324 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 13:16:07.485334 kernel: ACPI: bus type drm_connector registered Dec 16 13:16:07.485363 systemd-journald[1377]: Collecting audit messages is disabled. Dec 16 13:16:07.485391 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 16 13:16:07.485403 systemd-journald[1377]: Journal started Dec 16 13:16:07.485426 systemd-journald[1377]: Runtime Journal (/run/log/journal/9f950044b44b49b0887ecb5f731386bb) is 8M, max 319.5M, 311.5M free. Dec 16 13:16:07.263717 systemd[1]: Queued start job for default target multi-user.target. Dec 16 13:16:07.283725 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Dec 16 13:16:07.284222 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 16 13:16:07.487468 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 13:16:07.487982 systemd[1]: modprobe@configfs.service: Deactivated successfully. Dec 16 13:16:07.488182 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 16 13:16:07.488839 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 13:16:07.489024 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 13:16:07.489677 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 13:16:07.489828 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 13:16:07.490408 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 13:16:07.490553 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 13:16:07.491128 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 16 13:16:07.491250 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 16 13:16:07.491848 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 13:16:07.491973 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
Dec 16 13:16:07.492588 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 13:16:07.493178 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 13:16:07.493927 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 16 13:16:07.494517 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 16 13:16:07.503245 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 13:16:07.504839 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 16 13:16:07.506161 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Dec 16 13:16:07.506639 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 16 13:16:07.506671 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 13:16:07.508156 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 16 13:16:07.509418 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 16 13:16:07.509931 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 13:16:07.510814 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 16 13:16:07.523200 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 16 13:16:07.524014 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 13:16:07.524798 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 16 13:16:07.525333 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 13:16:07.526095 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 13:16:07.528058 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 16 13:16:07.529175 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 13:16:07.530954 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 16 13:16:07.531585 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 16 13:16:07.536057 systemd-journald[1377]: Time spent on flushing to /var/log/journal/9f950044b44b49b0887ecb5f731386bb is 21.822ms for 1710 entries. Dec 16 13:16:07.536057 systemd-journald[1377]: System Journal (/var/log/journal/9f950044b44b49b0887ecb5f731386bb) is 8M, max 584.8M, 576.8M free. Dec 16 13:16:07.662596 systemd-journald[1377]: Received client request to flush runtime journal. Dec 16 13:16:07.662637 kernel: loop0: detected capacity change from 0 to 224512 Dec 16 13:16:07.662653 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 16 13:16:07.662668 kernel: loop1: detected capacity change from 0 to 1640 Dec 16 13:16:07.555354 systemd-tmpfiles[1433]: ACLs are not supported, ignoring. Dec 16 13:16:07.555365 systemd-tmpfiles[1433]: ACLs are not supported, ignoring. Dec 16 13:16:07.557138 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Dec 16 13:16:07.559662 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 13:16:07.561414 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 16 13:16:07.574701 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 13:16:07.642488 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 16 13:16:07.643564 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 16 13:16:07.645974 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Dec 16 13:16:07.650961 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 16 13:16:07.653530 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 13:16:07.666781 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 16 13:16:07.675532 kernel: loop2: detected capacity change from 0 to 110984 Dec 16 13:16:07.686889 systemd-tmpfiles[1450]: ACLs are not supported, ignoring. Dec 16 13:16:07.686917 systemd-tmpfiles[1450]: ACLs are not supported, ignoring. Dec 16 13:16:07.691859 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Dec 16 13:16:07.693137 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 13:16:07.763497 kernel: loop3: detected capacity change from 0 to 128560 Dec 16 13:16:07.832484 kernel: loop4: detected capacity change from 0 to 224512 Dec 16 13:16:07.888474 kernel: loop5: detected capacity change from 0 to 1640 Dec 16 13:16:07.901475 kernel: loop6: detected capacity change from 0 to 110984 Dec 16 13:16:07.932471 kernel: loop7: detected capacity change from 0 to 128560 Dec 16 13:16:07.967105 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 16 13:16:07.969177 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 13:16:07.969459 (sd-merge)[1459]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'. Dec 16 13:16:07.969877 (sd-merge)[1459]: Merged extensions into '/usr'. Dec 16 13:16:07.975425 systemd[1]: Reload requested from client PID 1432 ('systemd-sysext') (unit systemd-sysext.service)... Dec 16 13:16:07.975439 systemd[1]: Reloading... Dec 16 13:16:08.007133 systemd-udevd[1461]: Using default interface naming scheme 'v255'. Dec 16 13:16:08.011463 zram_generator::config[1485]: No configuration found. Dec 16 13:16:08.160502 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Dec 16 13:16:08.169475 kernel: ACPI: button: Power Button [PWRF] Dec 16 13:16:08.192183 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 16 13:16:08.192399 systemd[1]: Reloading finished in 216 ms. Dec 16 13:16:08.194467 kernel: mousedev: PS/2 mouse device common for all mice Dec 16 13:16:08.214245 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 13:16:08.215236 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 16 13:16:08.257359 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. 
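The (sd-merge) lines above show systemd-sysext overlaying extension images into /usr; the kubernetes image is visible earlier in this log as the /etc/extensions/kubernetes.raw symlink written by the files stage, while the others come from the image's other sysext search directories. A sketch of enumerating such images by name (directory layout assumed; the merge itself is an overlay mount performed by systemd-sysext and not modeled here):

    import os

    def list_extension_images(root: str = "/etc/extensions") -> list[str]:
        # *.raw images in a sysext search directory, stripped to their names
        return sorted(entry[:-len(".raw")]
                      for entry in os.listdir(root)
                      if entry.endswith(".raw"))

    quoted = ", ".join(f"'{name}'" for name in list_extension_images())
    print(f"Using extensions {quoted}.")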
Dec 16 13:16:08.258482 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Dec 16 13:16:08.258725 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Dec 16 13:16:08.258744 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Dec 16 13:16:08.275504 kernel: Console: switching to colour dummy device 80x25 Dec 16 13:16:08.277785 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Dec 16 13:16:08.286716 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 16 13:16:08.286736 kernel: [drm] features: -context_init Dec 16 13:16:08.286759 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Dec 16 13:16:08.291660 kernel: [drm] number of scanouts: 1 Dec 16 13:16:08.291714 kernel: [drm] number of cap sets: 0 Dec 16 13:16:08.292881 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Dec 16 13:16:08.294471 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Dec 16 13:16:08.299726 kernel: Console: switching to colour frame buffer device 160x50 Dec 16 13:16:08.301767 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 16 13:16:08.305462 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 16 13:16:08.339674 systemd[1]: Starting ensure-sysext.service... Dec 16 13:16:08.344613 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 16 13:16:08.347267 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 13:16:08.350510 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 13:16:08.352882 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:16:08.364181 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 16 13:16:08.368373 systemd[1]: Reload requested from client PID 1594 ('systemctl') (unit ensure-sysext.service)... Dec 16 13:16:08.368391 systemd[1]: Reloading... Dec 16 13:16:08.374023 systemd-tmpfiles[1597]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Dec 16 13:16:08.374050 systemd-tmpfiles[1597]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 16 13:16:08.374280 systemd-tmpfiles[1597]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 16 13:16:08.374493 systemd-tmpfiles[1597]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Dec 16 13:16:08.375168 systemd-tmpfiles[1597]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Dec 16 13:16:08.375382 systemd-tmpfiles[1597]: ACLs are not supported, ignoring. Dec 16 13:16:08.375430 systemd-tmpfiles[1597]: ACLs are not supported, ignoring. Dec 16 13:16:08.382272 systemd-tmpfiles[1597]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 13:16:08.382283 systemd-tmpfiles[1597]: Skipping /boot Dec 16 13:16:08.388675 systemd-tmpfiles[1597]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 13:16:08.388686 systemd-tmpfiles[1597]: Skipping /boot Dec 16 13:16:08.411465 zram_generator::config[1631]: No configuration found. Dec 16 13:16:08.479471 ldconfig[1427]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 16 13:16:08.581572 systemd[1]: Reloading finished in 212 ms. 
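The systemd-tmpfiles warnings above come from its duplicate-path rule: tmpfiles.d lines are whitespace-separated records whose second field is a path, and a later line claiming an already-configured path is skipped. A simplified model of that check (not systemd-tmpfiles itself):

    def parse_tmpfiles(lines: list[str]) -> dict[str, str]:
        seen: dict[str, str] = {}
        for lineno, line in enumerate(lines, start=1):
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            fields = line.split()
            if len(fields) < 2:
                continue  # real tmpfiles would diagnose a malformed line
            path = fields[1]
            if path in seen:
                print(f'line {lineno}: Duplicate line for path "{path}", ignoring.')
                continue
            seen[path] = line
        return seen

    parse_tmpfiles(["d /var/lib/nfs/sm 0700 rpcuser rpcuser -",
                    "d /var/lib/nfs/sm 0700 rpcuser rpcuser -"])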
Dec 16 13:16:08.605384 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 16 13:16:08.610912 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 13:16:08.627752 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:16:08.636799 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 13:16:08.641008 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 16 13:16:08.643373 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 16 13:16:08.646333 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 13:16:08.660868 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 16 13:16:08.664035 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 16 13:16:08.672089 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:16:08.672355 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 13:16:08.675399 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 13:16:08.677485 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 13:16:08.679173 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 13:16:08.680558 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 13:16:08.680677 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 13:16:08.680779 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:16:08.683015 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 13:16:08.683200 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 13:16:08.684769 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 13:16:08.684942 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 13:16:08.687984 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 13:16:08.688136 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 13:16:08.695680 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Dec 16 13:16:08.703648 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:16:08.703873 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 13:16:08.705080 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 13:16:08.708158 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 13:16:08.718162 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 13:16:08.720227 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Dec 16 13:16:08.722483 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Dec 16 13:16:08.723745 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 13:16:08.723788 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 13:16:08.723863 systemd[1]: Reached target time-set.target - System Time Set. Dec 16 13:16:08.724241 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:16:08.726181 systemd[1]: Started systemd-userdbd.service - User Database Manager. Dec 16 13:16:08.727785 systemd[1]: Finished ensure-sysext.service. Dec 16 13:16:08.728850 augenrules[1723]: No rules Dec 16 13:16:08.729142 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 16 13:16:08.730382 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 13:16:08.730568 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 13:16:08.731489 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 13:16:08.731660 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 13:16:08.732248 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 13:16:08.732372 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 13:16:08.732927 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 13:16:08.733096 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 13:16:08.735782 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 13:16:08.735912 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 13:16:08.741150 kernel: pps_core: LinuxPPS API ver. 1 registered Dec 16 13:16:08.741207 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Dec 16 13:16:08.741163 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 13:16:08.741237 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 13:16:08.742372 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 16 13:16:08.746483 kernel: PTP clock support registered Dec 16 13:16:08.749152 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Dec 16 13:16:08.757670 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Dec 16 13:16:08.765625 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 16 13:16:08.770494 systemd-networkd[1596]: lo: Link UP Dec 16 13:16:08.770504 systemd-networkd[1596]: lo: Gained carrier Dec 16 13:16:08.770591 systemd-resolved[1682]: Positive Trust Anchors: Dec 16 13:16:08.770598 systemd-resolved[1682]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 13:16:08.770631 systemd-resolved[1682]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 13:16:08.771621 systemd-networkd[1596]: Enumeration completed Dec 16 13:16:08.771746 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 13:16:08.771926 systemd-networkd[1596]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:16:08.771934 systemd-networkd[1596]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 13:16:08.772272 systemd-networkd[1596]: eth0: Link UP Dec 16 13:16:08.772373 systemd-networkd[1596]: eth0: Gained carrier Dec 16 13:16:08.772391 systemd-networkd[1596]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:16:08.773386 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 16 13:16:08.776172 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 16 13:16:08.777514 systemd-resolved[1682]: Using system hostname 'ci-4459-2-2-0-839c7337fa'. Dec 16 13:16:08.778695 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 13:16:08.779178 systemd[1]: Reached target network.target - Network. Dec 16 13:16:08.779568 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 13:16:08.795495 systemd-networkd[1596]: eth0: DHCPv4 address 10.0.23.154/25, gateway 10.0.23.129 acquired from 10.0.23.129 Dec 16 13:16:08.798165 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 16 13:16:08.913714 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 16 13:16:08.915480 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Dec 16 13:16:08.915523 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 13:16:08.916002 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 16 13:16:08.916373 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Dec 16 13:16:08.916740 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Dec 16 13:16:08.917213 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 16 13:16:08.917674 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Dec 16 13:16:08.918025 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. 
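Alongside the positive anchor (the root-zone DS record above), systemd-resolved keeps negative trust anchors: names that equal or fall under any listed domain are exempt from DNSSEC validation. A sketch of that suffix match, using a subset of the anchors copied from the log:

    NEGATIVE_TRUST_ANCHORS = {
        "home.arpa", "10.in-addr.arpa", "168.192.in-addr.arpa",
        "d.f.ip6.arpa", "ipv4only.arpa", "corp", "home", "internal",
        "intranet", "lan", "local", "private", "test",
    }

    def under_negative_anchor(name: str) -> bool:
        labels = name.rstrip(".").lower().split(".")
        return any(".".join(labels[i:]) in NEGATIVE_TRUST_ANCHORS
                   for i in range(len(labels)))

    assert under_negative_anchor("printer.lan")
    assert under_negative_anchor("154.23.0.10.in-addr.arpa")  # this host's PTR zone
    assert not under_negative_anchor("example.org")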
Dec 16 13:16:08.918359 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 16 13:16:08.918386 systemd[1]: Reached target paths.target - Path Units. Dec 16 13:16:08.918746 systemd[1]: Reached target timers.target - Timer Units. Dec 16 13:16:08.921653 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 16 13:16:08.924181 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 16 13:16:08.932712 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 16 13:16:08.933227 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 16 13:16:08.937815 systemd[1]: Reached target ssh-access.target - SSH Access Available. Dec 16 13:16:08.946722 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 16 13:16:08.950188 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 16 13:16:08.952468 systemd[1]: Listening on docker.socket - Docker Socket for the API. Dec 16 13:16:08.954408 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 13:16:08.957309 systemd[1]: Reached target basic.target - Basic System. Dec 16 13:16:08.957963 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Dec 16 13:16:08.958000 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 16 13:16:08.962073 systemd[1]: Starting chronyd.service - NTP client/server... Dec 16 13:16:08.965842 systemd[1]: Starting containerd.service - containerd container runtime... Dec 16 13:16:08.969209 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Dec 16 13:16:08.972561 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Dec 16 13:16:08.991013 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 16 13:16:08.993351 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 16 13:16:08.999373 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:16:08.999350 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Dec 16 13:16:09.000165 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 16 13:16:09.001280 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Dec 16 13:16:09.004215 jq[1751]: false Dec 16 13:16:09.004426 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 16 13:16:09.007562 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Dec 16 13:16:09.010904 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Dec 16 13:16:09.012466 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 16 13:16:09.012793 google_oslogin_nss_cache[1755]: oslogin_cache_refresh[1755]: Refreshing passwd entry cache Dec 16 13:16:09.012745 oslogin_cache_refresh[1755]: Refreshing passwd entry cache Dec 16 13:16:09.015957 systemd[1]: Starting systemd-logind.service - User Login Management... 
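Several of the units above (docker.socket, sshd.socket, systemd-hostnamed.socket) are socket-activated, so the listening socket is set up long before the service itself runs. The socket-to-service mapping can be listed directly; a quick check, assuming a systemd host:

  # show listening sockets and the units they activate
  systemctl list-sockets
  # docker.service should stay inactive until the first API connection
  systemctl is-active docker.socket docker.service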
Dec 16 13:16:09.019008 extend-filesystems[1754]: Found /dev/vda6 Dec 16 13:16:09.019090 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 16 13:16:09.019568 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 16 13:16:09.020011 systemd[1]: Starting update-engine.service - Update Engine... Dec 16 13:16:09.021609 google_oslogin_nss_cache[1755]: oslogin_cache_refresh[1755]: Failure getting users, quitting Dec 16 13:16:09.021609 google_oslogin_nss_cache[1755]: oslogin_cache_refresh[1755]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 16 13:16:09.021596 oslogin_cache_refresh[1755]: Failure getting users, quitting Dec 16 13:16:09.021738 google_oslogin_nss_cache[1755]: oslogin_cache_refresh[1755]: Refreshing group entry cache Dec 16 13:16:09.021614 oslogin_cache_refresh[1755]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 16 13:16:09.021654 oslogin_cache_refresh[1755]: Refreshing group entry cache Dec 16 13:16:09.021987 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 16 13:16:09.023677 extend-filesystems[1754]: Found /dev/vda9 Dec 16 13:16:09.026512 extend-filesystems[1754]: Checking size of /dev/vda9 Dec 16 13:16:09.024826 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 16 13:16:09.026360 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 16 13:16:09.026614 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 16 13:16:09.029268 oslogin_cache_refresh[1755]: Failure getting groups, quitting Dec 16 13:16:09.036086 google_oslogin_nss_cache[1755]: oslogin_cache_refresh[1755]: Failure getting groups, quitting Dec 16 13:16:09.036086 google_oslogin_nss_cache[1755]: oslogin_cache_refresh[1755]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 16 13:16:09.027436 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 16 13:16:09.029280 oslogin_cache_refresh[1755]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 16 13:16:09.027662 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Dec 16 13:16:09.030375 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Dec 16 13:16:09.036399 jq[1766]: true Dec 16 13:16:09.030619 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Dec 16 13:16:09.042768 jq[1776]: true Dec 16 13:16:09.047589 extend-filesystems[1754]: Resized partition /dev/vda9 Dec 16 13:16:09.048691 chronyd[1746]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Dec 16 13:16:09.049796 systemd[1]: Started chronyd.service - NTP client/server. Dec 16 13:16:09.049697 chronyd[1746]: Loaded seccomp filter (level 2) Dec 16 13:16:09.051133 (ntainerd)[1785]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Dec 16 13:16:09.051288 systemd[1]: motdgen.service: Deactivated successfully. Dec 16 13:16:09.051590 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
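chronyd 4.7 is up at this point but has not yet selected a time source (the "Selected source PHC0" line appears later in the log; PHC0 is presumably the KVM PTP clock exposed by the ptp_kvm module loaded above). A minimal health check, assuming chronyc is available in the image:

  # reference source, stratum, and current offset estimate
  chronyc tracking
  # all configured sources with reachability and last-sample columns
  chronyc sources -v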
Dec 16 13:16:09.053136 update_engine[1765]: I20251216 13:16:09.052999 1765 main.cc:92] Flatcar Update Engine starting Dec 16 13:16:09.054081 extend-filesystems[1794]: resize2fs 1.47.3 (8-Jul-2025) Dec 16 13:16:09.061466 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks Dec 16 13:16:09.064993 tar[1771]: linux-amd64/LICENSE Dec 16 13:16:09.088886 dbus-daemon[1749]: [system] SELinux support is enabled Dec 16 13:16:09.089022 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 16 13:16:09.089380 tar[1771]: linux-amd64/helm Dec 16 13:16:09.089218 systemd-logind[1763]: New seat seat0. Dec 16 13:16:09.092871 update_engine[1765]: I20251216 13:16:09.092823 1765 update_check_scheduler.cc:74] Next update check in 10m1s Dec 16 13:16:09.094796 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 16 13:16:09.094816 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 16 13:16:09.096764 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 16 13:16:09.096782 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 16 13:16:09.097224 systemd[1]: Started update-engine.service - Update Engine. Dec 16 13:16:09.100930 systemd[1]: Started locksmithd.service - Cluster reboot manager. Dec 16 13:16:09.123942 systemd-logind[1763]: Watching system buttons on /dev/input/event3 (Power Button) Dec 16 13:16:09.123959 systemd-logind[1763]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Dec 16 13:16:09.124234 systemd[1]: Started systemd-logind.service - User Login Management. 
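extend-filesystems is growing the root filesystem online here: the kernel line shows ext4 on vda9 going from 1617920 to 12499963 4k blocks, roughly 6.2 GiB to 47.7 GiB. The manual equivalent is a single command, sketched under the assumption that /dev/vda9 is the mounted root:

  # online resize; ext4 supports growing while mounted
  sudo resize2fs /dev/vda9
  df -h /   # verify the new size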
Dec 16 13:16:09.158963 locksmithd[1814]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 16 13:16:09.294790 containerd[1785]: time="2025-12-16T13:16:09Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 16 13:16:09.296696 containerd[1785]: time="2025-12-16T13:16:09.296640131Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Dec 16 13:16:09.297299 sshd_keygen[1796]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 16 13:16:09.309213 containerd[1785]: time="2025-12-16T13:16:09.309137893Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="14.981µs" Dec 16 13:16:09.318272 containerd[1785]: time="2025-12-16T13:16:09.309353272Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 16 13:16:09.318272 containerd[1785]: time="2025-12-16T13:16:09.309389685Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 16 13:16:09.319919 containerd[1785]: time="2025-12-16T13:16:09.319616753Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 16 13:16:09.319919 containerd[1785]: time="2025-12-16T13:16:09.319662817Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 16 13:16:09.319919 containerd[1785]: time="2025-12-16T13:16:09.319696210Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 13:16:09.319919 containerd[1785]: time="2025-12-16T13:16:09.319758275Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 13:16:09.319919 containerd[1785]: time="2025-12-16T13:16:09.319771796Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 13:16:09.320207 containerd[1785]: time="2025-12-16T13:16:09.320116498Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 13:16:09.320207 containerd[1785]: time="2025-12-16T13:16:09.320132641Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 13:16:09.320207 containerd[1785]: time="2025-12-16T13:16:09.320144985Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 13:16:09.320207 containerd[1785]: time="2025-12-16T13:16:09.320154247Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 16 13:16:09.320397 containerd[1785]: time="2025-12-16T13:16:09.320253413Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 16 13:16:09.325498 containerd[1785]: time="2025-12-16T13:16:09.325434560Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 13:16:09.325926 containerd[1785]: 
time="2025-12-16T13:16:09.325831978Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 13:16:09.325926 containerd[1785]: time="2025-12-16T13:16:09.325857278Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 16 13:16:09.326057 bash[1813]: Updated "/home/core/.ssh/authorized_keys" Dec 16 13:16:09.326950 containerd[1785]: time="2025-12-16T13:16:09.326171011Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 16 13:16:09.326822 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 16 13:16:09.327395 containerd[1785]: time="2025-12-16T13:16:09.327338429Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 16 13:16:09.327654 containerd[1785]: time="2025-12-16T13:16:09.327609284Z" level=info msg="metadata content store policy set" policy=shared Dec 16 13:16:09.329961 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 16 13:16:09.336008 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 16 13:16:09.340350 systemd[1]: Starting sshkeys.service... Dec 16 13:16:09.361337 systemd[1]: issuegen.service: Deactivated successfully. Dec 16 13:16:09.361631 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 16 13:16:09.368429 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Dec 16 13:16:09.375180 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Dec 16 13:16:09.377084 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 16 13:16:09.398472 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:16:09.403769 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 16 13:16:09.406566 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 16 13:16:09.408590 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Dec 16 13:16:09.412097 systemd[1]: Reached target getty.target - Login Prompts. 
Dec 16 13:16:09.440011 containerd[1785]: time="2025-12-16T13:16:09.439764190Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 16 13:16:09.440011 containerd[1785]: time="2025-12-16T13:16:09.439878496Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 16 13:16:09.440011 containerd[1785]: time="2025-12-16T13:16:09.439909083Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 16 13:16:09.440011 containerd[1785]: time="2025-12-16T13:16:09.439930121Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 16 13:16:09.440011 containerd[1785]: time="2025-12-16T13:16:09.439952762Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 16 13:16:09.440011 containerd[1785]: time="2025-12-16T13:16:09.440022187Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 16 13:16:09.440719 containerd[1785]: time="2025-12-16T13:16:09.440649219Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 16 13:16:09.440775 containerd[1785]: time="2025-12-16T13:16:09.440735334Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 16 13:16:09.440825 containerd[1785]: time="2025-12-16T13:16:09.440767815Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 16 13:16:09.440825 containerd[1785]: time="2025-12-16T13:16:09.440793492Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 16 13:16:09.440825 containerd[1785]: time="2025-12-16T13:16:09.440812416Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 16 13:16:09.440951 containerd[1785]: time="2025-12-16T13:16:09.440836568Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441036779Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441076868Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441111944Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441139949Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441164205Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441180508Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441202858Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441221552Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 
16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441242699Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441257811Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441275081Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441352235Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441374637Z" level=info msg="Start snapshots syncer" Dec 16 13:16:09.441682 containerd[1785]: time="2025-12-16T13:16:09.441417129Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 16 13:16:09.442561 containerd[1785]: time="2025-12-16T13:16:09.442480522Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 16 13:16:09.442769 containerd[1785]: time="2025-12-16T13:16:09.442586948Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 16 13:16:09.445818 containerd[1785]: time="2025-12-16T13:16:09.445739209Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 16 13:16:09.446322 containerd[1785]: time="2025-12-16T13:16:09.446224182Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 16 13:16:09.446409 containerd[1785]: time="2025-12-16T13:16:09.446300351Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446483111Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446507448Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446532876Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446552982Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446573437Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446625530Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446646375Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446667430Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446729188Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446756379Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446773461Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446792746Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446808623Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 16 13:16:09.447134 containerd[1785]: time="2025-12-16T13:16:09.446826432Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 16 13:16:09.447760 containerd[1785]: time="2025-12-16T13:16:09.446855484Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 16 13:16:09.447760 containerd[1785]: time="2025-12-16T13:16:09.446884649Z" level=info msg="runtime interface created" Dec 16 13:16:09.447760 containerd[1785]: time="2025-12-16T13:16:09.446895133Z" level=info msg="created NRI interface" Dec 16 13:16:09.447760 containerd[1785]: time="2025-12-16T13:16:09.446911751Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 16 13:16:09.447760 containerd[1785]: time="2025-12-16T13:16:09.446937336Z" level=info msg="Connect containerd service" Dec 16 13:16:09.447760 containerd[1785]: time="2025-12-16T13:16:09.446972187Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 
16 13:16:09.448193 containerd[1785]: time="2025-12-16T13:16:09.448133403Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 13:16:09.557664 containerd[1785]: time="2025-12-16T13:16:09.557500617Z" level=info msg="Start subscribing containerd event" Dec 16 13:16:09.557664 containerd[1785]: time="2025-12-16T13:16:09.557568356Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 16 13:16:09.557664 containerd[1785]: time="2025-12-16T13:16:09.557614027Z" level=info msg="Start recovering state" Dec 16 13:16:09.557819 containerd[1785]: time="2025-12-16T13:16:09.557636686Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 16 13:16:09.557855 containerd[1785]: time="2025-12-16T13:16:09.557808222Z" level=info msg="Start event monitor" Dec 16 13:16:09.557855 containerd[1785]: time="2025-12-16T13:16:09.557834520Z" level=info msg="Start cni network conf syncer for default" Dec 16 13:16:09.557855 containerd[1785]: time="2025-12-16T13:16:09.557849569Z" level=info msg="Start streaming server" Dec 16 13:16:09.557938 containerd[1785]: time="2025-12-16T13:16:09.557884467Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 16 13:16:09.557938 containerd[1785]: time="2025-12-16T13:16:09.557901727Z" level=info msg="runtime interface starting up..." Dec 16 13:16:09.557938 containerd[1785]: time="2025-12-16T13:16:09.557917487Z" level=info msg="starting plugins..." Dec 16 13:16:09.558015 containerd[1785]: time="2025-12-16T13:16:09.557953619Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 16 13:16:09.558332 containerd[1785]: time="2025-12-16T13:16:09.558234942Z" level=info msg="containerd successfully booted in 0.264025s" Dec 16 13:16:09.560601 systemd[1]: Started containerd.service - containerd container runtime. Dec 16 13:16:09.650766 tar[1771]: linux-amd64/README.md Dec 16 13:16:09.675947 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 16 13:16:09.738495 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Dec 16 13:16:09.782010 extend-filesystems[1794]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Dec 16 13:16:09.782010 extend-filesystems[1794]: old_desc_blocks = 1, new_desc_blocks = 6 Dec 16 13:16:09.782010 extend-filesystems[1794]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Dec 16 13:16:09.784319 extend-filesystems[1754]: Resized filesystem in /dev/vda9 Dec 16 13:16:09.782940 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 16 13:16:09.783267 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 16 13:16:10.009497 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:16:10.202864 systemd-networkd[1596]: eth0: Gained IPv6LL Dec 16 13:16:10.207361 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 16 13:16:10.208771 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 13:16:10.213811 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:16:10.220393 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 13:16:10.280130 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
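The cni "no network config found in /etc/cni/net.d" error above is expected on a fresh node: containerd's CRI plugin (confDir /etc/cni/net.d, binDir /opt/cni/bin per the config dump earlier) retries until a pod network add-on installs a config. Purely for illustration, a minimal bridge conflist; the network name and subnet are invented here, and the bridge/host-local/portmap binaries must already exist under /opt/cni/bin:

  # as root:
  cat <<'EOF' >/etc/cni/net.d/10-examplenet.conflist
  {
    "cniVersion": "1.0.0",
    "name": "examplenet",
    "plugins": [
      {
        "type": "bridge",
        "bridge": "cni0",
        "isGateway": true,
        "ipMasq": true,
        "ipam": { "type": "host-local", "subnet": "10.88.0.0/16" }
      },
      { "type": "portmap", "capabilities": { "portMappings": true } }
    ]
  }
  EOF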
Dec 16 13:16:10.415491 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:16:11.884602 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:16:11.891923 (kubelet)[1890]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:16:12.021514 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:16:12.433755 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:16:12.885692 kubelet[1890]: E1216 13:16:12.885599 1890 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:16:12.891192 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:16:12.891458 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:16:12.892067 systemd[1]: kubelet.service: Consumed 1.275s CPU time, 266.8M memory peak. Dec 16 13:16:16.030484 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:16:16.044566 coreos-metadata[1748]: Dec 16 13:16:16.044 WARN failed to locate config-drive, using the metadata service API instead Dec 16 13:16:16.066987 coreos-metadata[1748]: Dec 16 13:16:16.066 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Dec 16 13:16:16.362912 coreos-metadata[1748]: Dec 16 13:16:16.362 INFO Fetch successful Dec 16 13:16:16.362912 coreos-metadata[1748]: Dec 16 13:16:16.362 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 16 13:16:16.458533 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:16:16.470964 coreos-metadata[1843]: Dec 16 13:16:16.470 WARN failed to locate config-drive, using the metadata service API instead Dec 16 13:16:16.487469 coreos-metadata[1843]: Dec 16 13:16:16.487 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Dec 16 13:16:16.569546 coreos-metadata[1748]: Dec 16 13:16:16.569 INFO Fetch successful Dec 16 13:16:16.569546 coreos-metadata[1748]: Dec 16 13:16:16.569 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Dec 16 13:16:16.709048 coreos-metadata[1843]: Dec 16 13:16:16.708 INFO Fetch successful Dec 16 13:16:16.709048 coreos-metadata[1843]: Dec 16 13:16:16.708 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Dec 16 13:16:16.789560 coreos-metadata[1748]: Dec 16 13:16:16.789 INFO Fetch successful Dec 16 13:16:16.789560 coreos-metadata[1748]: Dec 16 13:16:16.789 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Dec 16 13:16:16.928491 coreos-metadata[1843]: Dec 16 13:16:16.928 INFO Fetch successful Dec 16 13:16:16.932896 unknown[1843]: wrote ssh authorized keys file for user: core Dec 16 13:16:16.935784 coreos-metadata[1748]: Dec 16 13:16:16.935 INFO Fetch successful Dec 16 13:16:16.935784 coreos-metadata[1748]: Dec 16 13:16:16.935 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Dec 16 13:16:16.996128 update-ssh-keys[1912]: Updated "/home/core/.ssh/authorized_keys" Dec 16 13:16:16.997874 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 16 13:16:17.001622 systemd[1]: Finished sshkeys.service. 
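kubelet keeps exiting with status 1 because /var/lib/kubelet/config.yaml does not exist yet; on a kubeadm-provisioned node that file is written during kubeadm init/join, so these restarts stop once the node is actually bootstrapped. For illustration only, a minimal hand-written KubeletConfiguration (every value below is an assumption, not what kubeadm would generate for this host):

  # as root:
  cat <<'EOF' >/var/lib/kubelet/config.yaml
  apiVersion: kubelet.config.k8s.io/v1beta1
  kind: KubeletConfiguration
  # matches SystemdCgroup=true in the containerd CRI config dumped above
  cgroupDriver: systemd
  containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
  EOF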
Dec 16 13:16:17.053194 coreos-metadata[1748]: Dec 16 13:16:17.053 INFO Fetch successful Dec 16 13:16:17.053194 coreos-metadata[1748]: Dec 16 13:16:17.053 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Dec 16 13:16:17.171935 coreos-metadata[1748]: Dec 16 13:16:17.171 INFO Fetch successful Dec 16 13:16:17.201377 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 16 13:16:17.202363 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 16 13:16:17.202688 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 16 13:16:17.202944 systemd[1]: Startup finished in 4.104s (kernel) + 13.654s (initrd) + 10.964s (userspace) = 28.722s. Dec 16 13:16:21.002639 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 16 13:16:21.003796 systemd[1]: Started sshd@0-10.0.23.154:22-147.75.109.163:57914.service - OpenSSH per-connection server daemon (147.75.109.163:57914). Dec 16 13:16:22.023490 sshd[1922]: Accepted publickey for core from 147.75.109.163 port 57914 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:16:22.028590 sshd-session[1922]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:16:22.041645 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 16 13:16:22.043212 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 13:16:22.056176 systemd-logind[1763]: New session 1 of user core. Dec 16 13:16:22.092875 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 13:16:22.098917 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 13:16:22.131178 (systemd)[1927]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 16 13:16:22.135368 systemd-logind[1763]: New session c1 of user core. Dec 16 13:16:22.293564 systemd[1927]: Queued start job for default target default.target. Dec 16 13:16:22.302356 systemd[1927]: Created slice app.slice - User Application Slice. Dec 16 13:16:22.302384 systemd[1927]: Reached target paths.target - Paths. Dec 16 13:16:22.302424 systemd[1927]: Reached target timers.target - Timers. Dec 16 13:16:22.303532 systemd[1927]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 16 13:16:22.313878 systemd[1927]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 16 13:16:22.313975 systemd[1927]: Reached target sockets.target - Sockets. Dec 16 13:16:22.314009 systemd[1927]: Reached target basic.target - Basic System. Dec 16 13:16:22.314040 systemd[1927]: Reached target default.target - Main User Target. Dec 16 13:16:22.314065 systemd[1927]: Startup finished in 163ms. Dec 16 13:16:22.315093 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 13:16:22.318599 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 16 13:16:23.014232 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 16 13:16:23.017742 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:16:23.020088 systemd[1]: Started sshd@1-10.0.23.154:22-147.75.109.163:40104.service - OpenSSH per-connection server daemon (147.75.109.163:40104). Dec 16 13:16:23.191540 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
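The "Startup finished" summary (4.104s kernel + 13.654s initrd + 10.964s userspace = 28.722s) can be broken down per unit after boot; a quick look, assuming systemd-analyze is present:

  # the same kernel/initrd/userspace split as logged above
  systemd-analyze
  # slowest units, then the dependency chain that gated the default target
  systemd-analyze blame
  systemd-analyze critical-chain multi-user.target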
Dec 16 13:16:23.214007 (kubelet)[1949]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:16:23.290778 kubelet[1949]: E1216 13:16:23.290682 1949 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:16:23.298123 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:16:23.298382 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:16:23.298959 systemd[1]: kubelet.service: Consumed 236ms CPU time, 111.7M memory peak. Dec 16 13:16:24.104341 sshd[1939]: Accepted publickey for core from 147.75.109.163 port 40104 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:16:24.107372 sshd-session[1939]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:16:24.117062 systemd-logind[1763]: New session 2 of user core. Dec 16 13:16:24.137693 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 16 13:16:24.821078 sshd[1961]: Connection closed by 147.75.109.163 port 40104 Dec 16 13:16:24.821776 sshd-session[1939]: pam_unix(sshd:session): session closed for user core Dec 16 13:16:24.827420 systemd[1]: sshd@1-10.0.23.154:22-147.75.109.163:40104.service: Deactivated successfully. Dec 16 13:16:24.831377 systemd[1]: session-2.scope: Deactivated successfully. Dec 16 13:16:24.834153 systemd-logind[1763]: Session 2 logged out. Waiting for processes to exit. Dec 16 13:16:24.835958 systemd-logind[1763]: Removed session 2. Dec 16 13:16:24.991386 systemd[1]: Started sshd@2-10.0.23.154:22-147.75.109.163:40120.service - OpenSSH per-connection server daemon (147.75.109.163:40120). Dec 16 13:16:25.989372 sshd[1967]: Accepted publickey for core from 147.75.109.163 port 40120 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:16:25.991373 sshd-session[1967]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:16:25.999506 systemd-logind[1763]: New session 3 of user core. Dec 16 13:16:26.012716 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 16 13:16:26.649291 sshd[1970]: Connection closed by 147.75.109.163 port 40120 Dec 16 13:16:26.650213 sshd-session[1967]: pam_unix(sshd:session): session closed for user core Dec 16 13:16:26.655302 systemd[1]: sshd@2-10.0.23.154:22-147.75.109.163:40120.service: Deactivated successfully. Dec 16 13:16:26.658416 systemd[1]: session-3.scope: Deactivated successfully. Dec 16 13:16:26.661820 systemd-logind[1763]: Session 3 logged out. Waiting for processes to exit. Dec 16 13:16:26.663150 systemd-logind[1763]: Removed session 3. Dec 16 13:16:26.824326 systemd[1]: Started sshd@3-10.0.23.154:22-147.75.109.163:40122.service - OpenSSH per-connection server daemon (147.75.109.163:40122). Dec 16 13:16:27.823189 sshd[1976]: Accepted publickey for core from 147.75.109.163 port 40122 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:16:27.824387 sshd-session[1976]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:16:27.828521 systemd-logind[1763]: New session 4 of user core. Dec 16 13:16:27.848701 systemd[1]: Started session-4.scope - Session 4 of User core. 
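systemd is restarting kubelet on its Restart= timer; the counter (1 here, 2 and 3 later in the log) and the last exit status can be read straight from the unit. A sketch, assuming a systemd new enough for the NRestarts property (v235+):

  systemctl show kubelet.service -p NRestarts,Result,ExecMainStatus
  # the failure itself, as captured in the journal
  journalctl -u kubelet.service -n 20 --no-pager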
Dec 16 13:16:28.491563 sshd[1979]: Connection closed by 147.75.109.163 port 40122 Dec 16 13:16:28.492212 sshd-session[1976]: pam_unix(sshd:session): session closed for user core Dec 16 13:16:28.497042 systemd[1]: sshd@3-10.0.23.154:22-147.75.109.163:40122.service: Deactivated successfully. Dec 16 13:16:28.500753 systemd[1]: session-4.scope: Deactivated successfully. Dec 16 13:16:28.503750 systemd-logind[1763]: Session 4 logged out. Waiting for processes to exit. Dec 16 13:16:28.506167 systemd-logind[1763]: Removed session 4. Dec 16 13:16:28.675695 systemd[1]: Started sshd@4-10.0.23.154:22-147.75.109.163:40126.service - OpenSSH per-connection server daemon (147.75.109.163:40126). Dec 16 13:16:29.683505 sshd[1985]: Accepted publickey for core from 147.75.109.163 port 40126 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:16:29.685359 sshd-session[1985]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:16:29.691551 systemd-logind[1763]: New session 5 of user core. Dec 16 13:16:29.708691 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 16 13:16:30.229437 sudo[1989]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 16 13:16:30.230313 sudo[1989]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:16:30.249819 sudo[1989]: pam_unix(sudo:session): session closed for user root Dec 16 13:16:30.406478 sshd[1988]: Connection closed by 147.75.109.163 port 40126 Dec 16 13:16:30.406925 sshd-session[1985]: pam_unix(sshd:session): session closed for user core Dec 16 13:16:30.410617 systemd[1]: sshd@4-10.0.23.154:22-147.75.109.163:40126.service: Deactivated successfully. Dec 16 13:16:30.412194 systemd[1]: session-5.scope: Deactivated successfully. Dec 16 13:16:30.412925 systemd-logind[1763]: Session 5 logged out. Waiting for processes to exit. Dec 16 13:16:30.414492 systemd-logind[1763]: Removed session 5. Dec 16 13:16:30.606239 systemd[1]: Started sshd@5-10.0.23.154:22-147.75.109.163:40128.service - OpenSSH per-connection server daemon (147.75.109.163:40128). Dec 16 13:16:31.701906 sshd[1995]: Accepted publickey for core from 147.75.109.163 port 40128 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:16:31.703997 sshd-session[1995]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:16:31.710621 systemd-logind[1763]: New session 6 of user core. Dec 16 13:16:31.733693 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 16 13:16:32.264235 sudo[2000]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 16 13:16:32.264818 sudo[2000]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:16:32.275297 sudo[2000]: pam_unix(sudo:session): session closed for user root Dec 16 13:16:32.289888 sudo[1999]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 16 13:16:32.291239 sudo[1999]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:16:32.306937 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 13:16:32.367911 augenrules[2022]: No rules Dec 16 13:16:32.369846 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 13:16:32.370183 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
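augenrules reports "No rules" because the sudo session above deleted 80-selinux.rules and 99-default.rules and reloaded. Restoring any rule follows the same path; a sketch assuming the usual auditd layout, with the watch on /etc/ssh purely as an example:

  # as root: drop a rule into the directory augenrules assembles from
  echo '-w /etc/ssh -p wa -k ssh_config_change' >/etc/audit/rules.d/10-ssh.rules
  augenrules --load
  # confirm the kernel-loaded rule set
  auditctl -l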
Dec 16 13:16:32.371237 sudo[1999]: pam_unix(sudo:session): session closed for user root Dec 16 13:16:32.540834 sshd[1998]: Connection closed by 147.75.109.163 port 40128 Dec 16 13:16:32.541159 sshd-session[1995]: pam_unix(sshd:session): session closed for user core Dec 16 13:16:32.545038 systemd[1]: sshd@5-10.0.23.154:22-147.75.109.163:40128.service: Deactivated successfully. Dec 16 13:16:32.549203 systemd[1]: session-6.scope: Deactivated successfully. Dec 16 13:16:32.552238 systemd-logind[1763]: Session 6 logged out. Waiting for processes to exit. Dec 16 13:16:32.554850 systemd-logind[1763]: Removed session 6. Dec 16 13:16:32.706285 systemd[1]: Started sshd@6-10.0.23.154:22-147.75.109.163:51352.service - OpenSSH per-connection server daemon (147.75.109.163:51352). Dec 16 13:16:32.837050 chronyd[1746]: Selected source PHC0 Dec 16 13:16:34.056275 systemd-resolved[1682]: Clock change detected. Flushing caches. Dec 16 13:16:32.837112 chronyd[1746]: System clock wrong by 1.219002 seconds Dec 16 13:16:34.056180 chronyd[1746]: System clock was stepped by 1.219002 seconds Dec 16 13:16:34.744963 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 16 13:16:34.748658 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:16:34.949799 sshd[2031]: Accepted publickey for core from 147.75.109.163 port 51352 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:16:34.952941 sshd-session[2031]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:16:34.962943 systemd-logind[1763]: New session 7 of user core. Dec 16 13:16:34.967086 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 16 13:16:34.983841 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:16:34.990713 (kubelet)[2043]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:16:35.065944 kubelet[2043]: E1216 13:16:35.065737 2043 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:16:35.071703 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:16:35.071966 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:16:35.072540 systemd[1]: kubelet.service: Consumed 295ms CPU time, 116.4M memory peak. Dec 16 13:16:35.468035 sudo[2055]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 16 13:16:35.468341 sudo[2055]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:16:36.087551 systemd[1]: Starting docker.service - Docker Application Container Engine... 
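chronyd stepped the system clock by about 1.22 s here, which is why the surrounding timestamps jump and systemd-resolved flushes its caches. Whether chrony steps rather than slews is controlled by the makestep directive; a typical stanza, assumed rather than read from this host's chrony.conf:

  # in chrony.conf (path varies by distro):
  # step if the offset exceeds 1.0 s, but only during the first 3 clock updates
  makestep 1.0 3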
Dec 16 13:16:36.116350 (dockerd)[2077]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 16 13:16:36.477541 dockerd[2077]: time="2025-12-16T13:16:36.477379569Z" level=info msg="Starting up" Dec 16 13:16:36.478029 dockerd[2077]: time="2025-12-16T13:16:36.478004237Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 16 13:16:36.490092 dockerd[2077]: time="2025-12-16T13:16:36.490053646Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 16 13:16:36.554022 dockerd[2077]: time="2025-12-16T13:16:36.553822989Z" level=info msg="Loading containers: start." Dec 16 13:16:36.587524 kernel: Initializing XFRM netlink socket Dec 16 13:16:37.115845 systemd-networkd[1596]: docker0: Link UP Dec 16 13:16:37.126964 dockerd[2077]: time="2025-12-16T13:16:37.126887617Z" level=info msg="Loading containers: done." Dec 16 13:16:37.152897 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck862498382-merged.mount: Deactivated successfully. Dec 16 13:16:37.173644 dockerd[2077]: time="2025-12-16T13:16:37.173516068Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 16 13:16:37.173836 dockerd[2077]: time="2025-12-16T13:16:37.173704250Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 16 13:16:37.173894 dockerd[2077]: time="2025-12-16T13:16:37.173875542Z" level=info msg="Initializing buildkit" Dec 16 13:16:37.237014 dockerd[2077]: time="2025-12-16T13:16:37.236923654Z" level=info msg="Completed buildkit initialization" Dec 16 13:16:37.248748 dockerd[2077]: time="2025-12-16T13:16:37.248618363Z" level=info msg="Daemon has completed initialization" Dec 16 13:16:37.248937 dockerd[2077]: time="2025-12-16T13:16:37.248836177Z" level=info msg="API listen on /run/docker.sock" Dec 16 13:16:37.249145 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 16 13:16:39.074915 containerd[1785]: time="2025-12-16T13:16:39.074853039Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\"" Dec 16 13:16:39.845047 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount541687540.mount: Deactivated successfully. 
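The PullImage lines above and below come from containerd's CRI plugin, so the images land in the k8s.io containerd namespace rather than in Docker's store. Reproducing a pull by hand, assuming the ctr client from the containerd package is on the PATH:

  ctr -n k8s.io images pull registry.k8s.io/kube-apiserver:v1.32.10
  ctr -n k8s.io images ls -q | grep kube-apiserver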
Dec 16 13:16:40.972087 containerd[1785]: time="2025-12-16T13:16:40.972027366Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:40.973583 containerd[1785]: time="2025-12-16T13:16:40.973543574Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.10: active requests=0, bytes read=29072281" Dec 16 13:16:40.975741 containerd[1785]: time="2025-12-16T13:16:40.975701666Z" level=info msg="ImageCreate event name:\"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:40.979275 containerd[1785]: time="2025-12-16T13:16:40.979236436Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:40.980328 containerd[1785]: time="2025-12-16T13:16:40.980294928Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.10\" with image id \"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\", size \"29068782\" in 1.905393501s" Dec 16 13:16:40.980364 containerd[1785]: time="2025-12-16T13:16:40.980331551Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\" returns image reference \"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\"" Dec 16 13:16:40.980987 containerd[1785]: time="2025-12-16T13:16:40.980963090Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\"" Dec 16 13:16:42.459014 containerd[1785]: time="2025-12-16T13:16:42.458957008Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:42.461436 containerd[1785]: time="2025-12-16T13:16:42.461405222Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.10: active requests=0, bytes read=24992030" Dec 16 13:16:42.463225 containerd[1785]: time="2025-12-16T13:16:42.463191760Z" level=info msg="ImageCreate event name:\"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:42.467030 containerd[1785]: time="2025-12-16T13:16:42.466977611Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:42.468594 containerd[1785]: time="2025-12-16T13:16:42.468560854Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.10\" with image id \"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\", size \"26649046\" in 1.487567698s" Dec 16 13:16:42.468651 containerd[1785]: time="2025-12-16T13:16:42.468597648Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\" returns image reference \"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\"" Dec 16 
13:16:42.469208 containerd[1785]: time="2025-12-16T13:16:42.469166530Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\"" Dec 16 13:16:43.724689 containerd[1785]: time="2025-12-16T13:16:43.724616231Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:43.726518 containerd[1785]: time="2025-12-16T13:16:43.726485282Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.10: active requests=0, bytes read=19404268" Dec 16 13:16:43.729614 containerd[1785]: time="2025-12-16T13:16:43.729584290Z" level=info msg="ImageCreate event name:\"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:43.733151 containerd[1785]: time="2025-12-16T13:16:43.733122241Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:43.733864 containerd[1785]: time="2025-12-16T13:16:43.733842365Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.10\" with image id \"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\", size \"21061302\" in 1.264617774s" Dec 16 13:16:43.733906 containerd[1785]: time="2025-12-16T13:16:43.733871461Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\" returns image reference \"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\"" Dec 16 13:16:43.734299 containerd[1785]: time="2025-12-16T13:16:43.734277411Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\"" Dec 16 13:16:44.703312 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount552881311.mount: Deactivated successfully. 
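Unit names like var-lib-containerd-tmpmounts-containerd\x2dmount552881311.mount use systemd's path escaping: "/" becomes "-" and a literal "-" becomes \x2d. The mapping can be reproduced both ways with systemd-escape:

  # path -> unit name component
  systemd-escape -p /var/lib/containerd/tmpmounts/containerd-mount552881311
  # unit name component -> path
  systemd-escape -u -p 'var-lib-containerd-tmpmounts-containerd\x2dmount552881311'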
Dec 16 13:16:45.015041 containerd[1785]: time="2025-12-16T13:16:45.014638797Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:45.016815 containerd[1785]: time="2025-12-16T13:16:45.016778072Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.10: active requests=0, bytes read=31161449" Dec 16 13:16:45.019087 containerd[1785]: time="2025-12-16T13:16:45.019060663Z" level=info msg="ImageCreate event name:\"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:45.022199 containerd[1785]: time="2025-12-16T13:16:45.022166145Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:45.022649 containerd[1785]: time="2025-12-16T13:16:45.022625515Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.10\" with image id \"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\", repo tag \"registry.k8s.io/kube-proxy:v1.32.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\", size \"31160442\" in 1.288320979s" Dec 16 13:16:45.022692 containerd[1785]: time="2025-12-16T13:16:45.022653212Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\" returns image reference \"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\"" Dec 16 13:16:45.023220 containerd[1785]: time="2025-12-16T13:16:45.023121376Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Dec 16 13:16:45.244809 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 16 13:16:45.248165 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:16:45.495419 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:16:45.500813 (kubelet)[2381]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:16:45.544692 kubelet[2381]: E1216 13:16:45.544587 2381 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:16:45.546931 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:16:45.547068 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:16:45.547372 systemd[1]: kubelet.service: Consumed 249ms CPU time, 110.2M memory peak. Dec 16 13:16:45.748818 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount88806735.mount: Deactivated successfully. 
Dec 16 13:16:46.486012 containerd[1785]: time="2025-12-16T13:16:46.485954898Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:46.487186 containerd[1785]: time="2025-12-16T13:16:46.487120536Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565333" Dec 16 13:16:46.488878 containerd[1785]: time="2025-12-16T13:16:46.488826174Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:46.492100 containerd[1785]: time="2025-12-16T13:16:46.492074589Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:46.492891 containerd[1785]: time="2025-12-16T13:16:46.492783545Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.469631633s" Dec 16 13:16:46.492891 containerd[1785]: time="2025-12-16T13:16:46.492811042Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Dec 16 13:16:46.493240 containerd[1785]: time="2025-12-16T13:16:46.493215948Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Dec 16 13:16:47.048657 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3299944348.mount: Deactivated successfully. 
Dec 16 13:16:47.058134 containerd[1785]: time="2025-12-16T13:16:47.058041929Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 13:16:47.059867 containerd[1785]: time="2025-12-16T13:16:47.059799213Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321158" Dec 16 13:16:47.062739 containerd[1785]: time="2025-12-16T13:16:47.062662838Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 13:16:47.069054 containerd[1785]: time="2025-12-16T13:16:47.068933784Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 13:16:47.070426 containerd[1785]: time="2025-12-16T13:16:47.070331710Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 577.068192ms" Dec 16 13:16:47.070426 containerd[1785]: time="2025-12-16T13:16:47.070398708Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Dec 16 13:16:47.071192 containerd[1785]: time="2025-12-16T13:16:47.071079577Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Dec 16 13:16:47.754844 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount931757304.mount: Deactivated successfully. 
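
Each "Pulled image ... in <duration>" entry pairs with a "bytes read" counter from the matching "stop pulling image" entry, so pull throughput can be recovered from the log alone. A small Go sketch using the coredns and pause figures above; note the byte counts are bytes read off the wire, so this is transfer rate, not unpacked image size:

package main

import (
	"fmt"
	"time"
)

// Byte counts and durations copied verbatim from the two pulls above.
func main() {
	pulls := []struct {
		image string
		bytes int64
		took  time.Duration
	}{
		{"registry.k8s.io/coredns/coredns:v1.11.3", 18565333, 1469631633 * time.Nanosecond},
		{"registry.k8s.io/pause:3.10", 321158, 577068192 * time.Nanosecond},
	}
	for _, p := range pulls {
		mibps := float64(p.bytes) / p.took.Seconds() / (1 << 20)
		fmt.Printf("%s: %.2f MiB/s\n", p.image, mibps)
	}
}
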
Dec 16 13:16:49.209579 containerd[1785]: time="2025-12-16T13:16:49.209530813Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:49.210668 containerd[1785]: time="2025-12-16T13:16:49.210636151Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682130" Dec 16 13:16:49.212645 containerd[1785]: time="2025-12-16T13:16:49.212623439Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:49.215606 containerd[1785]: time="2025-12-16T13:16:49.215584387Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:16:49.216564 containerd[1785]: time="2025-12-16T13:16:49.216530382Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.145376881s" Dec 16 13:16:49.216680 containerd[1785]: time="2025-12-16T13:16:49.216663859Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Dec 16 13:16:52.915949 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:16:52.916411 systemd[1]: kubelet.service: Consumed 249ms CPU time, 110.2M memory peak. Dec 16 13:16:52.918753 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:16:52.945819 systemd[1]: Reload requested from client PID 2540 ('systemctl') (unit session-7.scope)... Dec 16 13:16:52.945943 systemd[1]: Reloading... Dec 16 13:16:53.009606 zram_generator::config[2587]: No configuration found. Dec 16 13:16:53.200757 systemd[1]: Reloading finished in 254 ms. Dec 16 13:16:53.261924 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 16 13:16:53.262002 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 16 13:16:53.262272 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:16:53.262338 systemd[1]: kubelet.service: Consumed 110ms CPU time, 98.3M memory peak. Dec 16 13:16:53.263919 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:16:53.490222 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:16:53.494324 (kubelet)[2638]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 13:16:53.530688 kubelet[2638]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 13:16:53.530688 kubelet[2638]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 13:16:53.530688 kubelet[2638]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 13:16:53.531090 kubelet[2638]: I1216 13:16:53.530732 2638 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 13:16:54.040061 kubelet[2638]: I1216 13:16:54.039940 2638 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Dec 16 13:16:54.040061 kubelet[2638]: I1216 13:16:54.039997 2638 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 13:16:54.040639 kubelet[2638]: I1216 13:16:54.040595 2638 server.go:954] "Client rotation is on, will bootstrap in background" Dec 16 13:16:54.091973 kubelet[2638]: E1216 13:16:54.091864 2638 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.23.154:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.23.154:6443: connect: connection refused" logger="UnhandledError" Dec 16 13:16:54.097791 kubelet[2638]: I1216 13:16:54.097718 2638 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 13:16:54.116054 kubelet[2638]: I1216 13:16:54.115962 2638 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 13:16:54.129017 kubelet[2638]: I1216 13:16:54.128942 2638 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Dec 16 13:16:54.129732 kubelet[2638]: I1216 13:16:54.129641 2638 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 13:16:54.130117 kubelet[2638]: I1216 13:16:54.129722 2638 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-0-839c7337fa","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 13:16:54.132051 kubelet[2638]: I1216 
13:16:54.131979 2638 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 13:16:54.132339 kubelet[2638]: I1216 13:16:54.132306 2638 container_manager_linux.go:304] "Creating device plugin manager" Dec 16 13:16:54.132759 kubelet[2638]: I1216 13:16:54.132696 2638 state_mem.go:36] "Initialized new in-memory state store" Dec 16 13:16:54.145183 kubelet[2638]: I1216 13:16:54.145101 2638 kubelet.go:446] "Attempting to sync node with API server" Dec 16 13:16:54.145183 kubelet[2638]: I1216 13:16:54.145191 2638 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 13:16:54.145362 kubelet[2638]: I1216 13:16:54.145250 2638 kubelet.go:352] "Adding apiserver pod source" Dec 16 13:16:54.145362 kubelet[2638]: I1216 13:16:54.145277 2638 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 13:16:54.149282 kubelet[2638]: W1216 13:16:54.149197 2638 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.23.154:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.23.154:6443: connect: connection refused Dec 16 13:16:54.149369 kubelet[2638]: E1216 13:16:54.149291 2638 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.23.154:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.23.154:6443: connect: connection refused" logger="UnhandledError" Dec 16 13:16:54.152863 kubelet[2638]: W1216 13:16:54.152804 2638 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.23.154:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-0-839c7337fa&limit=500&resourceVersion=0": dial tcp 10.0.23.154:6443: connect: connection refused Dec 16 13:16:54.152941 kubelet[2638]: E1216 13:16:54.152865 2638 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.23.154:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-0-839c7337fa&limit=500&resourceVersion=0\": dial tcp 10.0.23.154:6443: connect: connection refused" logger="UnhandledError" Dec 16 13:16:54.155815 kubelet[2638]: I1216 13:16:54.155773 2638 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 16 13:16:54.156497 kubelet[2638]: I1216 13:16:54.156394 2638 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 16 13:16:54.156497 kubelet[2638]: W1216 13:16:54.156497 2638 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
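
The reflector and certificate-signing errors in this stretch all share one cause: this kubelet starts before the kube-apiserver static pod it is about to create, so every request to 10.0.23.154:6443 is refused. A plain TCP probe in stdlib Go, with the address copied from the log, reproduces the identical "connect: connection refused":

package main

import (
	"fmt"
	"net"
	"time"
)

// Dial the apiserver endpoint the kubelet is failing against above.
// During the window in this log, this prints "connection refused";
// once the static pod is up, the dial succeeds.
func main() {
	conn, err := net.DialTimeout("tcp", "10.0.23.154:6443", 2*time.Second)
	if err != nil {
		fmt.Println("apiserver not up yet:", err)
		return
	}
	conn.Close()
	fmt.Println("apiserver endpoint is accepting connections")
}
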
Dec 16 13:16:54.160127 kubelet[2638]: I1216 13:16:54.160037 2638 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 16 13:16:54.160127 kubelet[2638]: I1216 13:16:54.160097 2638 server.go:1287] "Started kubelet" Dec 16 13:16:54.160488 kubelet[2638]: I1216 13:16:54.160420 2638 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 13:16:54.162103 kubelet[2638]: I1216 13:16:54.162075 2638 server.go:479] "Adding debug handlers to kubelet server" Dec 16 13:16:54.167461 kubelet[2638]: I1216 13:16:54.167394 2638 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 13:16:54.167551 kubelet[2638]: I1216 13:16:54.167534 2638 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 16 13:16:54.168727 kubelet[2638]: I1216 13:16:54.167557 2638 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 13:16:54.168727 kubelet[2638]: I1216 13:16:54.167703 2638 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 16 13:16:54.168727 kubelet[2638]: I1216 13:16:54.167812 2638 reconciler.go:26] "Reconciler: start to sync state" Dec 16 13:16:54.168727 kubelet[2638]: W1216 13:16:54.167937 2638 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.23.154:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.23.154:6443: connect: connection refused Dec 16 13:16:54.168727 kubelet[2638]: E1216 13:16:54.167979 2638 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.23.154:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.23.154:6443: connect: connection refused" logger="UnhandledError" Dec 16 13:16:54.168727 kubelet[2638]: E1216 13:16:54.168231 2638 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-0-839c7337fa\" not found" Dec 16 13:16:54.168727 kubelet[2638]: E1216 13:16:54.168314 2638 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.23.154:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-0-839c7337fa?timeout=10s\": dial tcp 10.0.23.154:6443: connect: connection refused" interval="200ms" Dec 16 13:16:54.168727 kubelet[2638]: I1216 13:16:54.168636 2638 factory.go:221] Registration of the systemd container factory successfully Dec 16 13:16:54.168727 kubelet[2638]: I1216 13:16:54.168719 2638 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 13:16:54.169647 kubelet[2638]: I1216 13:16:54.169582 2638 factory.go:221] Registration of the containerd container factory successfully Dec 16 13:16:54.170254 kubelet[2638]: E1216 13:16:54.170195 2638 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 13:16:54.170663 kubelet[2638]: I1216 13:16:54.170518 2638 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 13:16:54.170987 kubelet[2638]: I1216 13:16:54.170937 2638 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 13:16:54.184696 kubelet[2638]: E1216 13:16:54.182323 2638 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.23.154:6443/api/v1/namespaces/default/events\": dial tcp 10.0.23.154:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-0-839c7337fa.1881b48573109361 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-0-839c7337fa,UID:ci-4459-2-2-0-839c7337fa,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-0-839c7337fa,},FirstTimestamp:2025-12-16 13:16:54.160061281 +0000 UTC m=+0.662264244,LastTimestamp:2025-12-16 13:16:54.160061281 +0000 UTC m=+0.662264244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-0-839c7337fa,}" Dec 16 13:16:54.187377 kubelet[2638]: I1216 13:16:54.187333 2638 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 13:16:54.187377 kubelet[2638]: I1216 13:16:54.187351 2638 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 13:16:54.187377 kubelet[2638]: I1216 13:16:54.187370 2638 state_mem.go:36] "Initialized new in-memory state store" Dec 16 13:16:54.190638 kubelet[2638]: I1216 13:16:54.190609 2638 policy_none.go:49] "None policy: Start" Dec 16 13:16:54.190638 kubelet[2638]: I1216 13:16:54.190630 2638 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 16 13:16:54.190638 kubelet[2638]: I1216 13:16:54.190641 2638 state_mem.go:35] "Initializing new in-memory state store" Dec 16 13:16:54.197586 kubelet[2638]: I1216 13:16:54.197435 2638 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 16 13:16:54.198913 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 16 13:16:54.199594 kubelet[2638]: I1216 13:16:54.199563 2638 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 16 13:16:54.199732 kubelet[2638]: I1216 13:16:54.199718 2638 status_manager.go:227] "Starting to sync pod status with apiserver" Dec 16 13:16:54.199813 kubelet[2638]: I1216 13:16:54.199803 2638 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
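
The "Failed to ensure lease exists, will retry" entries back off geometrically: interval="200ms" here, then "400ms" and "800ms" further down. A sketch of that doubling schedule follows; the ceiling is an assumption for illustration only, since the log shows just the first three steps:

package main

import (
	"fmt"
	"time"
)

// Reproduce the visible retry progression: 200ms, 400ms, 800ms, ...
// maxInterval is NOT taken from the log; it is an assumed cap.
func main() {
	interval := 200 * time.Millisecond
	const maxInterval = 7 * time.Second
	for i := 0; i < 6; i++ {
		fmt.Printf("retry %d after %v\n", i+1, interval)
		interval *= 2
		if interval > maxInterval {
			interval = maxInterval
		}
	}
}
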
Dec 16 13:16:54.199874 kubelet[2638]: I1216 13:16:54.199866 2638 kubelet.go:2382] "Starting kubelet main sync loop" Dec 16 13:16:54.200012 kubelet[2638]: E1216 13:16:54.199992 2638 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 13:16:54.200684 kubelet[2638]: W1216 13:16:54.200615 2638 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.23.154:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.23.154:6443: connect: connection refused Dec 16 13:16:54.200989 kubelet[2638]: E1216 13:16:54.200949 2638 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.23.154:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.23.154:6443: connect: connection refused" logger="UnhandledError" Dec 16 13:16:54.210758 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 16 13:16:54.214073 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 16 13:16:54.237541 kubelet[2638]: I1216 13:16:54.237509 2638 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 16 13:16:54.237826 kubelet[2638]: I1216 13:16:54.237729 2638 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 13:16:54.237826 kubelet[2638]: I1216 13:16:54.237746 2638 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 13:16:54.238027 kubelet[2638]: I1216 13:16:54.237973 2638 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 13:16:54.238992 kubelet[2638]: E1216 13:16:54.238977 2638 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 16 13:16:54.239104 kubelet[2638]: E1216 13:16:54.239095 2638 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-0-839c7337fa\" not found" Dec 16 13:16:54.312633 systemd[1]: Created slice kubepods-burstable-pod69e0b346028da1b01568a31cd03b8a3d.slice - libcontainer container kubepods-burstable-pod69e0b346028da1b01568a31cd03b8a3d.slice. Dec 16 13:16:54.333654 kubelet[2638]: E1216 13:16:54.333592 2638 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-839c7337fa\" not found" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.336854 systemd[1]: Created slice kubepods-burstable-pod7c4ac50a3f19f917a969516fa759df46.slice - libcontainer container kubepods-burstable-pod7c4ac50a3f19f917a969516fa759df46.slice. 
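
With the systemd cgroup driver, the pod slices created above follow the pattern kubepods-<qos>-pod<uid>.slice, with dashes in the pod UID escaped to underscores (compare the kube-proxy slice near the end of this log, where UID 000c58a2-cbe6-4cca-a252-ca41b925284a becomes ...pod000c58a2_cbe6_4cca_a252_ca41b925284a.slice). A sketch of that mapping for the two QoS classes that appear here:

package main

import (
	"fmt"
	"strings"
)

// Build the systemd slice name for a pod: QoS class segment plus the
// pod UID with "-" escaped to "_", matching the units in this log.
func podSlice(qos, uid string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qos, strings.ReplaceAll(uid, "-", "_"))
}

func main() {
	// Static-pod UIDs like the one below are dash-free hashes, so they
	// pass through unchanged; regular pod UIDs get escaped.
	fmt.Println(podSlice("burstable", "7c4ac50a3f19f917a969516fa759df46"))
	fmt.Println(podSlice("besteffort", "000c58a2-cbe6-4cca-a252-ca41b925284a"))
}
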
Dec 16 13:16:54.339290 kubelet[2638]: I1216 13:16:54.339275 2638 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.339626 kubelet[2638]: E1216 13:16:54.339607 2638 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.23.154:6443/api/v1/nodes\": dial tcp 10.0.23.154:6443: connect: connection refused" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.357957 kubelet[2638]: E1216 13:16:54.357900 2638 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-839c7337fa\" not found" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.360154 systemd[1]: Created slice kubepods-burstable-pod89e4f6f6d04d9cfdca1ff92d5fd4384c.slice - libcontainer container kubepods-burstable-pod89e4f6f6d04d9cfdca1ff92d5fd4384c.slice. Dec 16 13:16:54.362376 kubelet[2638]: E1216 13:16:54.362342 2638 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-839c7337fa\" not found" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.369442 kubelet[2638]: E1216 13:16:54.369363 2638 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.23.154:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-0-839c7337fa?timeout=10s\": dial tcp 10.0.23.154:6443: connect: connection refused" interval="400ms" Dec 16 13:16:54.470036 kubelet[2638]: I1216 13:16:54.469939 2638 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/69e0b346028da1b01568a31cd03b8a3d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-0-839c7337fa\" (UID: \"69e0b346028da1b01568a31cd03b8a3d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.470036 kubelet[2638]: I1216 13:16:54.470017 2638 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/89e4f6f6d04d9cfdca1ff92d5fd4384c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-0-839c7337fa\" (UID: \"89e4f6f6d04d9cfdca1ff92d5fd4384c\") " pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.470387 kubelet[2638]: I1216 13:16:54.470058 2638 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/69e0b346028da1b01568a31cd03b8a3d-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-0-839c7337fa\" (UID: \"69e0b346028da1b01568a31cd03b8a3d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.470387 kubelet[2638]: I1216 13:16:54.470091 2638 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/69e0b346028da1b01568a31cd03b8a3d-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-0-839c7337fa\" (UID: \"69e0b346028da1b01568a31cd03b8a3d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.470387 kubelet[2638]: I1216 13:16:54.470123 2638 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7c4ac50a3f19f917a969516fa759df46-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-0-839c7337fa\" (UID: 
\"7c4ac50a3f19f917a969516fa759df46\") " pod="kube-system/kube-scheduler-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.470387 kubelet[2638]: I1216 13:16:54.470154 2638 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/89e4f6f6d04d9cfdca1ff92d5fd4384c-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-0-839c7337fa\" (UID: \"89e4f6f6d04d9cfdca1ff92d5fd4384c\") " pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.470387 kubelet[2638]: I1216 13:16:54.470182 2638 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/89e4f6f6d04d9cfdca1ff92d5fd4384c-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-0-839c7337fa\" (UID: \"89e4f6f6d04d9cfdca1ff92d5fd4384c\") " pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.470631 kubelet[2638]: I1216 13:16:54.470211 2638 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/69e0b346028da1b01568a31cd03b8a3d-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-0-839c7337fa\" (UID: \"69e0b346028da1b01568a31cd03b8a3d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.470631 kubelet[2638]: I1216 13:16:54.470250 2638 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/69e0b346028da1b01568a31cd03b8a3d-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-0-839c7337fa\" (UID: \"69e0b346028da1b01568a31cd03b8a3d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.542924 kubelet[2638]: I1216 13:16:54.542876 2638 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.543712 kubelet[2638]: E1216 13:16:54.543659 2638 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.23.154:6443/api/v1/nodes\": dial tcp 10.0.23.154:6443: connect: connection refused" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:54.635866 containerd[1785]: time="2025-12-16T13:16:54.635708721Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-0-839c7337fa,Uid:69e0b346028da1b01568a31cd03b8a3d,Namespace:kube-system,Attempt:0,}" Dec 16 13:16:54.659669 containerd[1785]: time="2025-12-16T13:16:54.659579876Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-0-839c7337fa,Uid:7c4ac50a3f19f917a969516fa759df46,Namespace:kube-system,Attempt:0,}" Dec 16 13:16:54.663800 containerd[1785]: time="2025-12-16T13:16:54.663736771Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-0-839c7337fa,Uid:89e4f6f6d04d9cfdca1ff92d5fd4384c,Namespace:kube-system,Attempt:0,}" Dec 16 13:16:54.671276 containerd[1785]: time="2025-12-16T13:16:54.671187764Z" level=info msg="connecting to shim 050870786c83145f5293642d475ac449b3a8d49bbaa94303b0806d679bf40804" address="unix:///run/containerd/s/ba5d5c180b9daf558038cc3c17135656cfab956b2373d7e34ea5e1b0de01967d" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:16:54.695906 containerd[1785]: time="2025-12-16T13:16:54.695831610Z" level=info msg="connecting to shim 19a1a07f133317e4fa3379d926bdd9ac36dc6b96e5272ff63196405b8ce0d6e7" 
address="unix:///run/containerd/s/88e9592e2cd695c0fce9e98f49b26a69f1214de7f315c0cef6e3714f3a72f296" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:16:54.702257 containerd[1785]: time="2025-12-16T13:16:54.702195577Z" level=info msg="connecting to shim a9f24174fb545727345a1b4a8f99dd4fbdb5f58911b9169123ba1ee5411cc699" address="unix:///run/containerd/s/7bf5a277c3e3a319f85243421334bcaa8b631ae4d40017d163c06e6dd5dfef69" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:16:54.713699 systemd[1]: Started cri-containerd-050870786c83145f5293642d475ac449b3a8d49bbaa94303b0806d679bf40804.scope - libcontainer container 050870786c83145f5293642d475ac449b3a8d49bbaa94303b0806d679bf40804. Dec 16 13:16:54.716787 systemd[1]: Started cri-containerd-19a1a07f133317e4fa3379d926bdd9ac36dc6b96e5272ff63196405b8ce0d6e7.scope - libcontainer container 19a1a07f133317e4fa3379d926bdd9ac36dc6b96e5272ff63196405b8ce0d6e7. Dec 16 13:16:54.722765 systemd[1]: Started cri-containerd-a9f24174fb545727345a1b4a8f99dd4fbdb5f58911b9169123ba1ee5411cc699.scope - libcontainer container a9f24174fb545727345a1b4a8f99dd4fbdb5f58911b9169123ba1ee5411cc699. Dec 16 13:16:54.770210 kubelet[2638]: E1216 13:16:54.770171 2638 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.23.154:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-0-839c7337fa?timeout=10s\": dial tcp 10.0.23.154:6443: connect: connection refused" interval="800ms" Dec 16 13:16:54.771404 containerd[1785]: time="2025-12-16T13:16:54.771371486Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-0-839c7337fa,Uid:69e0b346028da1b01568a31cd03b8a3d,Namespace:kube-system,Attempt:0,} returns sandbox id \"050870786c83145f5293642d475ac449b3a8d49bbaa94303b0806d679bf40804\"" Dec 16 13:16:54.773384 containerd[1785]: time="2025-12-16T13:16:54.773351775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-0-839c7337fa,Uid:89e4f6f6d04d9cfdca1ff92d5fd4384c,Namespace:kube-system,Attempt:0,} returns sandbox id \"a9f24174fb545727345a1b4a8f99dd4fbdb5f58911b9169123ba1ee5411cc699\"" Dec 16 13:16:54.774736 containerd[1785]: time="2025-12-16T13:16:54.774710330Z" level=info msg="CreateContainer within sandbox \"050870786c83145f5293642d475ac449b3a8d49bbaa94303b0806d679bf40804\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 16 13:16:54.775671 containerd[1785]: time="2025-12-16T13:16:54.775649880Z" level=info msg="CreateContainer within sandbox \"a9f24174fb545727345a1b4a8f99dd4fbdb5f58911b9169123ba1ee5411cc699\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 16 13:16:54.777048 containerd[1785]: time="2025-12-16T13:16:54.777022262Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-0-839c7337fa,Uid:7c4ac50a3f19f917a969516fa759df46,Namespace:kube-system,Attempt:0,} returns sandbox id \"19a1a07f133317e4fa3379d926bdd9ac36dc6b96e5272ff63196405b8ce0d6e7\"" Dec 16 13:16:54.779495 containerd[1785]: time="2025-12-16T13:16:54.778930591Z" level=info msg="CreateContainer within sandbox \"19a1a07f133317e4fa3379d926bdd9ac36dc6b96e5272ff63196405b8ce0d6e7\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 16 13:16:54.801612 containerd[1785]: time="2025-12-16T13:16:54.801570930Z" level=info msg="Container c28a017a85dcef5902b09fa32c716873dcc3bcdda2cb421a3bad1405c05e7828: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:16:54.805262 containerd[1785]: 
time="2025-12-16T13:16:54.805231936Z" level=info msg="Container e686cfb173f2bfeea9cc1f8b91e1880892d6a1d8dc868d71a281ea9e9b81320e: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:16:54.816488 containerd[1785]: time="2025-12-16T13:16:54.816365552Z" level=info msg="CreateContainer within sandbox \"a9f24174fb545727345a1b4a8f99dd4fbdb5f58911b9169123ba1ee5411cc699\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"c28a017a85dcef5902b09fa32c716873dcc3bcdda2cb421a3bad1405c05e7828\"" Dec 16 13:16:54.817432 containerd[1785]: time="2025-12-16T13:16:54.817370062Z" level=info msg="StartContainer for \"c28a017a85dcef5902b09fa32c716873dcc3bcdda2cb421a3bad1405c05e7828\"" Dec 16 13:16:54.819201 containerd[1785]: time="2025-12-16T13:16:54.819152804Z" level=info msg="connecting to shim c28a017a85dcef5902b09fa32c716873dcc3bcdda2cb421a3bad1405c05e7828" address="unix:///run/containerd/s/7bf5a277c3e3a319f85243421334bcaa8b631ae4d40017d163c06e6dd5dfef69" protocol=ttrpc version=3 Dec 16 13:16:54.820093 containerd[1785]: time="2025-12-16T13:16:54.820049687Z" level=info msg="Container 2aadfad751c96ed258b6c96f21aa44ff7b11f3ed3904f2cd8a8a8c8e0ff2f34a: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:16:54.827086 containerd[1785]: time="2025-12-16T13:16:54.826994809Z" level=info msg="CreateContainer within sandbox \"050870786c83145f5293642d475ac449b3a8d49bbaa94303b0806d679bf40804\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"e686cfb173f2bfeea9cc1f8b91e1880892d6a1d8dc868d71a281ea9e9b81320e\"" Dec 16 13:16:54.827487 containerd[1785]: time="2025-12-16T13:16:54.827466011Z" level=info msg="StartContainer for \"e686cfb173f2bfeea9cc1f8b91e1880892d6a1d8dc868d71a281ea9e9b81320e\"" Dec 16 13:16:54.828638 containerd[1785]: time="2025-12-16T13:16:54.828613402Z" level=info msg="connecting to shim e686cfb173f2bfeea9cc1f8b91e1880892d6a1d8dc868d71a281ea9e9b81320e" address="unix:///run/containerd/s/ba5d5c180b9daf558038cc3c17135656cfab956b2373d7e34ea5e1b0de01967d" protocol=ttrpc version=3 Dec 16 13:16:54.835770 containerd[1785]: time="2025-12-16T13:16:54.835720498Z" level=info msg="CreateContainer within sandbox \"19a1a07f133317e4fa3379d926bdd9ac36dc6b96e5272ff63196405b8ce0d6e7\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"2aadfad751c96ed258b6c96f21aa44ff7b11f3ed3904f2cd8a8a8c8e0ff2f34a\"" Dec 16 13:16:54.837091 containerd[1785]: time="2025-12-16T13:16:54.837068301Z" level=info msg="StartContainer for \"2aadfad751c96ed258b6c96f21aa44ff7b11f3ed3904f2cd8a8a8c8e0ff2f34a\"" Dec 16 13:16:54.838044 containerd[1785]: time="2025-12-16T13:16:54.837915116Z" level=info msg="connecting to shim 2aadfad751c96ed258b6c96f21aa44ff7b11f3ed3904f2cd8a8a8c8e0ff2f34a" address="unix:///run/containerd/s/88e9592e2cd695c0fce9e98f49b26a69f1214de7f315c0cef6e3714f3a72f296" protocol=ttrpc version=3 Dec 16 13:16:54.850174 systemd[1]: Started cri-containerd-c28a017a85dcef5902b09fa32c716873dcc3bcdda2cb421a3bad1405c05e7828.scope - libcontainer container c28a017a85dcef5902b09fa32c716873dcc3bcdda2cb421a3bad1405c05e7828. Dec 16 13:16:54.857124 systemd[1]: Started cri-containerd-2aadfad751c96ed258b6c96f21aa44ff7b11f3ed3904f2cd8a8a8c8e0ff2f34a.scope - libcontainer container 2aadfad751c96ed258b6c96f21aa44ff7b11f3ed3904f2cd8a8a8c8e0ff2f34a. Dec 16 13:16:54.859352 systemd[1]: Started cri-containerd-e686cfb173f2bfeea9cc1f8b91e1880892d6a1d8dc868d71a281ea9e9b81320e.scope - libcontainer container e686cfb173f2bfeea9cc1f8b91e1880892d6a1d8dc868d71a281ea9e9b81320e. 
Dec 16 13:16:54.917145 containerd[1785]: time="2025-12-16T13:16:54.916975146Z" level=info msg="StartContainer for \"e686cfb173f2bfeea9cc1f8b91e1880892d6a1d8dc868d71a281ea9e9b81320e\" returns successfully" Dec 16 13:16:54.917836 containerd[1785]: time="2025-12-16T13:16:54.917797822Z" level=info msg="StartContainer for \"c28a017a85dcef5902b09fa32c716873dcc3bcdda2cb421a3bad1405c05e7828\" returns successfully" Dec 16 13:16:54.918705 containerd[1785]: time="2025-12-16T13:16:54.918619453Z" level=info msg="StartContainer for \"2aadfad751c96ed258b6c96f21aa44ff7b11f3ed3904f2cd8a8a8c8e0ff2f34a\" returns successfully" Dec 16 13:16:54.946042 kubelet[2638]: I1216 13:16:54.946018 2638 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:55.214853 kubelet[2638]: E1216 13:16:55.214765 2638 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-839c7337fa\" not found" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:55.215149 kubelet[2638]: E1216 13:16:55.215110 2638 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-839c7337fa\" not found" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:55.217614 kubelet[2638]: E1216 13:16:55.217591 2638 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-0-839c7337fa\" not found" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:55.699692 update_engine[1765]: I20251216 13:16:55.699563 1765 update_attempter.cc:509] Updating boot flags... Dec 16 13:16:56.052789 kubelet[2638]: E1216 13:16:56.052744 2638 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-2-0-839c7337fa\" not found" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:56.147590 kubelet[2638]: I1216 13:16:56.146423 2638 apiserver.go:52] "Watching apiserver" Dec 16 13:16:56.149191 kubelet[2638]: I1216 13:16:56.149150 2638 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:56.149191 kubelet[2638]: E1216 13:16:56.149188 2638 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4459-2-2-0-839c7337fa\": node \"ci-4459-2-2-0-839c7337fa\" not found" Dec 16 13:16:56.167898 kubelet[2638]: I1216 13:16:56.167843 2638 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 16 13:16:56.169430 kubelet[2638]: I1216 13:16:56.169400 2638 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:56.175949 kubelet[2638]: E1216 13:16:56.175896 2638 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-0-839c7337fa\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:56.175949 kubelet[2638]: I1216 13:16:56.175923 2638 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:56.177285 kubelet[2638]: E1216 13:16:56.177245 2638 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-0-839c7337fa\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:56.177285 kubelet[2638]: I1216 13:16:56.177268 2638 kubelet.go:3194] 
"Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:56.178586 kubelet[2638]: E1216 13:16:56.178542 2638 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-0-839c7337fa\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:56.218106 kubelet[2638]: I1216 13:16:56.218073 2638 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:56.218235 kubelet[2638]: I1216 13:16:56.218198 2638 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:56.219799 kubelet[2638]: E1216 13:16:56.219751 2638 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-0-839c7337fa\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:56.220000 kubelet[2638]: E1216 13:16:56.219970 2638 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-0-839c7337fa\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:57.219957 kubelet[2638]: I1216 13:16:57.219761 2638 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:57.219957 kubelet[2638]: I1216 13:16:57.219906 2638 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:58.484261 systemd[1]: Reload requested from client PID 2944 ('systemctl') (unit session-7.scope)... Dec 16 13:16:58.484286 systemd[1]: Reloading... Dec 16 13:16:58.575541 zram_generator::config[2987]: No configuration found. Dec 16 13:16:58.798801 systemd[1]: Reloading finished in 313 ms. Dec 16 13:16:58.832536 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:16:58.832952 kubelet[2638]: I1216 13:16:58.832625 2638 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 13:16:58.849046 systemd[1]: kubelet.service: Deactivated successfully. Dec 16 13:16:58.849318 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:16:58.849379 systemd[1]: kubelet.service: Consumed 1.216s CPU time, 136.2M memory peak. Dec 16 13:16:58.851403 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:16:59.039489 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:16:59.044552 (kubelet)[3038]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 13:16:59.079081 kubelet[3038]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 13:16:59.079081 kubelet[3038]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Dec 16 13:16:59.079081 kubelet[3038]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 13:16:59.079440 kubelet[3038]: I1216 13:16:59.079313 3038 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 13:16:59.086542 kubelet[3038]: I1216 13:16:59.086500 3038 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Dec 16 13:16:59.086542 kubelet[3038]: I1216 13:16:59.086524 3038 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 13:16:59.086790 kubelet[3038]: I1216 13:16:59.086778 3038 server.go:954] "Client rotation is on, will bootstrap in background" Dec 16 13:16:59.087922 kubelet[3038]: I1216 13:16:59.087888 3038 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 16 13:16:59.091635 kubelet[3038]: I1216 13:16:59.091593 3038 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 13:16:59.097496 kubelet[3038]: I1216 13:16:59.097470 3038 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 13:16:59.103652 kubelet[3038]: I1216 13:16:59.103598 3038 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Dec 16 13:16:59.103843 kubelet[3038]: I1216 13:16:59.103811 3038 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 13:16:59.104022 kubelet[3038]: I1216 13:16:59.103842 3038 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-0-839c7337fa","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 13:16:59.104127 kubelet[3038]: I1216 13:16:59.104027 3038 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 
13:16:59.104127 kubelet[3038]: I1216 13:16:59.104037 3038 container_manager_linux.go:304] "Creating device plugin manager" Dec 16 13:16:59.104127 kubelet[3038]: I1216 13:16:59.104090 3038 state_mem.go:36] "Initialized new in-memory state store" Dec 16 13:16:59.104285 kubelet[3038]: I1216 13:16:59.104278 3038 kubelet.go:446] "Attempting to sync node with API server" Dec 16 13:16:59.104308 kubelet[3038]: I1216 13:16:59.104297 3038 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 13:16:59.104328 kubelet[3038]: I1216 13:16:59.104319 3038 kubelet.go:352] "Adding apiserver pod source" Dec 16 13:16:59.104356 kubelet[3038]: I1216 13:16:59.104330 3038 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 13:16:59.105224 kubelet[3038]: I1216 13:16:59.105207 3038 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 16 13:16:59.105764 kubelet[3038]: I1216 13:16:59.105749 3038 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 16 13:16:59.106756 kubelet[3038]: I1216 13:16:59.106241 3038 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 16 13:16:59.106756 kubelet[3038]: I1216 13:16:59.106277 3038 server.go:1287] "Started kubelet" Dec 16 13:16:59.106756 kubelet[3038]: I1216 13:16:59.106337 3038 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 13:16:59.107325 kubelet[3038]: I1216 13:16:59.106432 3038 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 13:16:59.107408 kubelet[3038]: I1216 13:16:59.107390 3038 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 13:16:59.108111 kubelet[3038]: I1216 13:16:59.108096 3038 server.go:479] "Adding debug handlers to kubelet server" Dec 16 13:16:59.108440 kubelet[3038]: I1216 13:16:59.108316 3038 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 13:16:59.108440 kubelet[3038]: E1216 13:16:59.108413 3038 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-0-839c7337fa\" not found" Dec 16 13:16:59.109510 kubelet[3038]: I1216 13:16:59.109473 3038 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 13:16:59.109563 kubelet[3038]: E1216 13:16:59.109514 3038 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 13:16:59.110397 kubelet[3038]: I1216 13:16:59.110382 3038 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 16 13:16:59.110678 kubelet[3038]: I1216 13:16:59.110665 3038 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 16 13:16:59.110882 kubelet[3038]: I1216 13:16:59.110872 3038 reconciler.go:26] "Reconciler: start to sync state" Dec 16 13:16:59.111028 kubelet[3038]: I1216 13:16:59.111003 3038 factory.go:221] Registration of the systemd container factory successfully Dec 16 13:16:59.111195 kubelet[3038]: I1216 13:16:59.111175 3038 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 13:16:59.113460 kubelet[3038]: I1216 13:16:59.112481 3038 factory.go:221] Registration of the containerd container factory successfully Dec 16 13:16:59.126306 kubelet[3038]: I1216 13:16:59.126251 3038 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 16 13:16:59.127820 kubelet[3038]: I1216 13:16:59.127799 3038 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 16 13:16:59.127898 kubelet[3038]: I1216 13:16:59.127832 3038 status_manager.go:227] "Starting to sync pod status with apiserver" Dec 16 13:16:59.127898 kubelet[3038]: I1216 13:16:59.127851 3038 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 13:16:59.127898 kubelet[3038]: I1216 13:16:59.127859 3038 kubelet.go:2382] "Starting kubelet main sync loop" Dec 16 13:16:59.127962 kubelet[3038]: E1216 13:16:59.127902 3038 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 13:16:59.145505 kubelet[3038]: I1216 13:16:59.145474 3038 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 13:16:59.145505 kubelet[3038]: I1216 13:16:59.145491 3038 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 13:16:59.145505 kubelet[3038]: I1216 13:16:59.145512 3038 state_mem.go:36] "Initialized new in-memory state store" Dec 16 13:16:59.145723 kubelet[3038]: I1216 13:16:59.145657 3038 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 16 13:16:59.145723 kubelet[3038]: I1216 13:16:59.145666 3038 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 16 13:16:59.145723 kubelet[3038]: I1216 13:16:59.145683 3038 policy_none.go:49] "None policy: Start" Dec 16 13:16:59.145723 kubelet[3038]: I1216 13:16:59.145692 3038 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 16 13:16:59.145723 kubelet[3038]: I1216 13:16:59.145701 3038 state_mem.go:35] "Initializing new in-memory state store" Dec 16 13:16:59.145833 kubelet[3038]: I1216 13:16:59.145810 3038 state_mem.go:75] "Updated machine memory state" Dec 16 13:16:59.149709 kubelet[3038]: I1216 13:16:59.149103 3038 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 16 13:16:59.149709 kubelet[3038]: I1216 13:16:59.149274 3038 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 13:16:59.149709 kubelet[3038]: I1216 13:16:59.149286 3038 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 13:16:59.149709 
kubelet[3038]: I1216 13:16:59.149471 3038 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 13:16:59.150258 kubelet[3038]: E1216 13:16:59.150245 3038 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 16 13:16:59.229505 kubelet[3038]: I1216 13:16:59.229442 3038 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.229706 kubelet[3038]: I1216 13:16:59.229686 3038 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.229866 kubelet[3038]: I1216 13:16:59.229588 3038 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.240093 kubelet[3038]: E1216 13:16:59.240042 3038 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-0-839c7337fa\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.240093 kubelet[3038]: E1216 13:16:59.240055 3038 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-0-839c7337fa\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.253236 kubelet[3038]: I1216 13:16:59.253204 3038 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.262052 kubelet[3038]: I1216 13:16:59.262016 3038 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.262226 kubelet[3038]: I1216 13:16:59.262099 3038 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.312200 kubelet[3038]: I1216 13:16:59.312138 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/69e0b346028da1b01568a31cd03b8a3d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-0-839c7337fa\" (UID: \"69e0b346028da1b01568a31cd03b8a3d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.312200 kubelet[3038]: I1216 13:16:59.312185 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7c4ac50a3f19f917a969516fa759df46-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-0-839c7337fa\" (UID: \"7c4ac50a3f19f917a969516fa759df46\") " pod="kube-system/kube-scheduler-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.312200 kubelet[3038]: I1216 13:16:59.312205 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/89e4f6f6d04d9cfdca1ff92d5fd4384c-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-0-839c7337fa\" (UID: \"89e4f6f6d04d9cfdca1ff92d5fd4384c\") " pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.312387 kubelet[3038]: I1216 13:16:59.312224 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/69e0b346028da1b01568a31cd03b8a3d-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-0-839c7337fa\" (UID: \"69e0b346028da1b01568a31cd03b8a3d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.312387 
kubelet[3038]: I1216 13:16:59.312263 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/69e0b346028da1b01568a31cd03b8a3d-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-0-839c7337fa\" (UID: \"69e0b346028da1b01568a31cd03b8a3d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.312387 kubelet[3038]: I1216 13:16:59.312320 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/69e0b346028da1b01568a31cd03b8a3d-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-0-839c7337fa\" (UID: \"69e0b346028da1b01568a31cd03b8a3d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.312387 kubelet[3038]: I1216 13:16:59.312351 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/89e4f6f6d04d9cfdca1ff92d5fd4384c-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-0-839c7337fa\" (UID: \"89e4f6f6d04d9cfdca1ff92d5fd4384c\") " pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.312387 kubelet[3038]: I1216 13:16:59.312370 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/89e4f6f6d04d9cfdca1ff92d5fd4384c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-0-839c7337fa\" (UID: \"89e4f6f6d04d9cfdca1ff92d5fd4384c\") " pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" Dec 16 13:16:59.312529 kubelet[3038]: I1216 13:16:59.312404 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/69e0b346028da1b01568a31cd03b8a3d-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-0-839c7337fa\" (UID: \"69e0b346028da1b01568a31cd03b8a3d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" Dec 16 13:17:00.105600 kubelet[3038]: I1216 13:17:00.105562 3038 apiserver.go:52] "Watching apiserver" Dec 16 13:17:00.111326 kubelet[3038]: I1216 13:17:00.111264 3038 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 16 13:17:00.164210 kubelet[3038]: I1216 13:17:00.164131 3038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-0-839c7337fa" podStartSLOduration=3.164107825 podStartE2EDuration="3.164107825s" podCreationTimestamp="2025-12-16 13:16:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:17:00.164104448 +0000 UTC m=+1.115791540" watchObservedRunningTime="2025-12-16 13:17:00.164107825 +0000 UTC m=+1.115794900" Dec 16 13:17:00.182703 kubelet[3038]: I1216 13:17:00.182641 3038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-0-839c7337fa" podStartSLOduration=3.182624228 podStartE2EDuration="3.182624228s" podCreationTimestamp="2025-12-16 13:16:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:17:00.173418279 +0000 UTC m=+1.125105368" watchObservedRunningTime="2025-12-16 13:17:00.182624228 +0000 UTC m=+1.134311307" Dec 16 
13:17:00.192666 kubelet[3038]: I1216 13:17:00.192478 3038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-0-839c7337fa" podStartSLOduration=1.192015514 podStartE2EDuration="1.192015514s" podCreationTimestamp="2025-12-16 13:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:17:00.182772377 +0000 UTC m=+1.134459450" watchObservedRunningTime="2025-12-16 13:17:00.192015514 +0000 UTC m=+1.143702602" Dec 16 13:17:03.525671 kubelet[3038]: I1216 13:17:03.525596 3038 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 16 13:17:03.526345 kubelet[3038]: I1216 13:17:03.526137 3038 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 16 13:17:03.526416 containerd[1785]: time="2025-12-16T13:17:03.525962378Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 16 13:17:04.583493 systemd[1]: Created slice kubepods-besteffort-pod000c58a2_cbe6_4cca_a252_ca41b925284a.slice - libcontainer container kubepods-besteffort-pod000c58a2_cbe6_4cca_a252_ca41b925284a.slice. Dec 16 13:17:04.636620 systemd[1]: Created slice kubepods-besteffort-pod1b0d8679_fb83_49d2_afd3_fb2459b560f9.slice - libcontainer container kubepods-besteffort-pod1b0d8679_fb83_49d2_afd3_fb2459b560f9.slice. Dec 16 13:17:04.645405 kubelet[3038]: I1216 13:17:04.645334 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/000c58a2-cbe6-4cca-a252-ca41b925284a-lib-modules\") pod \"kube-proxy-764fd\" (UID: \"000c58a2-cbe6-4cca-a252-ca41b925284a\") " pod="kube-system/kube-proxy-764fd" Dec 16 13:17:04.645405 kubelet[3038]: I1216 13:17:04.645380 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/000c58a2-cbe6-4cca-a252-ca41b925284a-kube-proxy\") pod \"kube-proxy-764fd\" (UID: \"000c58a2-cbe6-4cca-a252-ca41b925284a\") " pod="kube-system/kube-proxy-764fd" Dec 16 13:17:04.645405 kubelet[3038]: I1216 13:17:04.645400 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/000c58a2-cbe6-4cca-a252-ca41b925284a-xtables-lock\") pod \"kube-proxy-764fd\" (UID: \"000c58a2-cbe6-4cca-a252-ca41b925284a\") " pod="kube-system/kube-proxy-764fd" Dec 16 13:17:04.645902 kubelet[3038]: I1216 13:17:04.645426 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1b0d8679-fb83-49d2-afd3-fb2459b560f9-var-lib-calico\") pod \"tigera-operator-7dcd859c48-c67th\" (UID: \"1b0d8679-fb83-49d2-afd3-fb2459b560f9\") " pod="tigera-operator/tigera-operator-7dcd859c48-c67th" Dec 16 13:17:04.645902 kubelet[3038]: I1216 13:17:04.645471 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2vk7\" (UniqueName: \"kubernetes.io/projected/1b0d8679-fb83-49d2-afd3-fb2459b560f9-kube-api-access-n2vk7\") pod \"tigera-operator-7dcd859c48-c67th\" (UID: \"1b0d8679-fb83-49d2-afd3-fb2459b560f9\") " pod="tigera-operator/tigera-operator-7dcd859c48-c67th" Dec 16 13:17:04.645902 kubelet[3038]: I1216 13:17:04.645527 3038 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrm9z\" (UniqueName: \"kubernetes.io/projected/000c58a2-cbe6-4cca-a252-ca41b925284a-kube-api-access-hrm9z\") pod \"kube-proxy-764fd\" (UID: \"000c58a2-cbe6-4cca-a252-ca41b925284a\") " pod="kube-system/kube-proxy-764fd" Dec 16 13:17:04.908770 containerd[1785]: time="2025-12-16T13:17:04.908578864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-764fd,Uid:000c58a2-cbe6-4cca-a252-ca41b925284a,Namespace:kube-system,Attempt:0,}" Dec 16 13:17:04.941099 containerd[1785]: time="2025-12-16T13:17:04.941022202Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-c67th,Uid:1b0d8679-fb83-49d2-afd3-fb2459b560f9,Namespace:tigera-operator,Attempt:0,}" Dec 16 13:17:04.942219 containerd[1785]: time="2025-12-16T13:17:04.942151232Z" level=info msg="connecting to shim f60fc2307aa45105ff9ecb9824aaf08a7f98d6cb0ec9d4ed5733fd676a09f81e" address="unix:///run/containerd/s/58d902626496f0e29232c249d9102366bcf328c87af535c81979e00bdfa68a4d" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:17:04.970071 containerd[1785]: time="2025-12-16T13:17:04.970007814Z" level=info msg="connecting to shim 8754c5370505f7dd0617808a2ed70af3ae81b288a984f36a1753ca67bf36e14d" address="unix:///run/containerd/s/1e76db58d781fd4bd48e647f9e35c5b7d1c80957877de751470468fec46ac6e9" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:17:04.984749 systemd[1]: Started cri-containerd-f60fc2307aa45105ff9ecb9824aaf08a7f98d6cb0ec9d4ed5733fd676a09f81e.scope - libcontainer container f60fc2307aa45105ff9ecb9824aaf08a7f98d6cb0ec9d4ed5733fd676a09f81e. Dec 16 13:17:04.993508 systemd[1]: Started cri-containerd-8754c5370505f7dd0617808a2ed70af3ae81b288a984f36a1753ca67bf36e14d.scope - libcontainer container 8754c5370505f7dd0617808a2ed70af3ae81b288a984f36a1753ca67bf36e14d. 
Dec 16 13:17:05.026147 containerd[1785]: time="2025-12-16T13:17:05.026067638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-764fd,Uid:000c58a2-cbe6-4cca-a252-ca41b925284a,Namespace:kube-system,Attempt:0,} returns sandbox id \"f60fc2307aa45105ff9ecb9824aaf08a7f98d6cb0ec9d4ed5733fd676a09f81e\"" Dec 16 13:17:05.028849 containerd[1785]: time="2025-12-16T13:17:05.028810383Z" level=info msg="CreateContainer within sandbox \"f60fc2307aa45105ff9ecb9824aaf08a7f98d6cb0ec9d4ed5733fd676a09f81e\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 16 13:17:05.049005 containerd[1785]: time="2025-12-16T13:17:05.048959899Z" level=info msg="Container abfb63257724c378aa79798cf1dac71389dc50e78b2eb16a7e92c23026ee19ae: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:17:05.061945 containerd[1785]: time="2025-12-16T13:17:05.061807432Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-c67th,Uid:1b0d8679-fb83-49d2-afd3-fb2459b560f9,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"8754c5370505f7dd0617808a2ed70af3ae81b288a984f36a1753ca67bf36e14d\"" Dec 16 13:17:05.062913 containerd[1785]: time="2025-12-16T13:17:05.062866793Z" level=info msg="CreateContainer within sandbox \"f60fc2307aa45105ff9ecb9824aaf08a7f98d6cb0ec9d4ed5733fd676a09f81e\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"abfb63257724c378aa79798cf1dac71389dc50e78b2eb16a7e92c23026ee19ae\"" Dec 16 13:17:05.063128 containerd[1785]: time="2025-12-16T13:17:05.063097021Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 16 13:17:05.064500 containerd[1785]: time="2025-12-16T13:17:05.063334393Z" level=info msg="StartContainer for \"abfb63257724c378aa79798cf1dac71389dc50e78b2eb16a7e92c23026ee19ae\"" Dec 16 13:17:05.064706 containerd[1785]: time="2025-12-16T13:17:05.064668508Z" level=info msg="connecting to shim abfb63257724c378aa79798cf1dac71389dc50e78b2eb16a7e92c23026ee19ae" address="unix:///run/containerd/s/58d902626496f0e29232c249d9102366bcf328c87af535c81979e00bdfa68a4d" protocol=ttrpc version=3 Dec 16 13:17:05.089692 systemd[1]: Started cri-containerd-abfb63257724c378aa79798cf1dac71389dc50e78b2eb16a7e92c23026ee19ae.scope - libcontainer container abfb63257724c378aa79798cf1dac71389dc50e78b2eb16a7e92c23026ee19ae. Dec 16 13:17:05.203313 containerd[1785]: time="2025-12-16T13:17:05.203124775Z" level=info msg="StartContainer for \"abfb63257724c378aa79798cf1dac71389dc50e78b2eb16a7e92c23026ee19ae\" returns successfully" Dec 16 13:17:06.982048 kubelet[3038]: I1216 13:17:06.981921 3038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-764fd" podStartSLOduration=2.981894627 podStartE2EDuration="2.981894627s" podCreationTimestamp="2025-12-16 13:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:17:06.184050849 +0000 UTC m=+7.135738056" watchObservedRunningTime="2025-12-16 13:17:06.981894627 +0000 UTC m=+7.933581709" Dec 16 13:17:07.096154 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1639103476.mount: Deactivated successfully. 
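[Annotation] The RunPodSandbox/CreateContainer/StartContainer messages above are containerd echoing CRI calls the kubelet made over its unix socket; note that the kube-proxy container task connects to the same shim address (unix:///run/containerd/s/58d902626496...) as its pod sandbox f60fc2307aa4..., i.e. one shim serves the whole pod. A minimal sketch of issuing the same RunPodSandbox call directly is below; the socket path, the insecure transport, and the nearly empty PodSandboxConfig are illustrative assumptions, and a real kubelet request populates far more of the config (hostname, log directory, linux options) than shown.

```go
// Sketch: send a CRI RunPodSandbox request mirroring the metadata logged above.
package main

import (
	"context"
	"fmt"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// containerd's CRI plugin listens on the main containerd socket (assumed path).
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	client := runtimeapi.NewRuntimeServiceClient(conn)
	// Metadata mirrors the log line: Name:kube-proxy-764fd,Uid:000c58a2-...,Namespace:kube-system,Attempt:0.
	resp, err := client.RunPodSandbox(context.Background(), &runtimeapi.RunPodSandboxRequest{
		Config: &runtimeapi.PodSandboxConfig{
			Metadata: &runtimeapi.PodSandboxMetadata{
				Name:      "kube-proxy-764fd",
				Uid:       "000c58a2-cbe6-4cca-a252-ca41b925284a",
				Namespace: "kube-system",
				Attempt:   0,
			},
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	// containerd logs the same id back, as in the "returns sandbox id" entry above.
	fmt.Println("sandbox id:", resp.PodSandboxId)
}
```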
Dec 16 13:17:07.843683 containerd[1785]: time="2025-12-16T13:17:07.843618309Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:17:07.844885 containerd[1785]: time="2025-12-16T13:17:07.844838922Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691" Dec 16 13:17:07.847192 containerd[1785]: time="2025-12-16T13:17:07.847141085Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:17:07.849348 containerd[1785]: time="2025-12-16T13:17:07.849318833Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:17:07.849850 containerd[1785]: time="2025-12-16T13:17:07.849819515Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.786698013s" Dec 16 13:17:07.849884 containerd[1785]: time="2025-12-16T13:17:07.849856070Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Dec 16 13:17:07.851531 containerd[1785]: time="2025-12-16T13:17:07.851496235Z" level=info msg="CreateContainer within sandbox \"8754c5370505f7dd0617808a2ed70af3ae81b288a984f36a1753ca67bf36e14d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 16 13:17:07.864704 containerd[1785]: time="2025-12-16T13:17:07.864654520Z" level=info msg="Container 1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:17:07.888740 containerd[1785]: time="2025-12-16T13:17:07.888546988Z" level=info msg="CreateContainer within sandbox \"8754c5370505f7dd0617808a2ed70af3ae81b288a984f36a1753ca67bf36e14d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b\"" Dec 16 13:17:07.889011 containerd[1785]: time="2025-12-16T13:17:07.888981687Z" level=info msg="StartContainer for \"1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b\"" Dec 16 13:17:07.889799 containerd[1785]: time="2025-12-16T13:17:07.889770549Z" level=info msg="connecting to shim 1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b" address="unix:///run/containerd/s/1e76db58d781fd4bd48e647f9e35c5b7d1c80957877de751470468fec46ac6e9" protocol=ttrpc version=3 Dec 16 13:17:07.919817 systemd[1]: Started cri-containerd-1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b.scope - libcontainer container 1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b. 
Dec 16 13:17:07.947574 containerd[1785]: time="2025-12-16T13:17:07.947530808Z" level=info msg="StartContainer for \"1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b\" returns successfully" Dec 16 13:17:08.177716 kubelet[3038]: I1216 13:17:08.177579 3038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-c67th" podStartSLOduration=1.389803044 podStartE2EDuration="4.177564164s" podCreationTimestamp="2025-12-16 13:17:04 +0000 UTC" firstStartedPulling="2025-12-16 13:17:05.062772284 +0000 UTC m=+6.014459346" lastFinishedPulling="2025-12-16 13:17:07.850533404 +0000 UTC m=+8.802220466" observedRunningTime="2025-12-16 13:17:08.177313774 +0000 UTC m=+9.129000900" watchObservedRunningTime="2025-12-16 13:17:08.177564164 +0000 UTC m=+9.129251246" Dec 16 13:17:12.971788 sudo[2055]: pam_unix(sudo:session): session closed for user root Dec 16 13:17:13.131387 sshd[2041]: Connection closed by 147.75.109.163 port 51352 Dec 16 13:17:13.131930 sshd-session[2031]: pam_unix(sshd:session): session closed for user core Dec 16 13:17:13.135206 systemd[1]: sshd@6-10.0.23.154:22-147.75.109.163:51352.service: Deactivated successfully. Dec 16 13:17:13.136982 systemd[1]: session-7.scope: Deactivated successfully. Dec 16 13:17:13.137143 systemd[1]: session-7.scope: Consumed 5.928s CPU time, 229.2M memory peak. Dec 16 13:17:13.138111 systemd-logind[1763]: Session 7 logged out. Waiting for processes to exit. Dec 16 13:17:13.139000 systemd-logind[1763]: Removed session 7. Dec 16 13:17:17.064645 systemd[1]: Created slice kubepods-besteffort-pode0bd4548_e268_4ac1_8e45_4111582c4dff.slice - libcontainer container kubepods-besteffort-pode0bd4548_e268_4ac1_8e45_4111582c4dff.slice. Dec 16 13:17:17.129357 kubelet[3038]: I1216 13:17:17.129136 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0bd4548-e268-4ac1-8e45-4111582c4dff-tigera-ca-bundle\") pod \"calico-typha-78c46fb8c7-jsr6h\" (UID: \"e0bd4548-e268-4ac1-8e45-4111582c4dff\") " pod="calico-system/calico-typha-78c46fb8c7-jsr6h" Dec 16 13:17:17.129357 kubelet[3038]: I1216 13:17:17.129219 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp9qc\" (UniqueName: \"kubernetes.io/projected/e0bd4548-e268-4ac1-8e45-4111582c4dff-kube-api-access-mp9qc\") pod \"calico-typha-78c46fb8c7-jsr6h\" (UID: \"e0bd4548-e268-4ac1-8e45-4111582c4dff\") " pod="calico-system/calico-typha-78c46fb8c7-jsr6h" Dec 16 13:17:17.129357 kubelet[3038]: I1216 13:17:17.129254 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/e0bd4548-e268-4ac1-8e45-4111582c4dff-typha-certs\") pod \"calico-typha-78c46fb8c7-jsr6h\" (UID: \"e0bd4548-e268-4ac1-8e45-4111582c4dff\") " pod="calico-system/calico-typha-78c46fb8c7-jsr6h" Dec 16 13:17:17.261181 systemd[1]: Created slice kubepods-besteffort-pod4ed9eed7_90de_4a32_a262_60466d684851.slice - libcontainer container kubepods-besteffort-pod4ed9eed7_90de_4a32_a262_60466d684851.slice. 
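[Annotation] The pod_startup_latency_tracker entries above are self-consistent: for the tigera-operator pod, podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration equals that E2E figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). The arithmetic below reproduces the logged values to the nanosecond; the subtraction is inferred from the numbers themselves rather than from kubelet source, so treat it as an observed relationship.

```go
// Reproduce podStartE2EDuration and podStartSLOduration from the journal timestamps.
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matches Go's default time.Time formatting used in the kubelet log.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-12-16 13:17:04 +0000 UTC")
	firstPull := parse("2025-12-16 13:17:05.062772284 +0000 UTC")
	lastPull := parse("2025-12-16 13:17:07.850533404 +0000 UTC")
	running := parse("2025-12-16 13:17:08.177564164 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration = E2E minus pull window
	fmt.Println("e2e:", e2e, "slo:", slo) // e2e: 4.177564164s slo: 1.389803044s
}
```

The kube-proxy entry earlier degenerates to SLO == E2E (both 2.981894627s) because its firstStartedPulling/lastFinishedPulling are the zero time, i.e. no pull was needed.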
Dec 16 13:17:17.329978 kubelet[3038]: I1216 13:17:17.329836 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4ed9eed7-90de-4a32-a262-60466d684851-var-lib-calico\") pod \"calico-node-7sz26\" (UID: \"4ed9eed7-90de-4a32-a262-60466d684851\") " pod="calico-system/calico-node-7sz26" Dec 16 13:17:17.329978 kubelet[3038]: I1216 13:17:17.329884 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/4ed9eed7-90de-4a32-a262-60466d684851-policysync\") pod \"calico-node-7sz26\" (UID: \"4ed9eed7-90de-4a32-a262-60466d684851\") " pod="calico-system/calico-node-7sz26" Dec 16 13:17:17.329978 kubelet[3038]: I1216 13:17:17.329903 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ed9eed7-90de-4a32-a262-60466d684851-tigera-ca-bundle\") pod \"calico-node-7sz26\" (UID: \"4ed9eed7-90de-4a32-a262-60466d684851\") " pod="calico-system/calico-node-7sz26" Dec 16 13:17:17.329978 kubelet[3038]: I1216 13:17:17.329932 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/4ed9eed7-90de-4a32-a262-60466d684851-cni-bin-dir\") pod \"calico-node-7sz26\" (UID: \"4ed9eed7-90de-4a32-a262-60466d684851\") " pod="calico-system/calico-node-7sz26" Dec 16 13:17:17.329978 kubelet[3038]: I1216 13:17:17.329950 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/4ed9eed7-90de-4a32-a262-60466d684851-cni-log-dir\") pod \"calico-node-7sz26\" (UID: \"4ed9eed7-90de-4a32-a262-60466d684851\") " pod="calico-system/calico-node-7sz26" Dec 16 13:17:17.330295 kubelet[3038]: I1216 13:17:17.329998 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mswg\" (UniqueName: \"kubernetes.io/projected/4ed9eed7-90de-4a32-a262-60466d684851-kube-api-access-4mswg\") pod \"calico-node-7sz26\" (UID: \"4ed9eed7-90de-4a32-a262-60466d684851\") " pod="calico-system/calico-node-7sz26" Dec 16 13:17:17.330295 kubelet[3038]: I1216 13:17:17.330088 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/4ed9eed7-90de-4a32-a262-60466d684851-node-certs\") pod \"calico-node-7sz26\" (UID: \"4ed9eed7-90de-4a32-a262-60466d684851\") " pod="calico-system/calico-node-7sz26" Dec 16 13:17:17.330295 kubelet[3038]: I1216 13:17:17.330146 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/4ed9eed7-90de-4a32-a262-60466d684851-var-run-calico\") pod \"calico-node-7sz26\" (UID: \"4ed9eed7-90de-4a32-a262-60466d684851\") " pod="calico-system/calico-node-7sz26" Dec 16 13:17:17.330295 kubelet[3038]: I1216 13:17:17.330187 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4ed9eed7-90de-4a32-a262-60466d684851-xtables-lock\") pod \"calico-node-7sz26\" (UID: \"4ed9eed7-90de-4a32-a262-60466d684851\") " pod="calico-system/calico-node-7sz26" Dec 16 13:17:17.330295 kubelet[3038]: I1216 13:17:17.330215 3038 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/4ed9eed7-90de-4a32-a262-60466d684851-cni-net-dir\") pod \"calico-node-7sz26\" (UID: \"4ed9eed7-90de-4a32-a262-60466d684851\") " pod="calico-system/calico-node-7sz26" Dec 16 13:17:17.330501 kubelet[3038]: I1216 13:17:17.330244 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ed9eed7-90de-4a32-a262-60466d684851-lib-modules\") pod \"calico-node-7sz26\" (UID: \"4ed9eed7-90de-4a32-a262-60466d684851\") " pod="calico-system/calico-node-7sz26" Dec 16 13:17:17.330501 kubelet[3038]: I1216 13:17:17.330312 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/4ed9eed7-90de-4a32-a262-60466d684851-flexvol-driver-host\") pod \"calico-node-7sz26\" (UID: \"4ed9eed7-90de-4a32-a262-60466d684851\") " pod="calico-system/calico-node-7sz26" Dec 16 13:17:17.369906 containerd[1785]: time="2025-12-16T13:17:17.369821632Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-78c46fb8c7-jsr6h,Uid:e0bd4548-e268-4ac1-8e45-4111582c4dff,Namespace:calico-system,Attempt:0,}" Dec 16 13:17:17.400890 containerd[1785]: time="2025-12-16T13:17:17.400273670Z" level=info msg="connecting to shim c5b90cf4d934fe4610e6d99f02c094c1e09a716a1846fbbd41b8fee631b1f190" address="unix:///run/containerd/s/a3026e1279d54a29bdb477becb6511b8f9de8d8d3f45b956e3c875d8f5005f96" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:17:17.432971 kubelet[3038]: E1216 13:17:17.432654 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.433227 kubelet[3038]: W1216 13:17:17.433197 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.433347 kubelet[3038]: E1216 13:17:17.433334 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.433824 kubelet[3038]: E1216 13:17:17.433664 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.433824 kubelet[3038]: W1216 13:17:17.433684 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.433824 kubelet[3038]: E1216 13:17:17.433707 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:17:17.433955 kubelet[3038]: E1216 13:17:17.433879 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.433955 kubelet[3038]: W1216 13:17:17.433888 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.433955 kubelet[3038]: E1216 13:17:17.433897 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.434482 kubelet[3038]: E1216 13:17:17.434097 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.434482 kubelet[3038]: W1216 13:17:17.434108 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.434482 kubelet[3038]: E1216 13:17:17.434301 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.435855 kubelet[3038]: E1216 13:17:17.435629 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.435855 kubelet[3038]: W1216 13:17:17.435644 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.435855 kubelet[3038]: E1216 13:17:17.435756 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.440075 kubelet[3038]: E1216 13:17:17.436257 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.440075 kubelet[3038]: W1216 13:17:17.436268 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.440075 kubelet[3038]: E1216 13:17:17.436319 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.440075 kubelet[3038]: E1216 13:17:17.436636 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.440075 kubelet[3038]: W1216 13:17:17.436645 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.440075 kubelet[3038]: E1216 13:17:17.436746 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:17:17.440075 kubelet[3038]: E1216 13:17:17.436982 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.440075 kubelet[3038]: W1216 13:17:17.436992 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.440075 kubelet[3038]: E1216 13:17:17.437021 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.440075 kubelet[3038]: E1216 13:17:17.437526 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.440357 kubelet[3038]: W1216 13:17:17.437536 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.440357 kubelet[3038]: E1216 13:17:17.437549 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.440357 kubelet[3038]: E1216 13:17:17.437768 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.440357 kubelet[3038]: W1216 13:17:17.437777 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.440357 kubelet[3038]: E1216 13:17:17.437785 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.440357 kubelet[3038]: E1216 13:17:17.438035 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.440357 kubelet[3038]: W1216 13:17:17.438043 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.440357 kubelet[3038]: E1216 13:17:17.438052 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.441177 kubelet[3038]: E1216 13:17:17.441116 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.441177 kubelet[3038]: W1216 13:17:17.441132 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.441177 kubelet[3038]: E1216 13:17:17.441146 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:17:17.443866 kubelet[3038]: E1216 13:17:17.443819 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:17:17.450376 kubelet[3038]: E1216 13:17:17.450348 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.450376 kubelet[3038]: W1216 13:17:17.450378 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.450534 kubelet[3038]: E1216 13:17:17.450397 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.459844 systemd[1]: Started cri-containerd-c5b90cf4d934fe4610e6d99f02c094c1e09a716a1846fbbd41b8fee631b1f190.scope - libcontainer container c5b90cf4d934fe4610e6d99f02c094c1e09a716a1846fbbd41b8fee631b1f190. Dec 16 13:17:17.518237 containerd[1785]: time="2025-12-16T13:17:17.518185623Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-78c46fb8c7-jsr6h,Uid:e0bd4548-e268-4ac1-8e45-4111582c4dff,Namespace:calico-system,Attempt:0,} returns sandbox id \"c5b90cf4d934fe4610e6d99f02c094c1e09a716a1846fbbd41b8fee631b1f190\"" Dec 16 13:17:17.520812 kubelet[3038]: E1216 13:17:17.520741 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.520812 kubelet[3038]: W1216 13:17:17.520765 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.520812 kubelet[3038]: E1216 13:17:17.520801 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.521155 containerd[1785]: time="2025-12-16T13:17:17.520741053Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 16 13:17:17.521221 kubelet[3038]: E1216 13:17:17.520945 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.521221 kubelet[3038]: W1216 13:17:17.520951 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.521221 kubelet[3038]: E1216 13:17:17.520958 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:17:17.521221 kubelet[3038]: E1216 13:17:17.521130 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.521221 kubelet[3038]: W1216 13:17:17.521136 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.521221 kubelet[3038]: E1216 13:17:17.521143 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.521495 kubelet[3038]: E1216 13:17:17.521320 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.521495 kubelet[3038]: W1216 13:17:17.521326 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.521495 kubelet[3038]: E1216 13:17:17.521344 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.521495 kubelet[3038]: E1216 13:17:17.521490 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.521495 kubelet[3038]: W1216 13:17:17.521496 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.521713 kubelet[3038]: E1216 13:17:17.521503 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.521713 kubelet[3038]: E1216 13:17:17.521624 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.521713 kubelet[3038]: W1216 13:17:17.521630 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.521713 kubelet[3038]: E1216 13:17:17.521636 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.521936 kubelet[3038]: E1216 13:17:17.521755 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.521936 kubelet[3038]: W1216 13:17:17.521760 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.521936 kubelet[3038]: E1216 13:17:17.521766 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:17:17.521936 kubelet[3038]: E1216 13:17:17.521914 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.521936 kubelet[3038]: W1216 13:17:17.521919 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.521936 kubelet[3038]: E1216 13:17:17.521931 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.522184 kubelet[3038]: E1216 13:17:17.522060 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.522184 kubelet[3038]: W1216 13:17:17.522065 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.522184 kubelet[3038]: E1216 13:17:17.522071 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.522346 kubelet[3038]: E1216 13:17:17.522332 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.522346 kubelet[3038]: W1216 13:17:17.522341 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.522426 kubelet[3038]: E1216 13:17:17.522347 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.522505 kubelet[3038]: E1216 13:17:17.522491 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.522505 kubelet[3038]: W1216 13:17:17.522500 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.522597 kubelet[3038]: E1216 13:17:17.522506 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.522635 kubelet[3038]: E1216 13:17:17.522622 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.522635 kubelet[3038]: W1216 13:17:17.522627 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.522635 kubelet[3038]: E1216 13:17:17.522633 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:17:17.522767 kubelet[3038]: E1216 13:17:17.522758 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.522767 kubelet[3038]: W1216 13:17:17.522763 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.522856 kubelet[3038]: E1216 13:17:17.522769 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.522912 kubelet[3038]: E1216 13:17:17.522887 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.522912 kubelet[3038]: W1216 13:17:17.522892 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.522912 kubelet[3038]: E1216 13:17:17.522897 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.523023 kubelet[3038]: E1216 13:17:17.523018 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.523023 kubelet[3038]: W1216 13:17:17.523023 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.523110 kubelet[3038]: E1216 13:17:17.523029 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.523160 kubelet[3038]: E1216 13:17:17.523143 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.523160 kubelet[3038]: W1216 13:17:17.523148 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.523160 kubelet[3038]: E1216 13:17:17.523153 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.523291 kubelet[3038]: E1216 13:17:17.523280 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.523291 kubelet[3038]: W1216 13:17:17.523288 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.523412 kubelet[3038]: E1216 13:17:17.523294 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:17:17.523412 kubelet[3038]: E1216 13:17:17.523407 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.523412 kubelet[3038]: W1216 13:17:17.523412 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.523519 kubelet[3038]: E1216 13:17:17.523417 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.523553 kubelet[3038]: E1216 13:17:17.523548 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.523589 kubelet[3038]: W1216 13:17:17.523553 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.523589 kubelet[3038]: E1216 13:17:17.523559 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.523697 kubelet[3038]: E1216 13:17:17.523685 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.523697 kubelet[3038]: W1216 13:17:17.523693 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.523773 kubelet[3038]: E1216 13:17:17.523699 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.532224 kubelet[3038]: E1216 13:17:17.532167 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.532224 kubelet[3038]: W1216 13:17:17.532196 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.532224 kubelet[3038]: E1216 13:17:17.532220 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:17:17.532385 kubelet[3038]: I1216 13:17:17.532252 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2c2ad34d-d82b-4624-87cd-76ece6a8970b-kubelet-dir\") pod \"csi-node-driver-4vgpv\" (UID: \"2c2ad34d-d82b-4624-87cd-76ece6a8970b\") " pod="calico-system/csi-node-driver-4vgpv" Dec 16 13:17:17.532465 kubelet[3038]: E1216 13:17:17.532436 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.532503 kubelet[3038]: W1216 13:17:17.532492 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.532526 kubelet[3038]: E1216 13:17:17.532511 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.532547 kubelet[3038]: I1216 13:17:17.532530 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/2c2ad34d-d82b-4624-87cd-76ece6a8970b-varrun\") pod \"csi-node-driver-4vgpv\" (UID: \"2c2ad34d-d82b-4624-87cd-76ece6a8970b\") " pod="calico-system/csi-node-driver-4vgpv" Dec 16 13:17:17.532731 kubelet[3038]: E1216 13:17:17.532715 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.532731 kubelet[3038]: W1216 13:17:17.532729 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.532777 kubelet[3038]: E1216 13:17:17.532743 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.532777 kubelet[3038]: I1216 13:17:17.532761 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zgzl\" (UniqueName: \"kubernetes.io/projected/2c2ad34d-d82b-4624-87cd-76ece6a8970b-kube-api-access-9zgzl\") pod \"csi-node-driver-4vgpv\" (UID: \"2c2ad34d-d82b-4624-87cd-76ece6a8970b\") " pod="calico-system/csi-node-driver-4vgpv" Dec 16 13:17:17.532932 kubelet[3038]: E1216 13:17:17.532917 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.532932 kubelet[3038]: W1216 13:17:17.532929 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.532977 kubelet[3038]: E1216 13:17:17.532943 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:17:17.532977 kubelet[3038]: I1216 13:17:17.532964 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2c2ad34d-d82b-4624-87cd-76ece6a8970b-registration-dir\") pod \"csi-node-driver-4vgpv\" (UID: \"2c2ad34d-d82b-4624-87cd-76ece6a8970b\") " pod="calico-system/csi-node-driver-4vgpv" Dec 16 13:17:17.533340 kubelet[3038]: E1216 13:17:17.533313 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.533386 kubelet[3038]: W1216 13:17:17.533345 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.533424 kubelet[3038]: E1216 13:17:17.533407 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.533720 kubelet[3038]: E1216 13:17:17.533695 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.533750 kubelet[3038]: W1216 13:17:17.533723 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.533750 kubelet[3038]: E1216 13:17:17.533745 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.534068 kubelet[3038]: E1216 13:17:17.534054 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.534100 kubelet[3038]: W1216 13:17:17.534091 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.534125 kubelet[3038]: E1216 13:17:17.534111 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.534420 kubelet[3038]: E1216 13:17:17.534406 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.534470 kubelet[3038]: W1216 13:17:17.534423 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.534544 kubelet[3038]: E1216 13:17:17.534530 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:17:17.534756 kubelet[3038]: E1216 13:17:17.534735 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.534785 kubelet[3038]: W1216 13:17:17.534753 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.534822 kubelet[3038]: E1216 13:17:17.534807 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.535079 kubelet[3038]: E1216 13:17:17.535066 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.535110 kubelet[3038]: W1216 13:17:17.535083 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.535166 kubelet[3038]: E1216 13:17:17.535134 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.535166 kubelet[3038]: I1216 13:17:17.535156 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2c2ad34d-d82b-4624-87cd-76ece6a8970b-socket-dir\") pod \"csi-node-driver-4vgpv\" (UID: \"2c2ad34d-d82b-4624-87cd-76ece6a8970b\") " pod="calico-system/csi-node-driver-4vgpv" Dec 16 13:17:17.535379 kubelet[3038]: E1216 13:17:17.535361 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.535408 kubelet[3038]: W1216 13:17:17.535377 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.535476 kubelet[3038]: E1216 13:17:17.535464 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.535672 kubelet[3038]: E1216 13:17:17.535659 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.535693 kubelet[3038]: W1216 13:17:17.535675 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.535693 kubelet[3038]: E1216 13:17:17.535688 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:17:17.535995 kubelet[3038]: E1216 13:17:17.535981 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.536021 kubelet[3038]: W1216 13:17:17.535996 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.536021 kubelet[3038]: E1216 13:17:17.536008 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.536313 kubelet[3038]: E1216 13:17:17.536302 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.536344 kubelet[3038]: W1216 13:17:17.536314 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.536344 kubelet[3038]: E1216 13:17:17.536324 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.536512 kubelet[3038]: E1216 13:17:17.536502 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.536535 kubelet[3038]: W1216 13:17:17.536513 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.536535 kubelet[3038]: E1216 13:17:17.536523 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:17.565293 containerd[1785]: time="2025-12-16T13:17:17.565240030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-7sz26,Uid:4ed9eed7-90de-4a32-a262-60466d684851,Namespace:calico-system,Attempt:0,}" Dec 16 13:17:17.596124 containerd[1785]: time="2025-12-16T13:17:17.595989789Z" level=info msg="connecting to shim b3d6a73f7d6057dd9b10e8911e1df9823eff8f3aea25eaa820db1b40e9d5f187" address="unix:///run/containerd/s/84d815db7ca429ed09e54e5657895524671cd67e067043e879559a5ef3725011" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:17:17.625677 systemd[1]: Started cri-containerd-b3d6a73f7d6057dd9b10e8911e1df9823eff8f3aea25eaa820db1b40e9d5f187.scope - libcontainer container b3d6a73f7d6057dd9b10e8911e1df9823eff8f3aea25eaa820db1b40e9d5f187. Dec 16 13:17:17.636923 kubelet[3038]: E1216 13:17:17.636889 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:17.636923 kubelet[3038]: W1216 13:17:17.636911 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:17.637111 kubelet[3038]: E1216 13:17:17.636933 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Dec 16 13:17:17.658176 containerd[1785]: time="2025-12-16T13:17:17.658124414Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-7sz26,Uid:4ed9eed7-90de-4a32-a262-60466d684851,Namespace:calico-system,Attempt:0,} returns sandbox id \"b3d6a73f7d6057dd9b10e8911e1df9823eff8f3aea25eaa820db1b40e9d5f187\""
Dec 16 13:17:19.056632 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2618181233.mount: Deactivated successfully.
Dec 16 13:17:19.128634 kubelet[3038]: E1216 13:17:19.128585 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b"
Dec 16 13:17:19.869640 containerd[1785]: time="2025-12-16T13:17:19.869579486Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 13:17:19.871427 containerd[1785]: time="2025-12-16T13:17:19.871403490Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628"
Dec 16 13:17:19.873010 containerd[1785]: time="2025-12-16T13:17:19.872970031Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 13:17:19.875685 containerd[1785]: time="2025-12-16T13:17:19.875642456Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 13:17:19.876162 containerd[1785]: time="2025-12-16T13:17:19.876107920Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.355320736s"
Dec 16 13:17:19.876162 containerd[1785]: time="2025-12-16T13:17:19.876160235Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\""
Dec 16 13:17:19.877163 containerd[1785]: time="2025-12-16T13:17:19.877135839Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\""
Dec 16 13:17:19.884111 containerd[1785]: time="2025-12-16T13:17:19.884082022Z" level=info msg="CreateContainer within sandbox \"c5b90cf4d934fe4610e6d99f02c094c1e09a716a1846fbbd41b8fee631b1f190\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Dec 16 13:17:19.900076 containerd[1785]: time="2025-12-16T13:17:19.900024191Z" level=info msg="Container 4f0c81aae5822039ed03791ac783688243090420417504bf26b570a4d0e6e0c9: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:17:19.909283 containerd[1785]: time="2025-12-16T13:17:19.909243529Z" level=info msg="CreateContainer within sandbox \"c5b90cf4d934fe4610e6d99f02c094c1e09a716a1846fbbd41b8fee631b1f190\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"4f0c81aae5822039ed03791ac783688243090420417504bf26b570a4d0e6e0c9\""
Dec 16 13:17:19.909715 containerd[1785]: time="2025-12-16T13:17:19.909700087Z" level=info msg="StartContainer for \"4f0c81aae5822039ed03791ac783688243090420417504bf26b570a4d0e6e0c9\""
Dec 16 13:17:19.910547 containerd[1785]: time="2025-12-16T13:17:19.910521053Z" level=info msg="connecting to shim 4f0c81aae5822039ed03791ac783688243090420417504bf26b570a4d0e6e0c9" address="unix:///run/containerd/s/a3026e1279d54a29bdb477becb6511b8f9de8d8d3f45b956e3c875d8f5005f96" protocol=ttrpc version=3
Dec 16 13:17:19.931630 systemd[1]: Started cri-containerd-4f0c81aae5822039ed03791ac783688243090420417504bf26b570a4d0e6e0c9.scope - libcontainer container 4f0c81aae5822039ed03791ac783688243090420417504bf26b570a4d0e6e0c9.
Dec 16 13:17:19.978474 containerd[1785]: time="2025-12-16T13:17:19.978027801Z" level=info msg="StartContainer for \"4f0c81aae5822039ed03791ac783688243090420417504bf26b570a4d0e6e0c9\" returns successfully"
Dec 16 13:17:20.202241 kubelet[3038]: I1216 13:17:20.202097 3038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-78c46fb8c7-jsr6h" podStartSLOduration=0.844648124 podStartE2EDuration="3.202077356s" podCreationTimestamp="2025-12-16 13:17:17 +0000 UTC" firstStartedPulling="2025-12-16 13:17:17.519557126 +0000 UTC m=+18.471244219" lastFinishedPulling="2025-12-16 13:17:19.876986389 +0000 UTC m=+20.828673451" observedRunningTime="2025-12-16 13:17:20.201818993 +0000 UTC m=+21.153506101" watchObservedRunningTime="2025-12-16 13:17:20.202077356 +0000 UTC m=+21.153764466"
Dec 16 13:17:20.241382 kubelet[3038]: E1216 13:17:20.241329 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 13:17:20.241382 kubelet[3038]: W1216 13:17:20.241363 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 13:17:20.241382 kubelet[3038]: E1216 13:17:20.241393 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 13:17:21.128665 kubelet[3038]: E1216 13:17:21.128605 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b"
Dec 16 13:17:21.194814 kubelet[3038]: I1216 13:17:21.194773 3038 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 16 13:17:21.252223 kubelet[3038]: E1216 13:17:21.252176 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 13:17:21.252223 kubelet[3038]: W1216 13:17:21.252200 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 13:17:21.252223 kubelet[3038]: E1216 13:17:21.252220 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:17:21.267931 kubelet[3038]: E1216 13:17:21.267917 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:21.267931 kubelet[3038]: W1216 13:17:21.267926 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:21.267989 kubelet[3038]: E1216 13:17:21.267942 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:21.268132 kubelet[3038]: E1216 13:17:21.268119 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:21.268132 kubelet[3038]: W1216 13:17:21.268127 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:21.268195 kubelet[3038]: E1216 13:17:21.268138 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:21.268456 kubelet[3038]: E1216 13:17:21.268424 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:21.268456 kubelet[3038]: W1216 13:17:21.268443 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:21.268516 kubelet[3038]: E1216 13:17:21.268473 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:17:21.268672 kubelet[3038]: E1216 13:17:21.268649 3038 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:17:21.268672 kubelet[3038]: W1216 13:17:21.268663 3038 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:17:21.268734 kubelet[3038]: E1216 13:17:21.268674 3038 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:17:21.415867 containerd[1785]: time="2025-12-16T13:17:21.415729219Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:17:21.416958 containerd[1785]: time="2025-12-16T13:17:21.416913682Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754" Dec 16 13:17:21.418537 containerd[1785]: time="2025-12-16T13:17:21.418494993Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:17:21.423209 containerd[1785]: time="2025-12-16T13:17:21.423164194Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:17:21.423724 containerd[1785]: time="2025-12-16T13:17:21.423533889Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.546364869s" Dec 16 13:17:21.423724 containerd[1785]: time="2025-12-16T13:17:21.423570250Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Dec 16 13:17:21.425125 containerd[1785]: time="2025-12-16T13:17:21.425102318Z" level=info msg="CreateContainer within sandbox \"b3d6a73f7d6057dd9b10e8911e1df9823eff8f3aea25eaa820db1b40e9d5f187\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 13:17:21.435469 containerd[1785]: time="2025-12-16T13:17:21.435371799Z" level=info msg="Container cada80d26a9e02f96fba798a567f29897131270a83948a8c04c272ce6652e903: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:17:21.446353 containerd[1785]: time="2025-12-16T13:17:21.446294287Z" level=info msg="CreateContainer within sandbox \"b3d6a73f7d6057dd9b10e8911e1df9823eff8f3aea25eaa820db1b40e9d5f187\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"cada80d26a9e02f96fba798a567f29897131270a83948a8c04c272ce6652e903\"" Dec 16 13:17:21.447056 containerd[1785]: time="2025-12-16T13:17:21.446806144Z" level=info msg="StartContainer for \"cada80d26a9e02f96fba798a567f29897131270a83948a8c04c272ce6652e903\"" Dec 16 13:17:21.448124 containerd[1785]: time="2025-12-16T13:17:21.448096919Z" level=info msg="connecting to shim cada80d26a9e02f96fba798a567f29897131270a83948a8c04c272ce6652e903" address="unix:///run/containerd/s/84d815db7ca429ed09e54e5657895524671cd67e067043e879559a5ef3725011" protocol=ttrpc version=3 Dec 16 13:17:21.474653 systemd[1]: Started cri-containerd-cada80d26a9e02f96fba798a567f29897131270a83948a8c04c272ce6652e903.scope - libcontainer container cada80d26a9e02f96fba798a567f29897131270a83948a8c04c272ce6652e903. 
Dec 16 13:17:21.556179 containerd[1785]: time="2025-12-16T13:17:21.556057049Z" level=info msg="StartContainer for \"cada80d26a9e02f96fba798a567f29897131270a83948a8c04c272ce6652e903\" returns successfully" Dec 16 13:17:21.562809 systemd[1]: cri-containerd-cada80d26a9e02f96fba798a567f29897131270a83948a8c04c272ce6652e903.scope: Deactivated successfully. Dec 16 13:17:21.564376 containerd[1785]: time="2025-12-16T13:17:21.564343432Z" level=info msg="received container exit event container_id:\"cada80d26a9e02f96fba798a567f29897131270a83948a8c04c272ce6652e903\" id:\"cada80d26a9e02f96fba798a567f29897131270a83948a8c04c272ce6652e903\" pid:3800 exited_at:{seconds:1765891041 nanos:564034621}" Dec 16 13:17:21.585935 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cada80d26a9e02f96fba798a567f29897131270a83948a8c04c272ce6652e903-rootfs.mount: Deactivated successfully. Dec 16 13:17:22.200308 containerd[1785]: time="2025-12-16T13:17:22.200245182Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 16 13:17:23.128289 kubelet[3038]: E1216 13:17:23.128178 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:17:25.129028 kubelet[3038]: E1216 13:17:25.128950 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:17:25.816287 containerd[1785]: time="2025-12-16T13:17:25.816208804Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:17:25.818482 containerd[1785]: time="2025-12-16T13:17:25.818417359Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859" Dec 16 13:17:25.820336 containerd[1785]: time="2025-12-16T13:17:25.820291184Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:17:25.822956 containerd[1785]: time="2025-12-16T13:17:25.822889018Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:17:25.823411 containerd[1785]: time="2025-12-16T13:17:25.823369805Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 3.623079267s" Dec 16 13:17:25.823530 containerd[1785]: time="2025-12-16T13:17:25.823402282Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Dec 16 13:17:25.825463 containerd[1785]: time="2025-12-16T13:17:25.825416312Z" level=info msg="CreateContainer within sandbox 
\"b3d6a73f7d6057dd9b10e8911e1df9823eff8f3aea25eaa820db1b40e9d5f187\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 16 13:17:25.837589 containerd[1785]: time="2025-12-16T13:17:25.837528165Z" level=info msg="Container 5657a3030ab2e65028858a9f0e30a81fac7c8786c74e2f3f64bc1e6892f83ec8: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:17:25.849412 containerd[1785]: time="2025-12-16T13:17:25.849348112Z" level=info msg="CreateContainer within sandbox \"b3d6a73f7d6057dd9b10e8911e1df9823eff8f3aea25eaa820db1b40e9d5f187\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"5657a3030ab2e65028858a9f0e30a81fac7c8786c74e2f3f64bc1e6892f83ec8\"" Dec 16 13:17:25.850004 containerd[1785]: time="2025-12-16T13:17:25.849953641Z" level=info msg="StartContainer for \"5657a3030ab2e65028858a9f0e30a81fac7c8786c74e2f3f64bc1e6892f83ec8\"" Dec 16 13:17:25.851637 containerd[1785]: time="2025-12-16T13:17:25.851574680Z" level=info msg="connecting to shim 5657a3030ab2e65028858a9f0e30a81fac7c8786c74e2f3f64bc1e6892f83ec8" address="unix:///run/containerd/s/84d815db7ca429ed09e54e5657895524671cd67e067043e879559a5ef3725011" protocol=ttrpc version=3 Dec 16 13:17:25.882716 systemd[1]: Started cri-containerd-5657a3030ab2e65028858a9f0e30a81fac7c8786c74e2f3f64bc1e6892f83ec8.scope - libcontainer container 5657a3030ab2e65028858a9f0e30a81fac7c8786c74e2f3f64bc1e6892f83ec8. Dec 16 13:17:25.982857 containerd[1785]: time="2025-12-16T13:17:25.982740080Z" level=info msg="StartContainer for \"5657a3030ab2e65028858a9f0e30a81fac7c8786c74e2f3f64bc1e6892f83ec8\" returns successfully" Dec 16 13:17:26.402808 containerd[1785]: time="2025-12-16T13:17:26.402744528Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 13:17:26.404673 systemd[1]: cri-containerd-5657a3030ab2e65028858a9f0e30a81fac7c8786c74e2f3f64bc1e6892f83ec8.scope: Deactivated successfully. Dec 16 13:17:26.405090 systemd[1]: cri-containerd-5657a3030ab2e65028858a9f0e30a81fac7c8786c74e2f3f64bc1e6892f83ec8.scope: Consumed 586ms CPU time, 191.8M memory peak, 171.3M written to disk. Dec 16 13:17:26.405804 containerd[1785]: time="2025-12-16T13:17:26.405772193Z" level=info msg="received container exit event container_id:\"5657a3030ab2e65028858a9f0e30a81fac7c8786c74e2f3f64bc1e6892f83ec8\" id:\"5657a3030ab2e65028858a9f0e30a81fac7c8786c74e2f3f64bc1e6892f83ec8\" pid:3862 exited_at:{seconds:1765891046 nanos:405544155}" Dec 16 13:17:26.427155 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5657a3030ab2e65028858a9f0e30a81fac7c8786c74e2f3f64bc1e6892f83ec8-rootfs.mount: Deactivated successfully. Dec 16 13:17:26.480061 kubelet[3038]: I1216 13:17:26.480026 3038 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Dec 16 13:17:26.521334 systemd[1]: Created slice kubepods-burstable-podc4a23620_f532_415e_b3e7_243feb3b5c27.slice - libcontainer container kubepods-burstable-podc4a23620_f532_415e_b3e7_243feb3b5c27.slice. Dec 16 13:17:26.527002 systemd[1]: Created slice kubepods-besteffort-podfe36e307_b3ce_4532_b1d5_65732a7edfab.slice - libcontainer container kubepods-besteffort-podfe36e307_b3ce_4532_b1d5_65732a7edfab.slice. 
Dec 16 13:17:26.534123 systemd[1]: Created slice kubepods-besteffort-pod4fb5337e_3d19_4a26_9e2e_58b08d0a6154.slice - libcontainer container kubepods-besteffort-pod4fb5337e_3d19_4a26_9e2e_58b08d0a6154.slice. Dec 16 13:17:26.538133 systemd[1]: Created slice kubepods-burstable-podf4f96a7b_1e66_4e29_b3f0_890217e05473.slice - libcontainer container kubepods-burstable-podf4f96a7b_1e66_4e29_b3f0_890217e05473.slice. Dec 16 13:17:26.542571 systemd[1]: Created slice kubepods-besteffort-pod8d366b0a_e187_4483_8bcf_46758c32eaee.slice - libcontainer container kubepods-besteffort-pod8d366b0a_e187_4483_8bcf_46758c32eaee.slice. Dec 16 13:17:26.547580 systemd[1]: Created slice kubepods-besteffort-pod783ebdab_8697_4127_867a_9ceb640da894.slice - libcontainer container kubepods-besteffort-pod783ebdab_8697_4127_867a_9ceb640da894.slice. Dec 16 13:17:26.550559 systemd[1]: Created slice kubepods-besteffort-pod14065d61_1c00_4d68_849b_4eefd74615de.slice - libcontainer container kubepods-besteffort-pod14065d61_1c00_4d68_849b_4eefd74615de.slice. Dec 16 13:17:26.602233 kubelet[3038]: I1216 13:17:26.602166 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14065d61-1c00-4d68-849b-4eefd74615de-tigera-ca-bundle\") pod \"calico-kube-controllers-5d8cb67fb9-kbt5r\" (UID: \"14065d61-1c00-4d68-849b-4eefd74615de\") " pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" Dec 16 13:17:26.602233 kubelet[3038]: I1216 13:17:26.602216 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjsz4\" (UniqueName: \"kubernetes.io/projected/f4f96a7b-1e66-4e29-b3f0-890217e05473-kube-api-access-kjsz4\") pod \"coredns-668d6bf9bc-7jsz2\" (UID: \"f4f96a7b-1e66-4e29-b3f0-890217e05473\") " pod="kube-system/coredns-668d6bf9bc-7jsz2" Dec 16 13:17:26.602233 kubelet[3038]: I1216 13:17:26.602236 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkfg5\" (UniqueName: \"kubernetes.io/projected/14065d61-1c00-4d68-849b-4eefd74615de-kube-api-access-mkfg5\") pod \"calico-kube-controllers-5d8cb67fb9-kbt5r\" (UID: \"14065d61-1c00-4d68-849b-4eefd74615de\") " pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" Dec 16 13:17:26.602427 kubelet[3038]: I1216 13:17:26.602256 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k7p2\" (UniqueName: \"kubernetes.io/projected/8d366b0a-e187-4483-8bcf-46758c32eaee-kube-api-access-5k7p2\") pod \"calico-apiserver-76768d65dd-q8jbl\" (UID: \"8d366b0a-e187-4483-8bcf-46758c32eaee\") " pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" Dec 16 13:17:26.602427 kubelet[3038]: I1216 13:17:26.602278 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn2rm\" (UniqueName: \"kubernetes.io/projected/fe36e307-b3ce-4532-b1d5-65732a7edfab-kube-api-access-tn2rm\") pod \"calico-apiserver-76768d65dd-ncz5v\" (UID: \"fe36e307-b3ce-4532-b1d5-65732a7edfab\") " pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" Dec 16 13:17:26.602427 kubelet[3038]: I1216 13:17:26.602297 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/783ebdab-8697-4127-867a-9ceb640da894-whisker-backend-key-pair\") pod \"whisker-6fd8fcb68d-rk72m\" (UID: 
\"783ebdab-8697-4127-867a-9ceb640da894\") " pod="calico-system/whisker-6fd8fcb68d-rk72m" Dec 16 13:17:26.602427 kubelet[3038]: I1216 13:17:26.602314 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ps7h\" (UniqueName: \"kubernetes.io/projected/783ebdab-8697-4127-867a-9ceb640da894-kube-api-access-2ps7h\") pod \"whisker-6fd8fcb68d-rk72m\" (UID: \"783ebdab-8697-4127-867a-9ceb640da894\") " pod="calico-system/whisker-6fd8fcb68d-rk72m" Dec 16 13:17:26.602427 kubelet[3038]: I1216 13:17:26.602329 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4f96a7b-1e66-4e29-b3f0-890217e05473-config-volume\") pod \"coredns-668d6bf9bc-7jsz2\" (UID: \"f4f96a7b-1e66-4e29-b3f0-890217e05473\") " pod="kube-system/coredns-668d6bf9bc-7jsz2" Dec 16 13:17:26.602574 kubelet[3038]: I1216 13:17:26.602344 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/4fb5337e-3d19-4a26-9e2e-58b08d0a6154-goldmane-key-pair\") pod \"goldmane-666569f655-nz8db\" (UID: \"4fb5337e-3d19-4a26-9e2e-58b08d0a6154\") " pod="calico-system/goldmane-666569f655-nz8db" Dec 16 13:17:26.602574 kubelet[3038]: I1216 13:17:26.602359 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8d366b0a-e187-4483-8bcf-46758c32eaee-calico-apiserver-certs\") pod \"calico-apiserver-76768d65dd-q8jbl\" (UID: \"8d366b0a-e187-4483-8bcf-46758c32eaee\") " pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" Dec 16 13:17:26.602574 kubelet[3038]: I1216 13:17:26.602376 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/fe36e307-b3ce-4532-b1d5-65732a7edfab-calico-apiserver-certs\") pod \"calico-apiserver-76768d65dd-ncz5v\" (UID: \"fe36e307-b3ce-4532-b1d5-65732a7edfab\") " pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" Dec 16 13:17:26.602574 kubelet[3038]: I1216 13:17:26.602391 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4a23620-f532-415e-b3e7-243feb3b5c27-config-volume\") pod \"coredns-668d6bf9bc-xhgqj\" (UID: \"c4a23620-f532-415e-b3e7-243feb3b5c27\") " pod="kube-system/coredns-668d6bf9bc-xhgqj" Dec 16 13:17:26.602574 kubelet[3038]: I1216 13:17:26.602431 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64znz\" (UniqueName: \"kubernetes.io/projected/c4a23620-f532-415e-b3e7-243feb3b5c27-kube-api-access-64znz\") pod \"coredns-668d6bf9bc-xhgqj\" (UID: \"c4a23620-f532-415e-b3e7-243feb3b5c27\") " pod="kube-system/coredns-668d6bf9bc-xhgqj" Dec 16 13:17:26.602685 kubelet[3038]: I1216 13:17:26.602547 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/783ebdab-8697-4127-867a-9ceb640da894-whisker-ca-bundle\") pod \"whisker-6fd8fcb68d-rk72m\" (UID: \"783ebdab-8697-4127-867a-9ceb640da894\") " pod="calico-system/whisker-6fd8fcb68d-rk72m" Dec 16 13:17:26.602685 kubelet[3038]: I1216 13:17:26.602614 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/4fb5337e-3d19-4a26-9e2e-58b08d0a6154-config\") pod \"goldmane-666569f655-nz8db\" (UID: \"4fb5337e-3d19-4a26-9e2e-58b08d0a6154\") " pod="calico-system/goldmane-666569f655-nz8db" Dec 16 13:17:26.602685 kubelet[3038]: I1216 13:17:26.602641 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fb5337e-3d19-4a26-9e2e-58b08d0a6154-goldmane-ca-bundle\") pod \"goldmane-666569f655-nz8db\" (UID: \"4fb5337e-3d19-4a26-9e2e-58b08d0a6154\") " pod="calico-system/goldmane-666569f655-nz8db" Dec 16 13:17:26.602685 kubelet[3038]: I1216 13:17:26.602665 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8hl9\" (UniqueName: \"kubernetes.io/projected/4fb5337e-3d19-4a26-9e2e-58b08d0a6154-kube-api-access-x8hl9\") pod \"goldmane-666569f655-nz8db\" (UID: \"4fb5337e-3d19-4a26-9e2e-58b08d0a6154\") " pod="calico-system/goldmane-666569f655-nz8db" Dec 16 13:17:26.825500 containerd[1785]: time="2025-12-16T13:17:26.825417246Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-xhgqj,Uid:c4a23620-f532-415e-b3e7-243feb3b5c27,Namespace:kube-system,Attempt:0,}" Dec 16 13:17:26.830706 containerd[1785]: time="2025-12-16T13:17:26.830668400Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76768d65dd-ncz5v,Uid:fe36e307-b3ce-4532-b1d5-65732a7edfab,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:17:26.837552 containerd[1785]: time="2025-12-16T13:17:26.837511402Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-nz8db,Uid:4fb5337e-3d19-4a26-9e2e-58b08d0a6154,Namespace:calico-system,Attempt:0,}" Dec 16 13:17:26.841278 containerd[1785]: time="2025-12-16T13:17:26.841249266Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7jsz2,Uid:f4f96a7b-1e66-4e29-b3f0-890217e05473,Namespace:kube-system,Attempt:0,}" Dec 16 13:17:26.845942 containerd[1785]: time="2025-12-16T13:17:26.845910997Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76768d65dd-q8jbl,Uid:8d366b0a-e187-4483-8bcf-46758c32eaee,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:17:26.852404 containerd[1785]: time="2025-12-16T13:17:26.851921209Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6fd8fcb68d-rk72m,Uid:783ebdab-8697-4127-867a-9ceb640da894,Namespace:calico-system,Attempt:0,}" Dec 16 13:17:26.855253 containerd[1785]: time="2025-12-16T13:17:26.855203022Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d8cb67fb9-kbt5r,Uid:14065d61-1c00-4d68-849b-4eefd74615de,Namespace:calico-system,Attempt:0,}" Dec 16 13:17:26.900624 containerd[1785]: time="2025-12-16T13:17:26.900570789Z" level=error msg="Failed to destroy network for sandbox \"7612632a4518f90f0b1100c63298733fb7933679ab190074134faadad86d783c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.904881 containerd[1785]: time="2025-12-16T13:17:26.904810356Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-xhgqj,Uid:c4a23620-f532-415e-b3e7-243feb3b5c27,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"7612632a4518f90f0b1100c63298733fb7933679ab190074134faadad86d783c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.905141 kubelet[3038]: E1216 13:17:26.905063 3038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7612632a4518f90f0b1100c63298733fb7933679ab190074134faadad86d783c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.905199 kubelet[3038]: E1216 13:17:26.905146 3038 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7612632a4518f90f0b1100c63298733fb7933679ab190074134faadad86d783c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-xhgqj" Dec 16 13:17:26.905199 kubelet[3038]: E1216 13:17:26.905170 3038 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7612632a4518f90f0b1100c63298733fb7933679ab190074134faadad86d783c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-xhgqj" Dec 16 13:17:26.905253 kubelet[3038]: E1216 13:17:26.905216 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-xhgqj_kube-system(c4a23620-f532-415e-b3e7-243feb3b5c27)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-xhgqj_kube-system(c4a23620-f532-415e-b3e7-243feb3b5c27)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7612632a4518f90f0b1100c63298733fb7933679ab190074134faadad86d783c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-xhgqj" podUID="c4a23620-f532-415e-b3e7-243feb3b5c27" Dec 16 13:17:26.928357 containerd[1785]: time="2025-12-16T13:17:26.928299555Z" level=error msg="Failed to destroy network for sandbox \"3e314ed546ef9e3a994b61783f2e8a33d908415c55b311129e64988babe8364e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.931269 containerd[1785]: time="2025-12-16T13:17:26.931215764Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76768d65dd-ncz5v,Uid:fe36e307-b3ce-4532-b1d5-65732a7edfab,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e314ed546ef9e3a994b61783f2e8a33d908415c55b311129e64988babe8364e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.931502 kubelet[3038]: E1216 13:17:26.931470 3038 log.go:32] "RunPodSandbox from runtime 
service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e314ed546ef9e3a994b61783f2e8a33d908415c55b311129e64988babe8364e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.931571 kubelet[3038]: E1216 13:17:26.931526 3038 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e314ed546ef9e3a994b61783f2e8a33d908415c55b311129e64988babe8364e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" Dec 16 13:17:26.931571 kubelet[3038]: E1216 13:17:26.931547 3038 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e314ed546ef9e3a994b61783f2e8a33d908415c55b311129e64988babe8364e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" Dec 16 13:17:26.931629 kubelet[3038]: E1216 13:17:26.931595 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-76768d65dd-ncz5v_calico-apiserver(fe36e307-b3ce-4532-b1d5-65732a7edfab)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-76768d65dd-ncz5v_calico-apiserver(fe36e307-b3ce-4532-b1d5-65732a7edfab)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3e314ed546ef9e3a994b61783f2e8a33d908415c55b311129e64988babe8364e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:17:26.934384 containerd[1785]: time="2025-12-16T13:17:26.934340829Z" level=error msg="Failed to destroy network for sandbox \"c0609ad6a5a417f4dfdc97dca8d9b27811c7afe6d2770d8666b6cd26f7abce0e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.934753 containerd[1785]: time="2025-12-16T13:17:26.934717385Z" level=error msg="Failed to destroy network for sandbox \"259f030c2315860ff390ea122c2d0a8a29ae5b008a1969256127b85df1ae980f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.935720 containerd[1785]: time="2025-12-16T13:17:26.935692990Z" level=error msg="Failed to destroy network for sandbox \"dc394488bebcef6baffdf7477fdfe2b652f7fa37af5fc8e4fd5610de1d1b22e9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.936550 containerd[1785]: time="2025-12-16T13:17:26.936493600Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-666569f655-nz8db,Uid:4fb5337e-3d19-4a26-9e2e-58b08d0a6154,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c0609ad6a5a417f4dfdc97dca8d9b27811c7afe6d2770d8666b6cd26f7abce0e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.936861 kubelet[3038]: E1216 13:17:26.936824 3038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c0609ad6a5a417f4dfdc97dca8d9b27811c7afe6d2770d8666b6cd26f7abce0e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.937008 kubelet[3038]: E1216 13:17:26.936890 3038 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c0609ad6a5a417f4dfdc97dca8d9b27811c7afe6d2770d8666b6cd26f7abce0e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-nz8db" Dec 16 13:17:26.937008 kubelet[3038]: E1216 13:17:26.936922 3038 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c0609ad6a5a417f4dfdc97dca8d9b27811c7afe6d2770d8666b6cd26f7abce0e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-nz8db" Dec 16 13:17:26.937008 kubelet[3038]: E1216 13:17:26.936963 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-nz8db_calico-system(4fb5337e-3d19-4a26-9e2e-58b08d0a6154)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-nz8db_calico-system(4fb5337e-3d19-4a26-9e2e-58b08d0a6154)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c0609ad6a5a417f4dfdc97dca8d9b27811c7afe6d2770d8666b6cd26f7abce0e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:17:26.938095 containerd[1785]: time="2025-12-16T13:17:26.937979627Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6fd8fcb68d-rk72m,Uid:783ebdab-8697-4127-867a-9ceb640da894,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"259f030c2315860ff390ea122c2d0a8a29ae5b008a1969256127b85df1ae980f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.939135 kubelet[3038]: E1216 13:17:26.938161 3038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"259f030c2315860ff390ea122c2d0a8a29ae5b008a1969256127b85df1ae980f\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.939135 kubelet[3038]: E1216 13:17:26.938203 3038 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"259f030c2315860ff390ea122c2d0a8a29ae5b008a1969256127b85df1ae980f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6fd8fcb68d-rk72m" Dec 16 13:17:26.939135 kubelet[3038]: E1216 13:17:26.938219 3038 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"259f030c2315860ff390ea122c2d0a8a29ae5b008a1969256127b85df1ae980f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6fd8fcb68d-rk72m" Dec 16 13:17:26.939256 kubelet[3038]: E1216 13:17:26.938253 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6fd8fcb68d-rk72m_calico-system(783ebdab-8697-4127-867a-9ceb640da894)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6fd8fcb68d-rk72m_calico-system(783ebdab-8697-4127-867a-9ceb640da894)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"259f030c2315860ff390ea122c2d0a8a29ae5b008a1969256127b85df1ae980f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6fd8fcb68d-rk72m" podUID="783ebdab-8697-4127-867a-9ceb640da894" Dec 16 13:17:26.939677 containerd[1785]: time="2025-12-16T13:17:26.939645677Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7jsz2,Uid:f4f96a7b-1e66-4e29-b3f0-890217e05473,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dc394488bebcef6baffdf7477fdfe2b652f7fa37af5fc8e4fd5610de1d1b22e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.939853 kubelet[3038]: E1216 13:17:26.939819 3038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dc394488bebcef6baffdf7477fdfe2b652f7fa37af5fc8e4fd5610de1d1b22e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.939892 kubelet[3038]: E1216 13:17:26.939875 3038 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dc394488bebcef6baffdf7477fdfe2b652f7fa37af5fc8e4fd5610de1d1b22e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-7jsz2" Dec 16 13:17:26.939917 kubelet[3038]: E1216 13:17:26.939896 3038 
kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dc394488bebcef6baffdf7477fdfe2b652f7fa37af5fc8e4fd5610de1d1b22e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-7jsz2" Dec 16 13:17:26.939958 kubelet[3038]: E1216 13:17:26.939937 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-7jsz2_kube-system(f4f96a7b-1e66-4e29-b3f0-890217e05473)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-7jsz2_kube-system(f4f96a7b-1e66-4e29-b3f0-890217e05473)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dc394488bebcef6baffdf7477fdfe2b652f7fa37af5fc8e4fd5610de1d1b22e9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-7jsz2" podUID="f4f96a7b-1e66-4e29-b3f0-890217e05473" Dec 16 13:17:26.945480 containerd[1785]: time="2025-12-16T13:17:26.945423172Z" level=error msg="Failed to destroy network for sandbox \"3f0d8d802d41856216203b2b6bfb58bab3c3b5b31f306f0f32b78e1571c053dc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.947545 containerd[1785]: time="2025-12-16T13:17:26.947346794Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d8cb67fb9-kbt5r,Uid:14065d61-1c00-4d68-849b-4eefd74615de,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f0d8d802d41856216203b2b6bfb58bab3c3b5b31f306f0f32b78e1571c053dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.947985 kubelet[3038]: E1216 13:17:26.947548 3038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f0d8d802d41856216203b2b6bfb58bab3c3b5b31f306f0f32b78e1571c053dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.947985 kubelet[3038]: E1216 13:17:26.947599 3038 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f0d8d802d41856216203b2b6bfb58bab3c3b5b31f306f0f32b78e1571c053dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" Dec 16 13:17:26.947985 kubelet[3038]: E1216 13:17:26.947619 3038 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f0d8d802d41856216203b2b6bfb58bab3c3b5b31f306f0f32b78e1571c053dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" Dec 16 13:17:26.948101 kubelet[3038]: E1216 13:17:26.947657 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5d8cb67fb9-kbt5r_calico-system(14065d61-1c00-4d68-849b-4eefd74615de)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5d8cb67fb9-kbt5r_calico-system(14065d61-1c00-4d68-849b-4eefd74615de)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3f0d8d802d41856216203b2b6bfb58bab3c3b5b31f306f0f32b78e1571c053dc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:17:26.948986 containerd[1785]: time="2025-12-16T13:17:26.948948824Z" level=error msg="Failed to destroy network for sandbox \"d57a752a91ca2cad5076ca08175446f1ab4ccccf4cfb640b9716220195b7c543\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.950621 containerd[1785]: time="2025-12-16T13:17:26.950588490Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76768d65dd-q8jbl,Uid:8d366b0a-e187-4483-8bcf-46758c32eaee,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d57a752a91ca2cad5076ca08175446f1ab4ccccf4cfb640b9716220195b7c543\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.950772 kubelet[3038]: E1216 13:17:26.950747 3038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d57a752a91ca2cad5076ca08175446f1ab4ccccf4cfb640b9716220195b7c543\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:26.950805 kubelet[3038]: E1216 13:17:26.950785 3038 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d57a752a91ca2cad5076ca08175446f1ab4ccccf4cfb640b9716220195b7c543\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" Dec 16 13:17:26.950833 kubelet[3038]: E1216 13:17:26.950802 3038 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d57a752a91ca2cad5076ca08175446f1ab4ccccf4cfb640b9716220195b7c543\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" Dec 16 13:17:26.950859 kubelet[3038]: E1216 13:17:26.950839 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" 
for \"calico-apiserver-76768d65dd-q8jbl_calico-apiserver(8d366b0a-e187-4483-8bcf-46758c32eaee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-76768d65dd-q8jbl_calico-apiserver(8d366b0a-e187-4483-8bcf-46758c32eaee)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d57a752a91ca2cad5076ca08175446f1ab4ccccf4cfb640b9716220195b7c543\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:17:27.135969 systemd[1]: Created slice kubepods-besteffort-pod2c2ad34d_d82b_4624_87cd_76ece6a8970b.slice - libcontainer container kubepods-besteffort-pod2c2ad34d_d82b_4624_87cd_76ece6a8970b.slice. Dec 16 13:17:27.138601 containerd[1785]: time="2025-12-16T13:17:27.138561117Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4vgpv,Uid:2c2ad34d-d82b-4624-87cd-76ece6a8970b,Namespace:calico-system,Attempt:0,}" Dec 16 13:17:27.189889 containerd[1785]: time="2025-12-16T13:17:27.189826635Z" level=error msg="Failed to destroy network for sandbox \"01b31f779e6cd4826ea011086bed1bf6231ddf98ab6ccaf45d63b5ca62033a32\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:27.192065 containerd[1785]: time="2025-12-16T13:17:27.192026490Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4vgpv,Uid:2c2ad34d-d82b-4624-87cd-76ece6a8970b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"01b31f779e6cd4826ea011086bed1bf6231ddf98ab6ccaf45d63b5ca62033a32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:27.192381 kubelet[3038]: E1216 13:17:27.192317 3038 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01b31f779e6cd4826ea011086bed1bf6231ddf98ab6ccaf45d63b5ca62033a32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:17:27.192436 kubelet[3038]: E1216 13:17:27.192409 3038 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01b31f779e6cd4826ea011086bed1bf6231ddf98ab6ccaf45d63b5ca62033a32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-4vgpv" Dec 16 13:17:27.192473 kubelet[3038]: E1216 13:17:27.192440 3038 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01b31f779e6cd4826ea011086bed1bf6231ddf98ab6ccaf45d63b5ca62033a32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-4vgpv" Dec 16 13:17:27.192550 
kubelet[3038]: E1216 13:17:27.192518 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-4vgpv_calico-system(2c2ad34d-d82b-4624-87cd-76ece6a8970b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-4vgpv_calico-system(2c2ad34d-d82b-4624-87cd-76ece6a8970b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"01b31f779e6cd4826ea011086bed1bf6231ddf98ab6ccaf45d63b5ca62033a32\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:17:27.214296 containerd[1785]: time="2025-12-16T13:17:27.214239343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 13:17:27.841936 systemd[1]: run-netns-cni\x2d8a0a52e5\x2dd993\x2d6662\x2d41bd\x2d4a35c3001e31.mount: Deactivated successfully. Dec 16 13:17:27.842152 systemd[1]: run-netns-cni\x2de3d10277\x2db5d5\x2ddc20\x2d0d11\x2d9c27f376a08d.mount: Deactivated successfully. Dec 16 13:17:27.842276 systemd[1]: run-netns-cni\x2d198722e2\x2d7faf\x2da616\x2d05fc\x2dff31998aec9b.mount: Deactivated successfully. Dec 16 13:17:27.842406 systemd[1]: run-netns-cni\x2d39683cb6\x2dd8a5\x2dc856\x2dd346\x2d08ff15652e35.mount: Deactivated successfully. Dec 16 13:17:27.842577 systemd[1]: run-netns-cni\x2dbb8ab409\x2da62b\x2d2ee0\x2d3140\x2d2f033d4db778.mount: Deactivated successfully. Dec 16 13:17:27.842708 systemd[1]: run-netns-cni\x2d9d561b07\x2d9470\x2dcada\x2d975f\x2d1bf088b771f5.mount: Deactivated successfully. Dec 16 13:17:27.842856 systemd[1]: run-netns-cni\x2d4d9676b1\x2dc7f2\x2d8ef2\x2d403c\x2d03e5212a9bb9.mount: Deactivated successfully. Dec 16 13:17:33.926735 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2541462553.mount: Deactivated successfully. 
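Every RunPodSandbox failure above carries the same root cause: the Calico CNI plugin resolves the node name from /var/lib/calico/nodename, a file the calico/node container writes once it starts, and at this point that container's image is still being pulled. A sketch of the precondition follows; it is illustrative only, and the real plugin also honors configuration and environment overrides for the node name.

```go
// Sketch of the precondition behind the sandbox failures above: until
// calico/node has written /var/lib/calico/nodename, every CNI add/delete
// is rejected with the stat error seen in the log.
package main

import (
	"fmt"
	"os"
	"strings"
)

const nodenameFile = "/var/lib/calico/nodename"

func detectNodename() (string, error) {
	if _, err := os.Stat(nodenameFile); err != nil {
		// Mirrors the logged failure, e.g. "stat /var/lib/calico/nodename:
		// no such file or directory: check that the calico/node container
		// is running and has mounted /var/lib/calico/"
		return "", fmt.Errorf("%v: check that the calico/node container is running and has mounted /var/lib/calico/", err)
	}
	data, err := os.ReadFile(nodenameFile)
	if err != nil {
		return "", err
	}
	return strings.TrimSpace(string(data)), nil
}

func main() {
	name, err := detectNodename()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("nodename:", name)
}
```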
Dec 16 13:17:33.950921 containerd[1785]: time="2025-12-16T13:17:33.950867343Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:17:33.951959 containerd[1785]: time="2025-12-16T13:17:33.951940332Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Dec 16 13:17:33.953508 containerd[1785]: time="2025-12-16T13:17:33.953423780Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:17:33.955629 containerd[1785]: time="2025-12-16T13:17:33.955593234Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:17:33.955983 containerd[1785]: time="2025-12-16T13:17:33.955958164Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 6.741674001s" Dec 16 13:17:33.956015 containerd[1785]: time="2025-12-16T13:17:33.955989568Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Dec 16 13:17:33.964462 containerd[1785]: time="2025-12-16T13:17:33.964416412Z" level=info msg="CreateContainer within sandbox \"b3d6a73f7d6057dd9b10e8911e1df9823eff8f3aea25eaa820db1b40e9d5f187\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 13:17:33.984640 containerd[1785]: time="2025-12-16T13:17:33.984591824Z" level=info msg="Container a4eefd4a29669b73a4efc1e8e63837f58e8beb4918214be378e8c4d722bd38f6: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:17:33.999122 containerd[1785]: time="2025-12-16T13:17:33.999071654Z" level=info msg="CreateContainer within sandbox \"b3d6a73f7d6057dd9b10e8911e1df9823eff8f3aea25eaa820db1b40e9d5f187\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"a4eefd4a29669b73a4efc1e8e63837f58e8beb4918214be378e8c4d722bd38f6\"" Dec 16 13:17:33.999858 containerd[1785]: time="2025-12-16T13:17:33.999594203Z" level=info msg="StartContainer for \"a4eefd4a29669b73a4efc1e8e63837f58e8beb4918214be378e8c4d722bd38f6\"" Dec 16 13:17:34.001083 containerd[1785]: time="2025-12-16T13:17:34.001055128Z" level=info msg="connecting to shim a4eefd4a29669b73a4efc1e8e63837f58e8beb4918214be378e8c4d722bd38f6" address="unix:///run/containerd/s/84d815db7ca429ed09e54e5657895524671cd67e067043e879559a5ef3725011" protocol=ttrpc version=3 Dec 16 13:17:34.025643 systemd[1]: Started cri-containerd-a4eefd4a29669b73a4efc1e8e63837f58e8beb4918214be378e8c4d722bd38f6.scope - libcontainer container a4eefd4a29669b73a4efc1e8e63837f58e8beb4918214be378e8c4d722bd38f6. Dec 16 13:17:34.123658 containerd[1785]: time="2025-12-16T13:17:34.123620428Z" level=info msg="StartContainer for \"a4eefd4a29669b73a4efc1e8e63837f58e8beb4918214be378e8c4d722bd38f6\" returns successfully" Dec 16 13:17:34.243737 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 16 13:17:34.243814 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Dec 16 13:17:34.328732 kubelet[3038]: I1216 13:17:34.328676 3038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-7sz26" podStartSLOduration=1.031493533 podStartE2EDuration="17.328659199s" podCreationTimestamp="2025-12-16 13:17:17 +0000 UTC" firstStartedPulling="2025-12-16 13:17:17.65936445 +0000 UTC m=+18.611051512" lastFinishedPulling="2025-12-16 13:17:33.956530117 +0000 UTC m=+34.908217178" observedRunningTime="2025-12-16 13:17:34.253690876 +0000 UTC m=+35.205377971" watchObservedRunningTime="2025-12-16 13:17:34.328659199 +0000 UTC m=+35.280346277" Dec 16 13:17:34.360040 kubelet[3038]: I1216 13:17:34.359975 3038 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/783ebdab-8697-4127-867a-9ceb640da894-whisker-ca-bundle\") pod \"783ebdab-8697-4127-867a-9ceb640da894\" (UID: \"783ebdab-8697-4127-867a-9ceb640da894\") " Dec 16 13:17:34.360040 kubelet[3038]: I1216 13:17:34.360027 3038 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/783ebdab-8697-4127-867a-9ceb640da894-whisker-backend-key-pair\") pod \"783ebdab-8697-4127-867a-9ceb640da894\" (UID: \"783ebdab-8697-4127-867a-9ceb640da894\") " Dec 16 13:17:34.360196 kubelet[3038]: I1216 13:17:34.360081 3038 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ps7h\" (UniqueName: \"kubernetes.io/projected/783ebdab-8697-4127-867a-9ceb640da894-kube-api-access-2ps7h\") pod \"783ebdab-8697-4127-867a-9ceb640da894\" (UID: \"783ebdab-8697-4127-867a-9ceb640da894\") " Dec 16 13:17:34.360480 kubelet[3038]: I1216 13:17:34.360403 3038 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783ebdab-8697-4127-867a-9ceb640da894-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "783ebdab-8697-4127-867a-9ceb640da894" (UID: "783ebdab-8697-4127-867a-9ceb640da894"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 13:17:34.365355 kubelet[3038]: I1216 13:17:34.365260 3038 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783ebdab-8697-4127-867a-9ceb640da894-kube-api-access-2ps7h" (OuterVolumeSpecName: "kube-api-access-2ps7h") pod "783ebdab-8697-4127-867a-9ceb640da894" (UID: "783ebdab-8697-4127-867a-9ceb640da894"). InnerVolumeSpecName "kube-api-access-2ps7h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 13:17:34.366606 kubelet[3038]: I1216 13:17:34.366552 3038 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783ebdab-8697-4127-867a-9ceb640da894-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "783ebdab-8697-4127-867a-9ceb640da894" (UID: "783ebdab-8697-4127-867a-9ceb640da894"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 13:17:34.460881 kubelet[3038]: I1216 13:17:34.460802 3038 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2ps7h\" (UniqueName: \"kubernetes.io/projected/783ebdab-8697-4127-867a-9ceb640da894-kube-api-access-2ps7h\") on node \"ci-4459-2-2-0-839c7337fa\" DevicePath \"\"" Dec 16 13:17:34.460881 kubelet[3038]: I1216 13:17:34.460837 3038 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/783ebdab-8697-4127-867a-9ceb640da894-whisker-ca-bundle\") on node \"ci-4459-2-2-0-839c7337fa\" DevicePath \"\"" Dec 16 13:17:34.460881 kubelet[3038]: I1216 13:17:34.460848 3038 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/783ebdab-8697-4127-867a-9ceb640da894-whisker-backend-key-pair\") on node \"ci-4459-2-2-0-839c7337fa\" DevicePath \"\"" Dec 16 13:17:34.932494 systemd[1]: var-lib-kubelet-pods-783ebdab\x2d8697\x2d4127\x2d867a\x2d9ceb640da894-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d2ps7h.mount: Deactivated successfully. Dec 16 13:17:34.932768 systemd[1]: var-lib-kubelet-pods-783ebdab\x2d8697\x2d4127\x2d867a\x2d9ceb640da894-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 16 13:17:35.144835 systemd[1]: Removed slice kubepods-besteffort-pod783ebdab_8697_4127_867a_9ceb640da894.slice - libcontainer container kubepods-besteffort-pod783ebdab_8697_4127_867a_9ceb640da894.slice. Dec 16 13:17:35.320193 systemd[1]: Created slice kubepods-besteffort-podcb2a00f3_8bc2_4b68_a631_62b5470c7b77.slice - libcontainer container kubepods-besteffort-podcb2a00f3_8bc2_4b68_a631_62b5470c7b77.slice. Dec 16 13:17:35.366673 kubelet[3038]: I1216 13:17:35.366614 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cb2a00f3-8bc2-4b68-a631-62b5470c7b77-whisker-ca-bundle\") pod \"whisker-7f659d6578-trd8w\" (UID: \"cb2a00f3-8bc2-4b68-a631-62b5470c7b77\") " pod="calico-system/whisker-7f659d6578-trd8w" Dec 16 13:17:35.366673 kubelet[3038]: I1216 13:17:35.366673 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkx8g\" (UniqueName: \"kubernetes.io/projected/cb2a00f3-8bc2-4b68-a631-62b5470c7b77-kube-api-access-tkx8g\") pod \"whisker-7f659d6578-trd8w\" (UID: \"cb2a00f3-8bc2-4b68-a631-62b5470c7b77\") " pod="calico-system/whisker-7f659d6578-trd8w" Dec 16 13:17:35.366673 kubelet[3038]: I1216 13:17:35.366694 3038 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cb2a00f3-8bc2-4b68-a631-62b5470c7b77-whisker-backend-key-pair\") pod \"whisker-7f659d6578-trd8w\" (UID: \"cb2a00f3-8bc2-4b68-a631-62b5470c7b77\") " pod="calico-system/whisker-7f659d6578-trd8w" Dec 16 13:17:35.623853 containerd[1785]: time="2025-12-16T13:17:35.623702611Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f659d6578-trd8w,Uid:cb2a00f3-8bc2-4b68-a631-62b5470c7b77,Namespace:calico-system,Attempt:0,}" Dec 16 13:17:35.720748 systemd-networkd[1596]: calid30f8b59eb7: Link UP Dec 16 13:17:35.721278 systemd-networkd[1596]: calid30f8b59eb7: Gained carrier Dec 16 13:17:35.736959 containerd[1785]: 2025-12-16 13:17:35.654 [INFO][4479] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: 
&{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-eth0 whisker-7f659d6578- calico-system cb2a00f3-8bc2-4b68-a631-62b5470c7b77 860 0 2025-12-16 13:17:35 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7f659d6578 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-0-839c7337fa whisker-7f659d6578-trd8w eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calid30f8b59eb7 [] [] }} ContainerID="802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" Namespace="calico-system" Pod="whisker-7f659d6578-trd8w" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-" Dec 16 13:17:35.736959 containerd[1785]: 2025-12-16 13:17:35.655 [INFO][4479] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" Namespace="calico-system" Pod="whisker-7f659d6578-trd8w" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-eth0" Dec 16 13:17:35.736959 containerd[1785]: 2025-12-16 13:17:35.677 [INFO][4497] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" HandleID="k8s-pod-network.802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" Workload="ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-eth0" Dec 16 13:17:35.737207 containerd[1785]: 2025-12-16 13:17:35.677 [INFO][4497] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" HandleID="k8s-pod-network.802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" Workload="ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000139700), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-0-839c7337fa", "pod":"whisker-7f659d6578-trd8w", "timestamp":"2025-12-16 13:17:35.677238652 +0000 UTC"}, Hostname:"ci-4459-2-2-0-839c7337fa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:17:35.737207 containerd[1785]: 2025-12-16 13:17:35.677 [INFO][4497] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:17:35.737207 containerd[1785]: 2025-12-16 13:17:35.677 [INFO][4497] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:17:35.737207 containerd[1785]: 2025-12-16 13:17:35.677 [INFO][4497] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-839c7337fa' Dec 16 13:17:35.737207 containerd[1785]: 2025-12-16 13:17:35.684 [INFO][4497] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:35.737207 containerd[1785]: 2025-12-16 13:17:35.688 [INFO][4497] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:35.737207 containerd[1785]: 2025-12-16 13:17:35.692 [INFO][4497] ipam/ipam.go 511: Trying affinity for 192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:35.737207 containerd[1785]: 2025-12-16 13:17:35.694 [INFO][4497] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:35.737207 containerd[1785]: 2025-12-16 13:17:35.695 [INFO][4497] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:35.737425 containerd[1785]: 2025-12-16 13:17:35.695 [INFO][4497] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:35.737425 containerd[1785]: 2025-12-16 13:17:35.697 [INFO][4497] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f Dec 16 13:17:35.737425 containerd[1785]: 2025-12-16 13:17:35.704 [INFO][4497] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:35.737425 containerd[1785]: 2025-12-16 13:17:35.709 [INFO][4497] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.96.129/26] block=192.168.96.128/26 handle="k8s-pod-network.802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:35.737425 containerd[1785]: 2025-12-16 13:17:35.709 [INFO][4497] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.129/26] handle="k8s-pod-network.802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:35.737425 containerd[1785]: 2025-12-16 13:17:35.709 [INFO][4497] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
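The IPAM trace above shows Calico's block-affinity allocation: the host holds an affinity for the block 192.168.96.128/26, so the allocator takes the host-wide lock, confirms and loads the block, claims 192.168.96.129 from it, and releases the lock. A quick containment check with the values from the log, using the standard library's net/netip; this is a sketch of the arithmetic, not Calico's allocator.

    package main

    import (
        "fmt"
        "net/netip"
    )

    func main() {
        // Block affinity and assigned address as reported by ipam/ipam.go above.
        block := netip.MustParsePrefix("192.168.96.128/26")
        addr := netip.MustParseAddr("192.168.96.129")

        fmt.Printf("%s in %s: %v\n", addr, block, block.Contains(addr)) // true
        // A /26 leaves 6 host bits, i.e. 64 addresses per block.
        fmt.Printf("addresses per block: %d\n", 1<<(32-block.Bits()))
    }

The later pods in this log draw from the same block, which is why their addresses come out sequentially (.129, .130, .131).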
Dec 16 13:17:35.737425 containerd[1785]: 2025-12-16 13:17:35.709 [INFO][4497] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.129/26] IPv6=[] ContainerID="802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" HandleID="k8s-pod-network.802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" Workload="ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-eth0" Dec 16 13:17:35.737565 containerd[1785]: 2025-12-16 13:17:35.711 [INFO][4479] cni-plugin/k8s.go 418: Populated endpoint ContainerID="802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" Namespace="calico-system" Pod="whisker-7f659d6578-trd8w" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-eth0", GenerateName:"whisker-7f659d6578-", Namespace:"calico-system", SelfLink:"", UID:"cb2a00f3-8bc2-4b68-a631-62b5470c7b77", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7f659d6578", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"", Pod:"whisker-7f659d6578-trd8w", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.96.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid30f8b59eb7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:35.737565 containerd[1785]: 2025-12-16 13:17:35.712 [INFO][4479] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.129/32] ContainerID="802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" Namespace="calico-system" Pod="whisker-7f659d6578-trd8w" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-eth0" Dec 16 13:17:35.737634 containerd[1785]: 2025-12-16 13:17:35.712 [INFO][4479] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid30f8b59eb7 ContainerID="802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" Namespace="calico-system" Pod="whisker-7f659d6578-trd8w" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-eth0" Dec 16 13:17:35.737634 containerd[1785]: 2025-12-16 13:17:35.721 [INFO][4479] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" Namespace="calico-system" Pod="whisker-7f659d6578-trd8w" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-eth0" Dec 16 13:17:35.737675 containerd[1785]: 2025-12-16 13:17:35.721 [INFO][4479] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" 
Namespace="calico-system" Pod="whisker-7f659d6578-trd8w" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-eth0", GenerateName:"whisker-7f659d6578-", Namespace:"calico-system", SelfLink:"", UID:"cb2a00f3-8bc2-4b68-a631-62b5470c7b77", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7f659d6578", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f", Pod:"whisker-7f659d6578-trd8w", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.96.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid30f8b59eb7", MAC:"82:8f:9e:9c:ae:88", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:35.737722 containerd[1785]: 2025-12-16 13:17:35.734 [INFO][4479] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" Namespace="calico-system" Pod="whisker-7f659d6578-trd8w" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-whisker--7f659d6578--trd8w-eth0" Dec 16 13:17:35.767430 containerd[1785]: time="2025-12-16T13:17:35.765855029Z" level=info msg="connecting to shim 802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f" address="unix:///run/containerd/s/34a5f4949c3b4f4575161616c6676e06ea639cb87d1c76a194290a14ed081929" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:17:35.791683 systemd[1]: Started cri-containerd-802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f.scope - libcontainer container 802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f. 
Dec 16 13:17:35.844074 containerd[1785]: time="2025-12-16T13:17:35.844028110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f659d6578-trd8w,Uid:cb2a00f3-8bc2-4b68-a631-62b5470c7b77,Namespace:calico-system,Attempt:0,} returns sandbox id \"802f4a3e5814401ef67baa4667110e2dd48e76210410afd04466688ae8c3781f\"" Dec 16 13:17:35.846193 containerd[1785]: time="2025-12-16T13:17:35.846172592Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:17:35.853763 systemd-networkd[1596]: vxlan.calico: Link UP Dec 16 13:17:35.853770 systemd-networkd[1596]: vxlan.calico: Gained carrier Dec 16 13:17:36.193815 containerd[1785]: time="2025-12-16T13:17:36.193735482Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:36.195856 containerd[1785]: time="2025-12-16T13:17:36.195755494Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:17:36.196020 containerd[1785]: time="2025-12-16T13:17:36.195790211Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:17:36.196145 kubelet[3038]: E1216 13:17:36.196077 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:17:36.196207 kubelet[3038]: E1216 13:17:36.196158 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:17:36.196329 kubelet[3038]: E1216 13:17:36.196292 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:dda5a456290f47049c1dcb0ab4e8d120,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tkx8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f659d6578-trd8w_calico-system(cb2a00f3-8bc2-4b68-a631-62b5470c7b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:36.198708 containerd[1785]: time="2025-12-16T13:17:36.198652812Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:17:36.546423 containerd[1785]: time="2025-12-16T13:17:36.546335026Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:36.548804 containerd[1785]: time="2025-12-16T13:17:36.548730833Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:17:36.548886 containerd[1785]: time="2025-12-16T13:17:36.548789883Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:17:36.549119 kubelet[3038]: E1216 13:17:36.549059 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:17:36.549609 kubelet[3038]: E1216 13:17:36.549135 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:17:36.549642 kubelet[3038]: E1216 13:17:36.549337 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkx8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f659d6578-trd8w_calico-system(cb2a00f3-8bc2-4b68-a631-62b5470c7b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:36.550631 kubelet[3038]: E1216 13:17:36.550565 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:17:36.989991 systemd-networkd[1596]: calid30f8b59eb7: Gained IPv6LL Dec 16 13:17:37.129489 containerd[1785]: time="2025-12-16T13:17:37.129368190Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-76768d65dd-q8jbl,Uid:8d366b0a-e187-4483-8bcf-46758c32eaee,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:17:37.133245 kubelet[3038]: I1216 13:17:37.133184 3038 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783ebdab-8697-4127-867a-9ceb640da894" path="/var/lib/kubelet/pods/783ebdab-8697-4127-867a-9ceb640da894/volumes" Dec 16 13:17:37.241870 kubelet[3038]: E1216 13:17:37.241703 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:17:37.249556 systemd-networkd[1596]: cali1d48a72481e: Link UP Dec 16 13:17:37.249835 systemd-networkd[1596]: cali1d48a72481e: Gained carrier Dec 16 13:17:37.265078 containerd[1785]: 2025-12-16 13:17:37.177 [INFO][4641] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-eth0 calico-apiserver-76768d65dd- calico-apiserver 8d366b0a-e187-4483-8bcf-46758c32eaee 789 0 2025-12-16 13:17:13 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:76768d65dd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-0-839c7337fa calico-apiserver-76768d65dd-q8jbl eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1d48a72481e [] [] }} ContainerID="3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-q8jbl" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-" Dec 16 13:17:37.265078 containerd[1785]: 2025-12-16 13:17:37.177 [INFO][4641] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-q8jbl" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-eth0" Dec 16 13:17:37.265078 containerd[1785]: 2025-12-16 13:17:37.208 [INFO][4658] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" HandleID="k8s-pod-network.3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" Workload="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-eth0" Dec 16 13:17:37.265365 containerd[1785]: 2025-12-16 13:17:37.208 [INFO][4658] ipam/ipam_plugin.go 275: Auto assigning IP 
ContainerID="3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" HandleID="k8s-pod-network.3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" Workload="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002df5f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-0-839c7337fa", "pod":"calico-apiserver-76768d65dd-q8jbl", "timestamp":"2025-12-16 13:17:37.208412615 +0000 UTC"}, Hostname:"ci-4459-2-2-0-839c7337fa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:17:37.265365 containerd[1785]: 2025-12-16 13:17:37.208 [INFO][4658] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:17:37.265365 containerd[1785]: 2025-12-16 13:17:37.209 [INFO][4658] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 13:17:37.265365 containerd[1785]: 2025-12-16 13:17:37.209 [INFO][4658] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-839c7337fa' Dec 16 13:17:37.265365 containerd[1785]: 2025-12-16 13:17:37.217 [INFO][4658] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:37.265365 containerd[1785]: 2025-12-16 13:17:37.223 [INFO][4658] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:37.265365 containerd[1785]: 2025-12-16 13:17:37.228 [INFO][4658] ipam/ipam.go 511: Trying affinity for 192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:37.265365 containerd[1785]: 2025-12-16 13:17:37.229 [INFO][4658] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:37.265365 containerd[1785]: 2025-12-16 13:17:37.231 [INFO][4658] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:37.265605 containerd[1785]: 2025-12-16 13:17:37.231 [INFO][4658] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:37.265605 containerd[1785]: 2025-12-16 13:17:37.232 [INFO][4658] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830 Dec 16 13:17:37.265605 containerd[1785]: 2025-12-16 13:17:37.237 [INFO][4658] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:37.265605 containerd[1785]: 2025-12-16 13:17:37.244 [INFO][4658] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.96.130/26] block=192.168.96.128/26 handle="k8s-pod-network.3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:37.265605 containerd[1785]: 2025-12-16 13:17:37.244 [INFO][4658] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.130/26] handle="k8s-pod-network.3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:37.265605 containerd[1785]: 2025-12-16 
13:17:37.244 [INFO][4658] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 13:17:37.265605 containerd[1785]: 2025-12-16 13:17:37.244 [INFO][4658] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.130/26] IPv6=[] ContainerID="3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" HandleID="k8s-pod-network.3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" Workload="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-eth0" Dec 16 13:17:37.265766 containerd[1785]: 2025-12-16 13:17:37.247 [INFO][4641] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-q8jbl" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-eth0", GenerateName:"calico-apiserver-76768d65dd-", Namespace:"calico-apiserver", SelfLink:"", UID:"8d366b0a-e187-4483-8bcf-46758c32eaee", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76768d65dd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"", Pod:"calico-apiserver-76768d65dd-q8jbl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1d48a72481e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:37.265833 containerd[1785]: 2025-12-16 13:17:37.247 [INFO][4641] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.130/32] ContainerID="3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-q8jbl" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-eth0" Dec 16 13:17:37.265833 containerd[1785]: 2025-12-16 13:17:37.247 [INFO][4641] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1d48a72481e ContainerID="3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-q8jbl" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-eth0" Dec 16 13:17:37.265833 containerd[1785]: 2025-12-16 13:17:37.249 [INFO][4641] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-q8jbl" 
WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-eth0" Dec 16 13:17:37.265901 containerd[1785]: 2025-12-16 13:17:37.250 [INFO][4641] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-q8jbl" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-eth0", GenerateName:"calico-apiserver-76768d65dd-", Namespace:"calico-apiserver", SelfLink:"", UID:"8d366b0a-e187-4483-8bcf-46758c32eaee", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76768d65dd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830", Pod:"calico-apiserver-76768d65dd-q8jbl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1d48a72481e", MAC:"be:e7:82:64:60:b8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:37.266012 containerd[1785]: 2025-12-16 13:17:37.263 [INFO][4641] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-q8jbl" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--q8jbl-eth0" Dec 16 13:17:37.303002 containerd[1785]: time="2025-12-16T13:17:37.302948728Z" level=info msg="connecting to shim 3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830" address="unix:///run/containerd/s/95f2e934f47709afcae392ba8ee5be09d0dc91d67495c3546003140ba2324a57" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:17:37.329681 systemd[1]: Started cri-containerd-3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830.scope - libcontainer container 3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830. 
Dec 16 13:17:37.376461 containerd[1785]: time="2025-12-16T13:17:37.376412411Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76768d65dd-q8jbl,Uid:8d366b0a-e187-4483-8bcf-46758c32eaee,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"3cd2f4475d9ad59a6aaaf66b4c43cfa05d587340d5da182d0423fedbedc36830\"" Dec 16 13:17:37.377928 containerd[1785]: time="2025-12-16T13:17:37.377894219Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:17:37.746091 containerd[1785]: time="2025-12-16T13:17:37.745992945Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:37.748340 containerd[1785]: time="2025-12-16T13:17:37.748233674Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:17:37.748562 containerd[1785]: time="2025-12-16T13:17:37.748398967Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:17:37.748772 kubelet[3038]: E1216 13:17:37.748701 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:17:37.749323 kubelet[3038]: E1216 13:17:37.748781 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:17:37.749323 kubelet[3038]: E1216 13:17:37.749008 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5k7p2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76768d65dd-q8jbl_calico-apiserver(8d366b0a-e187-4483-8bcf-46758c32eaee): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:37.750315 kubelet[3038]: E1216 13:17:37.750236 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:17:37.821772 systemd-networkd[1596]: vxlan.calico: Gained IPv6LL Dec 16 13:17:38.130293 containerd[1785]: time="2025-12-16T13:17:38.130117239Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4vgpv,Uid:2c2ad34d-d82b-4624-87cd-76ece6a8970b,Namespace:calico-system,Attempt:0,}" Dec 16 13:17:38.245230 kubelet[3038]: E1216 13:17:38.245173 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:17:38.285802 systemd-networkd[1596]: cali1d078a501ae: Link UP Dec 16 13:17:38.286097 systemd-networkd[1596]: cali1d078a501ae: Gained carrier Dec 16 13:17:38.299546 containerd[1785]: 2025-12-16 13:17:38.178 [INFO][4729] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-eth0 csi-node-driver- calico-system 2c2ad34d-d82b-4624-87cd-76ece6a8970b 683 0 2025-12-16 13:17:17 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s 
ci-4459-2-2-0-839c7337fa csi-node-driver-4vgpv eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali1d078a501ae [] [] }} ContainerID="c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" Namespace="calico-system" Pod="csi-node-driver-4vgpv" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-" Dec 16 13:17:38.299546 containerd[1785]: 2025-12-16 13:17:38.178 [INFO][4729] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" Namespace="calico-system" Pod="csi-node-driver-4vgpv" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-eth0" Dec 16 13:17:38.299546 containerd[1785]: 2025-12-16 13:17:38.233 [INFO][4746] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" HandleID="k8s-pod-network.c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" Workload="ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-eth0" Dec 16 13:17:38.299825 containerd[1785]: 2025-12-16 13:17:38.234 [INFO][4746] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" HandleID="k8s-pod-network.c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" Workload="ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000119860), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-0-839c7337fa", "pod":"csi-node-driver-4vgpv", "timestamp":"2025-12-16 13:17:38.233804987 +0000 UTC"}, Hostname:"ci-4459-2-2-0-839c7337fa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:17:38.299825 containerd[1785]: 2025-12-16 13:17:38.234 [INFO][4746] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:17:38.299825 containerd[1785]: 2025-12-16 13:17:38.234 [INFO][4746] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:17:38.299825 containerd[1785]: 2025-12-16 13:17:38.234 [INFO][4746] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-839c7337fa' Dec 16 13:17:38.299825 containerd[1785]: 2025-12-16 13:17:38.246 [INFO][4746] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:38.299825 containerd[1785]: 2025-12-16 13:17:38.252 [INFO][4746] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:38.299825 containerd[1785]: 2025-12-16 13:17:38.261 [INFO][4746] ipam/ipam.go 511: Trying affinity for 192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:38.299825 containerd[1785]: 2025-12-16 13:17:38.264 [INFO][4746] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:38.299825 containerd[1785]: 2025-12-16 13:17:38.266 [INFO][4746] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:38.300147 containerd[1785]: 2025-12-16 13:17:38.266 [INFO][4746] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:38.300147 containerd[1785]: 2025-12-16 13:17:38.268 [INFO][4746] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0 Dec 16 13:17:38.300147 containerd[1785]: 2025-12-16 13:17:38.274 [INFO][4746] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:38.300147 containerd[1785]: 2025-12-16 13:17:38.281 [INFO][4746] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.96.131/26] block=192.168.96.128/26 handle="k8s-pod-network.c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:38.300147 containerd[1785]: 2025-12-16 13:17:38.281 [INFO][4746] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.131/26] handle="k8s-pod-network.c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:38.300147 containerd[1785]: 2025-12-16 13:17:38.281 [INFO][4746] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:17:38.300147 containerd[1785]: 2025-12-16 13:17:38.281 [INFO][4746] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.131/26] IPv6=[] ContainerID="c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" HandleID="k8s-pod-network.c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" Workload="ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-eth0" Dec 16 13:17:38.300320 containerd[1785]: 2025-12-16 13:17:38.283 [INFO][4729] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" Namespace="calico-system" Pod="csi-node-driver-4vgpv" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2c2ad34d-d82b-4624-87cd-76ece6a8970b", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"", Pod:"csi-node-driver-4vgpv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.96.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1d078a501ae", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:38.300394 containerd[1785]: 2025-12-16 13:17:38.283 [INFO][4729] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.131/32] ContainerID="c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" Namespace="calico-system" Pod="csi-node-driver-4vgpv" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-eth0" Dec 16 13:17:38.300394 containerd[1785]: 2025-12-16 13:17:38.283 [INFO][4729] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1d078a501ae ContainerID="c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" Namespace="calico-system" Pod="csi-node-driver-4vgpv" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-eth0" Dec 16 13:17:38.300394 containerd[1785]: 2025-12-16 13:17:38.286 [INFO][4729] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" Namespace="calico-system" Pod="csi-node-driver-4vgpv" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-eth0" Dec 16 13:17:38.300496 containerd[1785]: 2025-12-16 13:17:38.287 [INFO][4729] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" Namespace="calico-system" Pod="csi-node-driver-4vgpv" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2c2ad34d-d82b-4624-87cd-76ece6a8970b", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0", Pod:"csi-node-driver-4vgpv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.96.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1d078a501ae", MAC:"3a:c2:ba:62:61:d9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:38.300655 containerd[1785]: 2025-12-16 13:17:38.298 [INFO][4729] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" Namespace="calico-system" Pod="csi-node-driver-4vgpv" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-csi--node--driver--4vgpv-eth0" Dec 16 13:17:38.327121 containerd[1785]: time="2025-12-16T13:17:38.327069043Z" level=info msg="connecting to shim c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0" address="unix:///run/containerd/s/c3a7767b4fa62281047071a753cfa9b335d24c67554d40729830c9ecf7c9ce23" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:17:38.356615 systemd[1]: Started cri-containerd-c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0.scope - libcontainer container c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0. 
Dec 16 13:17:38.387333 containerd[1785]: time="2025-12-16T13:17:38.387211704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-4vgpv,Uid:2c2ad34d-d82b-4624-87cd-76ece6a8970b,Namespace:calico-system,Attempt:0,} returns sandbox id \"c250d01e7772f0af88b316701aed8d53382078aa1148597fa43107ffacdbe4d0\"" Dec 16 13:17:38.388886 containerd[1785]: time="2025-12-16T13:17:38.388857740Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:17:38.736900 containerd[1785]: time="2025-12-16T13:17:38.736611614Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:38.738509 containerd[1785]: time="2025-12-16T13:17:38.738403819Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:17:38.738685 containerd[1785]: time="2025-12-16T13:17:38.738534813Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:17:38.738823 kubelet[3038]: E1216 13:17:38.738766 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:17:38.738927 kubelet[3038]: E1216 13:17:38.738837 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:17:38.739092 kubelet[3038]: E1216 13:17:38.739022 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zgzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-4vgpv_calico-system(2c2ad34d-d82b-4624-87cd-76ece6a8970b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:38.741304 containerd[1785]: time="2025-12-16T13:17:38.741267133Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:17:39.068126 containerd[1785]: time="2025-12-16T13:17:39.068055606Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:39.069942 containerd[1785]: time="2025-12-16T13:17:39.069859684Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:17:39.069942 containerd[1785]: time="2025-12-16T13:17:39.069901937Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:17:39.070234 kubelet[3038]: E1216 13:17:39.070166 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:17:39.070895 kubelet[3038]: E1216 13:17:39.070246 3038 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:17:39.070895 kubelet[3038]: E1216 13:17:39.070407 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zgzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-4vgpv_calico-system(2c2ad34d-d82b-4624-87cd-76ece6a8970b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:39.071755 kubelet[3038]: E1216 13:17:39.071689 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: 
not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:17:39.229823 systemd-networkd[1596]: cali1d48a72481e: Gained IPv6LL Dec 16 13:17:39.251374 kubelet[3038]: E1216 13:17:39.251308 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:17:39.252667 kubelet[3038]: E1216 13:17:39.252600 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:17:40.130030 containerd[1785]: time="2025-12-16T13:17:40.129925712Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d8cb67fb9-kbt5r,Uid:14065d61-1c00-4d68-849b-4eefd74615de,Namespace:calico-system,Attempt:0,}" Dec 16 13:17:40.130030 containerd[1785]: time="2025-12-16T13:17:40.130019883Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76768d65dd-ncz5v,Uid:fe36e307-b3ce-4532-b1d5-65732a7edfab,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:17:40.130692 containerd[1785]: time="2025-12-16T13:17:40.130412350Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-nz8db,Uid:4fb5337e-3d19-4a26-9e2e-58b08d0a6154,Namespace:calico-system,Attempt:0,}" Dec 16 13:17:40.253182 kubelet[3038]: E1216 13:17:40.252659 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:17:40.254622 systemd-networkd[1596]: cali1d078a501ae: Gained IPv6LL Dec 16 13:17:40.267015 systemd-networkd[1596]: calic306f8c29b1: Link UP Dec 16 13:17:40.267197 systemd-networkd[1596]: calic306f8c29b1: Gained carrier Dec 16 13:17:40.279414 containerd[1785]: 2025-12-16 13:17:40.184 [INFO][4817] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-eth0 calico-kube-controllers-5d8cb67fb9- calico-system 14065d61-1c00-4d68-849b-4eefd74615de 793 0 2025-12-16 13:17:17 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5d8cb67fb9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-0-839c7337fa calico-kube-controllers-5d8cb67fb9-kbt5r eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic306f8c29b1 [] [] }} ContainerID="19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" Namespace="calico-system" Pod="calico-kube-controllers-5d8cb67fb9-kbt5r" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-" Dec 16 13:17:40.279414 containerd[1785]: 2025-12-16 13:17:40.184 [INFO][4817] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" Namespace="calico-system" Pod="calico-kube-controllers-5d8cb67fb9-kbt5r" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-eth0" Dec 16 13:17:40.279414 containerd[1785]: 2025-12-16 13:17:40.219 [INFO][4871] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" HandleID="k8s-pod-network.19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" Workload="ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-eth0" Dec 16 13:17:40.279730 containerd[1785]: 2025-12-16 13:17:40.219 [INFO][4871] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" HandleID="k8s-pod-network.19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" Workload="ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4fd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-0-839c7337fa", "pod":"calico-kube-controllers-5d8cb67fb9-kbt5r", "timestamp":"2025-12-16 13:17:40.21976784 +0000 UTC"}, Hostname:"ci-4459-2-2-0-839c7337fa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:17:40.279730 containerd[1785]: 2025-12-16 13:17:40.220 [INFO][4871] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:17:40.279730 containerd[1785]: 2025-12-16 13:17:40.220 [INFO][4871] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
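Between 13:17:39.25 and 13:17:40.25 the kubelet's pod workers switch the failing containers from ErrImagePull to ImagePullBackOff: the registry answered 404, so further pulls are retried on a doubling delay rather than hammered immediately. A rough sketch of that retry shape; the 10 s initial delay and 5 min cap mirror commonly cited kubelet defaults but should be treated as assumptions here, and pullImage stands in for the CRI call that is failing in the log.

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

var errNotFound = errors.New("pull ghcr.io/flatcar/calico/csi:v3.30.4: not found")

// pullImage stands in for the CRI PullImage call failing in the log.
func pullImage() error { return errNotFound }

func main() {
	// Doubling delay with a cap: the shape of kubelet's image pull
	// backoff. The exact 10s/5min values are an assumption.
	delay, maxDelay := 10*time.Second, 5*time.Minute
	for attempt := 1; attempt <= 5; attempt++ {
		err := pullImage()
		if err == nil {
			return
		}
		fmt.Printf("attempt %d: ErrImagePull: %v; next try in %s (ImagePullBackOff)\n",
			attempt, err, delay)
		time.Sleep(delay) // kubelet requeues the pod instead of sleeping
		if delay *= 2; delay > maxDelay {
			delay = maxDelay
		}
	}
}
```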
Dec 16 13:17:40.279730 containerd[1785]: 2025-12-16 13:17:40.220 [INFO][4871] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-839c7337fa' Dec 16 13:17:40.279730 containerd[1785]: 2025-12-16 13:17:40.227 [INFO][4871] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.279730 containerd[1785]: 2025-12-16 13:17:40.238 [INFO][4871] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.279730 containerd[1785]: 2025-12-16 13:17:40.242 [INFO][4871] ipam/ipam.go 511: Trying affinity for 192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.279730 containerd[1785]: 2025-12-16 13:17:40.244 [INFO][4871] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.279730 containerd[1785]: 2025-12-16 13:17:40.246 [INFO][4871] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.280136 containerd[1785]: 2025-12-16 13:17:40.246 [INFO][4871] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.280136 containerd[1785]: 2025-12-16 13:17:40.248 [INFO][4871] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2 Dec 16 13:17:40.280136 containerd[1785]: 2025-12-16 13:17:40.254 [INFO][4871] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.280136 containerd[1785]: 2025-12-16 13:17:40.261 [INFO][4871] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.96.132/26] block=192.168.96.128/26 handle="k8s-pod-network.19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.280136 containerd[1785]: 2025-12-16 13:17:40.261 [INFO][4871] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.132/26] handle="k8s-pod-network.19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.280136 containerd[1785]: 2025-12-16 13:17:40.262 [INFO][4871] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
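calico-kube-controllers receives 192.168.96.132/26 from the same affinity block that served .131 above and will serve .133 and .134 below. A quick net/netip check that a claimed address really falls inside the block named by the handle; the prefix and address are copied from the log.

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.96.128/26")
	ip := netip.MustParseAddr("192.168.96.132")
	// Contains reports whether the affinity block covers the address.
	fmt.Println(block.Contains(ip)) // true; .132 is ordinal 4 of the block
}
```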
Dec 16 13:17:40.280136 containerd[1785]: 2025-12-16 13:17:40.262 [INFO][4871] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.132/26] IPv6=[] ContainerID="19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" HandleID="k8s-pod-network.19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" Workload="ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-eth0" Dec 16 13:17:40.280459 containerd[1785]: 2025-12-16 13:17:40.265 [INFO][4817] cni-plugin/k8s.go 418: Populated endpoint ContainerID="19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" Namespace="calico-system" Pod="calico-kube-controllers-5d8cb67fb9-kbt5r" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-eth0", GenerateName:"calico-kube-controllers-5d8cb67fb9-", Namespace:"calico-system", SelfLink:"", UID:"14065d61-1c00-4d68-849b-4eefd74615de", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5d8cb67fb9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"", Pod:"calico-kube-controllers-5d8cb67fb9-kbt5r", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.96.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic306f8c29b1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:40.280536 containerd[1785]: 2025-12-16 13:17:40.265 [INFO][4817] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.132/32] ContainerID="19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" Namespace="calico-system" Pod="calico-kube-controllers-5d8cb67fb9-kbt5r" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-eth0" Dec 16 13:17:40.280536 containerd[1785]: 2025-12-16 13:17:40.265 [INFO][4817] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic306f8c29b1 ContainerID="19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" Namespace="calico-system" Pod="calico-kube-controllers-5d8cb67fb9-kbt5r" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-eth0" Dec 16 13:17:40.280536 containerd[1785]: 2025-12-16 13:17:40.267 [INFO][4817] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" Namespace="calico-system" Pod="calico-kube-controllers-5d8cb67fb9-kbt5r" 
WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-eth0" Dec 16 13:17:40.280609 containerd[1785]: 2025-12-16 13:17:40.269 [INFO][4817] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" Namespace="calico-system" Pod="calico-kube-controllers-5d8cb67fb9-kbt5r" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-eth0", GenerateName:"calico-kube-controllers-5d8cb67fb9-", Namespace:"calico-system", SelfLink:"", UID:"14065d61-1c00-4d68-849b-4eefd74615de", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5d8cb67fb9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2", Pod:"calico-kube-controllers-5d8cb67fb9-kbt5r", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.96.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic306f8c29b1", MAC:"aa:3d:a0:b3:95:a2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:40.280668 containerd[1785]: 2025-12-16 13:17:40.278 [INFO][4817] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" Namespace="calico-system" Pod="calico-kube-controllers-5d8cb67fb9-kbt5r" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--kube--controllers--5d8cb67fb9--kbt5r-eth0" Dec 16 13:17:40.308023 containerd[1785]: time="2025-12-16T13:17:40.307969187Z" level=info msg="connecting to shim 19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2" address="unix:///run/containerd/s/d59de5406e8940b99cc7c838001637ca563c2ee96c4c893a3a58414c877f6492" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:17:40.345665 systemd[1]: Started cri-containerd-19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2.scope - libcontainer container 19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2. 
Dec 16 13:17:40.361152 systemd-networkd[1596]: cali77fa39b1a22: Link UP Dec 16 13:17:40.361880 systemd-networkd[1596]: cali77fa39b1a22: Gained carrier Dec 16 13:17:40.371851 containerd[1785]: 2025-12-16 13:17:40.193 [INFO][4837] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-eth0 goldmane-666569f655- calico-system 4fb5337e-3d19-4a26-9e2e-58b08d0a6154 792 0 2025-12-16 13:17:15 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-0-839c7337fa goldmane-666569f655-nz8db eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali77fa39b1a22 [] [] }} ContainerID="e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" Namespace="calico-system" Pod="goldmane-666569f655-nz8db" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-" Dec 16 13:17:40.371851 containerd[1785]: 2025-12-16 13:17:40.193 [INFO][4837] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" Namespace="calico-system" Pod="goldmane-666569f655-nz8db" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-eth0" Dec 16 13:17:40.371851 containerd[1785]: 2025-12-16 13:17:40.232 [INFO][4878] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" HandleID="k8s-pod-network.e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" Workload="ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-eth0" Dec 16 13:17:40.372076 containerd[1785]: 2025-12-16 13:17:40.232 [INFO][4878] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" HandleID="k8s-pod-network.e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" Workload="ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002df6d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-0-839c7337fa", "pod":"goldmane-666569f655-nz8db", "timestamp":"2025-12-16 13:17:40.232498981 +0000 UTC"}, Hostname:"ci-4459-2-2-0-839c7337fa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:17:40.372076 containerd[1785]: 2025-12-16 13:17:40.232 [INFO][4878] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:17:40.372076 containerd[1785]: 2025-12-16 13:17:40.261 [INFO][4878] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
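Three CNI ADDs are racing at this point, and the ipam_plugin.go 377/392 pairs show them serializing on the host-wide IPAM lock: goldmane's request [4878] announces the lock at 13:17:40.232 but only acquires it at .261, once the kube-controllers transaction finishes. A toy model of why that lock exists, with a mutex guarding shared block state so concurrent claims cannot hand out the same ordinal; this is an analogy only, the plugin's actual lock implementation is not shown in the log.

```go
package main

import (
	"fmt"
	"sync"
)

// allocator models the serialization the lock provides: one mutex guards
// the shared block state so concurrent CNI ADDs claim distinct ordinals.
type allocator struct {
	mu   sync.Mutex
	next int // next free ordinal in the /26 block
}

func (a *allocator) claim() int {
	a.mu.Lock()         // "Acquired host-wide IPAM lock."
	defer a.mu.Unlock() // "Released host-wide IPAM lock."
	ord := a.next
	a.next++
	return ord
}

func main() {
	a := &allocator{next: 4} // ordinal 4 = .132, the next free address here
	var wg sync.WaitGroup
	for _, pod := range []string{"calico-kube-controllers", "goldmane", "calico-apiserver"} {
		wg.Add(1)
		go func(p string) {
			defer wg.Done()
			fmt.Printf("%s -> 192.168.96.%d\n", p, 128+a.claim())
		}(pod)
	}
	wg.Wait()
}
```

Each goroutine gets a distinct ordinal (132, 133, 134 in some order), just as the three pods here end up with distinct addresses despite overlapping requests.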
Dec 16 13:17:40.372076 containerd[1785]: 2025-12-16 13:17:40.262 [INFO][4878] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-839c7337fa' Dec 16 13:17:40.372076 containerd[1785]: 2025-12-16 13:17:40.327 [INFO][4878] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.372076 containerd[1785]: 2025-12-16 13:17:40.332 [INFO][4878] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.372076 containerd[1785]: 2025-12-16 13:17:40.343 [INFO][4878] ipam/ipam.go 511: Trying affinity for 192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.372076 containerd[1785]: 2025-12-16 13:17:40.344 [INFO][4878] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.372076 containerd[1785]: 2025-12-16 13:17:40.347 [INFO][4878] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.372265 containerd[1785]: 2025-12-16 13:17:40.347 [INFO][4878] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.372265 containerd[1785]: 2025-12-16 13:17:40.348 [INFO][4878] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3 Dec 16 13:17:40.372265 containerd[1785]: 2025-12-16 13:17:40.351 [INFO][4878] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.372265 containerd[1785]: 2025-12-16 13:17:40.357 [INFO][4878] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.96.133/26] block=192.168.96.128/26 handle="k8s-pod-network.e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.372265 containerd[1785]: 2025-12-16 13:17:40.357 [INFO][4878] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.133/26] handle="k8s-pod-network.e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.372265 containerd[1785]: 2025-12-16 13:17:40.357 [INFO][4878] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:17:40.372265 containerd[1785]: 2025-12-16 13:17:40.357 [INFO][4878] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.133/26] IPv6=[] ContainerID="e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" HandleID="k8s-pod-network.e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" Workload="ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-eth0" Dec 16 13:17:40.372396 containerd[1785]: 2025-12-16 13:17:40.359 [INFO][4837] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" Namespace="calico-system" Pod="goldmane-666569f655-nz8db" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"4fb5337e-3d19-4a26-9e2e-58b08d0a6154", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"", Pod:"goldmane-666569f655-nz8db", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.96.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali77fa39b1a22", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:40.372468 containerd[1785]: 2025-12-16 13:17:40.359 [INFO][4837] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.133/32] ContainerID="e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" Namespace="calico-system" Pod="goldmane-666569f655-nz8db" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-eth0" Dec 16 13:17:40.372468 containerd[1785]: 2025-12-16 13:17:40.359 [INFO][4837] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali77fa39b1a22 ContainerID="e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" Namespace="calico-system" Pod="goldmane-666569f655-nz8db" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-eth0" Dec 16 13:17:40.372468 containerd[1785]: 2025-12-16 13:17:40.361 [INFO][4837] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" Namespace="calico-system" Pod="goldmane-666569f655-nz8db" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-eth0" Dec 16 13:17:40.372532 containerd[1785]: 2025-12-16 13:17:40.362 [INFO][4837] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" 
Namespace="calico-system" Pod="goldmane-666569f655-nz8db" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"4fb5337e-3d19-4a26-9e2e-58b08d0a6154", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3", Pod:"goldmane-666569f655-nz8db", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.96.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali77fa39b1a22", MAC:"2a:f2:6e:43:df:22", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:40.372585 containerd[1785]: 2025-12-16 13:17:40.370 [INFO][4837] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" Namespace="calico-system" Pod="goldmane-666569f655-nz8db" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-goldmane--666569f655--nz8db-eth0" Dec 16 13:17:40.393716 containerd[1785]: time="2025-12-16T13:17:40.393589530Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d8cb67fb9-kbt5r,Uid:14065d61-1c00-4d68-849b-4eefd74615de,Namespace:calico-system,Attempt:0,} returns sandbox id \"19fcd816384a4917a91fdea4028122ee12da9de9b79e800515f6a8d36eac32b2\"" Dec 16 13:17:40.396473 containerd[1785]: time="2025-12-16T13:17:40.395308101Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:17:40.406982 containerd[1785]: time="2025-12-16T13:17:40.406937135Z" level=info msg="connecting to shim e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3" address="unix:///run/containerd/s/e7bfa3f8390b3398968a6d4f01e411aaf0a18d3e534b7fbecb1e0509eb549bd4" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:17:40.433610 systemd[1]: Started cri-containerd-e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3.scope - libcontainer container e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3. 
Dec 16 13:17:40.463655 systemd-networkd[1596]: cali899f7f82330: Link UP Dec 16 13:17:40.464198 systemd-networkd[1596]: cali899f7f82330: Gained carrier Dec 16 13:17:40.478199 containerd[1785]: 2025-12-16 13:17:40.216 [INFO][4819] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-eth0 calico-apiserver-76768d65dd- calico-apiserver fe36e307-b3ce-4532-b1d5-65732a7edfab 791 0 2025-12-16 13:17:13 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:76768d65dd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-0-839c7337fa calico-apiserver-76768d65dd-ncz5v eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali899f7f82330 [] [] }} ContainerID="e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-ncz5v" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-" Dec 16 13:17:40.478199 containerd[1785]: 2025-12-16 13:17:40.216 [INFO][4819] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-ncz5v" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-eth0" Dec 16 13:17:40.478199 containerd[1785]: 2025-12-16 13:17:40.253 [INFO][4890] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" HandleID="k8s-pod-network.e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" Workload="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-eth0" Dec 16 13:17:40.478463 containerd[1785]: 2025-12-16 13:17:40.253 [INFO][4890] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" HandleID="k8s-pod-network.e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" Workload="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003435a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-0-839c7337fa", "pod":"calico-apiserver-76768d65dd-ncz5v", "timestamp":"2025-12-16 13:17:40.253558866 +0000 UTC"}, Hostname:"ci-4459-2-2-0-839c7337fa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:17:40.478463 containerd[1785]: 2025-12-16 13:17:40.253 [INFO][4890] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:17:40.478463 containerd[1785]: 2025-12-16 13:17:40.357 [INFO][4890] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:17:40.478463 containerd[1785]: 2025-12-16 13:17:40.357 [INFO][4890] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-839c7337fa' Dec 16 13:17:40.478463 containerd[1785]: 2025-12-16 13:17:40.428 [INFO][4890] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.478463 containerd[1785]: 2025-12-16 13:17:40.433 [INFO][4890] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.478463 containerd[1785]: 2025-12-16 13:17:40.443 [INFO][4890] ipam/ipam.go 511: Trying affinity for 192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.478463 containerd[1785]: 2025-12-16 13:17:40.445 [INFO][4890] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.478463 containerd[1785]: 2025-12-16 13:17:40.447 [INFO][4890] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.478713 containerd[1785]: 2025-12-16 13:17:40.447 [INFO][4890] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.478713 containerd[1785]: 2025-12-16 13:17:40.449 [INFO][4890] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68 Dec 16 13:17:40.478713 containerd[1785]: 2025-12-16 13:17:40.452 [INFO][4890] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.478713 containerd[1785]: 2025-12-16 13:17:40.459 [INFO][4890] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.96.134/26] block=192.168.96.128/26 handle="k8s-pod-network.e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.478713 containerd[1785]: 2025-12-16 13:17:40.459 [INFO][4890] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.134/26] handle="k8s-pod-network.e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:40.478713 containerd[1785]: 2025-12-16 13:17:40.460 [INFO][4890] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
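The calico-apiserver request [4890] had the longest wait of the three: it announced the lock at 13:17:40.253 and acquired it at 13:17:40.357, roughly 104 ms queued behind the two transactions above. The arithmetic, with both timestamps copied from the records:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.000"
	// "About to acquire" and "Acquired" timestamps for request [4890],
	// copied from the records above.
	queued, _ := time.Parse(layout, "2025-12-16 13:17:40.253")
	acquired, _ := time.Parse(layout, "2025-12-16 13:17:40.357")
	fmt.Println("lock wait:", acquired.Sub(queued)) // 104ms
}
```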
Dec 16 13:17:40.478713 containerd[1785]: 2025-12-16 13:17:40.460 [INFO][4890] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.134/26] IPv6=[] ContainerID="e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" HandleID="k8s-pod-network.e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" Workload="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-eth0" Dec 16 13:17:40.478849 containerd[1785]: 2025-12-16 13:17:40.461 [INFO][4819] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-ncz5v" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-eth0", GenerateName:"calico-apiserver-76768d65dd-", Namespace:"calico-apiserver", SelfLink:"", UID:"fe36e307-b3ce-4532-b1d5-65732a7edfab", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76768d65dd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"", Pod:"calico-apiserver-76768d65dd-ncz5v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali899f7f82330", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:40.478902 containerd[1785]: 2025-12-16 13:17:40.462 [INFO][4819] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.134/32] ContainerID="e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-ncz5v" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-eth0" Dec 16 13:17:40.478902 containerd[1785]: 2025-12-16 13:17:40.462 [INFO][4819] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali899f7f82330 ContainerID="e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-ncz5v" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-eth0" Dec 16 13:17:40.478902 containerd[1785]: 2025-12-16 13:17:40.464 [INFO][4819] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-ncz5v" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-eth0" Dec 16 13:17:40.478960 containerd[1785]: 2025-12-16 
13:17:40.464 [INFO][4819] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-ncz5v" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-eth0", GenerateName:"calico-apiserver-76768d65dd-", Namespace:"calico-apiserver", SelfLink:"", UID:"fe36e307-b3ce-4532-b1d5-65732a7edfab", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76768d65dd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68", Pod:"calico-apiserver-76768d65dd-ncz5v", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali899f7f82330", MAC:"e6:78:89:06:83:cc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:40.479012 containerd[1785]: 2025-12-16 13:17:40.476 [INFO][4819] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" Namespace="calico-apiserver" Pod="calico-apiserver-76768d65dd-ncz5v" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-calico--apiserver--76768d65dd--ncz5v-eth0" Dec 16 13:17:40.486317 containerd[1785]: time="2025-12-16T13:17:40.486273624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-nz8db,Uid:4fb5337e-3d19-4a26-9e2e-58b08d0a6154,Namespace:calico-system,Attempt:0,} returns sandbox id \"e964305a7348078ba73f959cba8c1e7eb5329eeb321840b3250ccd06e8f946c3\"" Dec 16 13:17:40.509841 containerd[1785]: time="2025-12-16T13:17:40.509797575Z" level=info msg="connecting to shim e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68" address="unix:///run/containerd/s/d5f22c41a4036504184725825f1726b2518811b7a2efb5af597849bf19185276" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:17:40.537620 systemd[1]: Started cri-containerd-e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68.scope - libcontainer container e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68. 
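That completes the fourth identical cycle on this node: CNI ADD, IPAM claim, veth and MAC assignment, endpoint written to the datastore, shim dialed, and a cri-containerd-<id>.scope started. When auditing a boot like this one, the ipam_plugin.go 299 records are the quickest way to recover the pod-to-address map; here is a small scanner for exactly that phrasing (the regexp is written against this log's format and would need adjusting for others).

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// assignedRE matches the ipam_plugin.go 299 records above, capturing the
// claimed IPv4 list and the Workload field that names the endpoint.
var assignedRE = regexp.MustCompile(
	`Calico CNI IPAM assigned addresses IPv4=\[([^\]]+)\].*?Workload="([^"]+)"`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal records are long
	for sc.Scan() {
		if m := assignedRE.FindStringSubmatch(sc.Text()); m != nil {
			fmt.Printf("%s -> %s\n", m[2], m[1])
		}
	}
}
```

Piping this journal through it prints one workload-to-address line per sandbox, e.g. the csi-node-driver endpoint mapping to 192.168.96.131/26.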
Dec 16 13:17:40.587181 containerd[1785]: time="2025-12-16T13:17:40.587122849Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76768d65dd-ncz5v,Uid:fe36e307-b3ce-4532-b1d5-65732a7edfab,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e34a4047a53bcf7f19d0a29dc1afd020c79077cf9ef1ab57f38a099bb9704a68\"" Dec 16 13:17:40.745278 containerd[1785]: time="2025-12-16T13:17:40.745178060Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:40.747245 containerd[1785]: time="2025-12-16T13:17:40.747138701Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:17:40.747245 containerd[1785]: time="2025-12-16T13:17:40.747181241Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:17:40.747664 kubelet[3038]: E1216 13:17:40.747580 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:17:40.747761 kubelet[3038]: E1216 13:17:40.747686 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:17:40.748275 containerd[1785]: time="2025-12-16T13:17:40.748191825Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:17:40.748560 kubelet[3038]: E1216 13:17:40.748318 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mkfg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d8cb67fb9-kbt5r_calico-system(14065d61-1c00-4d68-849b-4eefd74615de): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:40.749864 kubelet[3038]: E1216 13:17:40.749810 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:17:41.095341 containerd[1785]: time="2025-12-16T13:17:41.095169802Z" level=info msg="fetch failed 
after status: 404 Not Found" host=ghcr.io Dec 16 13:17:41.097589 containerd[1785]: time="2025-12-16T13:17:41.097547622Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:17:41.097701 containerd[1785]: time="2025-12-16T13:17:41.097648509Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:17:41.097868 kubelet[3038]: E1216 13:17:41.097809 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:17:41.097938 kubelet[3038]: E1216 13:17:41.097874 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:17:41.098159 kubelet[3038]: E1216 13:17:41.098105 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x8hl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-nz8db_calico-system(4fb5337e-3d19-4a26-9e2e-58b08d0a6154): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:41.098588 containerd[1785]: time="2025-12-16T13:17:41.098558788Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:17:41.100040 kubelet[3038]: E1216 13:17:41.100009 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:17:41.129536 containerd[1785]: time="2025-12-16T13:17:41.129492482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-xhgqj,Uid:c4a23620-f532-415e-b3e7-243feb3b5c27,Namespace:kube-system,Attempt:0,}" Dec 16 13:17:41.226166 systemd-networkd[1596]: caliba718b7b1b3: Link UP Dec 16 13:17:41.226954 systemd-networkd[1596]: caliba718b7b1b3: Gained carrier Dec 16 13:17:41.238152 containerd[1785]: 2025-12-16 13:17:41.166 [INFO][5074] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-eth0 coredns-668d6bf9bc- kube-system c4a23620-f532-415e-b3e7-243feb3b5c27 784 0 2025-12-16 13:17:04 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-0-839c7337fa coredns-668d6bf9bc-xhgqj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliba718b7b1b3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" Namespace="kube-system" Pod="coredns-668d6bf9bc-xhgqj" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-" Dec 16 13:17:41.238152 containerd[1785]: 2025-12-16 13:17:41.167 [INFO][5074] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" Namespace="kube-system" Pod="coredns-668d6bf9bc-xhgqj" 
WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-eth0" Dec 16 13:17:41.238152 containerd[1785]: 2025-12-16 13:17:41.192 [INFO][5098] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" HandleID="k8s-pod-network.b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" Workload="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-eth0" Dec 16 13:17:41.240137 containerd[1785]: 2025-12-16 13:17:41.192 [INFO][5098] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" HandleID="k8s-pod-network.b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" Workload="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ba520), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-0-839c7337fa", "pod":"coredns-668d6bf9bc-xhgqj", "timestamp":"2025-12-16 13:17:41.192459849 +0000 UTC"}, Hostname:"ci-4459-2-2-0-839c7337fa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:17:41.240137 containerd[1785]: 2025-12-16 13:17:41.192 [INFO][5098] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:17:41.240137 containerd[1785]: 2025-12-16 13:17:41.192 [INFO][5098] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 13:17:41.240137 containerd[1785]: 2025-12-16 13:17:41.192 [INFO][5098] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-839c7337fa' Dec 16 13:17:41.240137 containerd[1785]: 2025-12-16 13:17:41.198 [INFO][5098] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:41.240137 containerd[1785]: 2025-12-16 13:17:41.202 [INFO][5098] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:41.240137 containerd[1785]: 2025-12-16 13:17:41.206 [INFO][5098] ipam/ipam.go 511: Trying affinity for 192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:41.240137 containerd[1785]: 2025-12-16 13:17:41.207 [INFO][5098] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:41.240137 containerd[1785]: 2025-12-16 13:17:41.209 [INFO][5098] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:41.240319 containerd[1785]: 2025-12-16 13:17:41.209 [INFO][5098] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:41.240319 containerd[1785]: 2025-12-16 13:17:41.210 [INFO][5098] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d Dec 16 13:17:41.240319 containerd[1785]: 2025-12-16 13:17:41.214 [INFO][5098] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:41.240319 containerd[1785]: 
2025-12-16 13:17:41.221 [INFO][5098] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.96.135/26] block=192.168.96.128/26 handle="k8s-pod-network.b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:41.240319 containerd[1785]: 2025-12-16 13:17:41.221 [INFO][5098] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.135/26] handle="k8s-pod-network.b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:41.240319 containerd[1785]: 2025-12-16 13:17:41.222 [INFO][5098] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 13:17:41.240319 containerd[1785]: 2025-12-16 13:17:41.222 [INFO][5098] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.135/26] IPv6=[] ContainerID="b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" HandleID="k8s-pod-network.b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" Workload="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-eth0" Dec 16 13:17:41.240486 containerd[1785]: 2025-12-16 13:17:41.223 [INFO][5074] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" Namespace="kube-system" Pod="coredns-668d6bf9bc-xhgqj" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"c4a23620-f532-415e-b3e7-243feb3b5c27", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"", Pod:"coredns-668d6bf9bc-xhgqj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliba718b7b1b3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:41.240486 containerd[1785]: 2025-12-16 13:17:41.223 [INFO][5074] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.135/32] ContainerID="b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" Namespace="kube-system" Pod="coredns-668d6bf9bc-xhgqj" 
WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-eth0" Dec 16 13:17:41.240486 containerd[1785]: 2025-12-16 13:17:41.223 [INFO][5074] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliba718b7b1b3 ContainerID="b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" Namespace="kube-system" Pod="coredns-668d6bf9bc-xhgqj" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-eth0" Dec 16 13:17:41.240486 containerd[1785]: 2025-12-16 13:17:41.227 [INFO][5074] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" Namespace="kube-system" Pod="coredns-668d6bf9bc-xhgqj" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-eth0" Dec 16 13:17:41.240486 containerd[1785]: 2025-12-16 13:17:41.228 [INFO][5074] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" Namespace="kube-system" Pod="coredns-668d6bf9bc-xhgqj" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"c4a23620-f532-415e-b3e7-243feb3b5c27", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d", Pod:"coredns-668d6bf9bc-xhgqj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliba718b7b1b3", MAC:"da:63:85:42:45:32", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:41.240486 containerd[1785]: 2025-12-16 13:17:41.236 [INFO][5074] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" Namespace="kube-system" Pod="coredns-668d6bf9bc-xhgqj" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--xhgqj-eth0" Dec 16 13:17:41.254416 kubelet[3038]: E1216 13:17:41.254381 3038 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:17:41.256379 kubelet[3038]: E1216 13:17:41.256356 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:17:41.269808 containerd[1785]: time="2025-12-16T13:17:41.269770984Z" level=info msg="connecting to shim b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d" address="unix:///run/containerd/s/64b912684f43da9241237ca5586fb35d53b85237e86bb96756149f0408ad765d" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:17:41.294670 systemd[1]: Started cri-containerd-b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d.scope - libcontainer container b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d. Dec 16 13:17:41.337776 containerd[1785]: time="2025-12-16T13:17:41.337724922Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-xhgqj,Uid:c4a23620-f532-415e-b3e7-243feb3b5c27,Namespace:kube-system,Attempt:0,} returns sandbox id \"b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d\"" Dec 16 13:17:41.340050 containerd[1785]: time="2025-12-16T13:17:41.340021014Z" level=info msg="CreateContainer within sandbox \"b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 13:17:41.353947 containerd[1785]: time="2025-12-16T13:17:41.353830814Z" level=info msg="Container f9a0dd756126e2c2ce43dfe1cf268b989dedb0c749d676fcd4afdf80382ff873: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:17:41.365116 containerd[1785]: time="2025-12-16T13:17:41.365075153Z" level=info msg="CreateContainer within sandbox \"b1306d7a491acada76dd10c754c0d5154d6991781569e291132cba545029600d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f9a0dd756126e2c2ce43dfe1cf268b989dedb0c749d676fcd4afdf80382ff873\"" Dec 16 13:17:41.365555 containerd[1785]: time="2025-12-16T13:17:41.365532066Z" level=info msg="StartContainer for \"f9a0dd756126e2c2ce43dfe1cf268b989dedb0c749d676fcd4afdf80382ff873\"" Dec 16 13:17:41.366257 containerd[1785]: time="2025-12-16T13:17:41.366233593Z" level=info msg="connecting to shim f9a0dd756126e2c2ce43dfe1cf268b989dedb0c749d676fcd4afdf80382ff873" address="unix:///run/containerd/s/64b912684f43da9241237ca5586fb35d53b85237e86bb96756149f0408ad765d" protocol=ttrpc version=3 Dec 16 13:17:41.390675 systemd[1]: Started cri-containerd-f9a0dd756126e2c2ce43dfe1cf268b989dedb0c749d676fcd4afdf80382ff873.scope - libcontainer container 
f9a0dd756126e2c2ce43dfe1cf268b989dedb0c749d676fcd4afdf80382ff873. Dec 16 13:17:41.406611 systemd-networkd[1596]: cali77fa39b1a22: Gained IPv6LL Dec 16 13:17:41.420940 containerd[1785]: time="2025-12-16T13:17:41.420896385Z" level=info msg="StartContainer for \"f9a0dd756126e2c2ce43dfe1cf268b989dedb0c749d676fcd4afdf80382ff873\" returns successfully" Dec 16 13:17:41.436695 containerd[1785]: time="2025-12-16T13:17:41.436651506Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:41.438492 containerd[1785]: time="2025-12-16T13:17:41.438460139Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:17:41.439113 containerd[1785]: time="2025-12-16T13:17:41.438538699Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:17:41.439161 kubelet[3038]: E1216 13:17:41.438650 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:17:41.439161 kubelet[3038]: E1216 13:17:41.438693 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:17:41.439161 kubelet[3038]: E1216 13:17:41.438812 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn2rm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76768d65dd-ncz5v_calico-apiserver(fe36e307-b3ce-4532-b1d5-65732a7edfab): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:41.440396 kubelet[3038]: E1216 13:17:41.440369 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:17:41.533660 systemd-networkd[1596]: calic306f8c29b1: Gained IPv6LL Dec 16 13:17:41.917669 systemd-networkd[1596]: cali899f7f82330: Gained IPv6LL Dec 16 13:17:42.130178 containerd[1785]: time="2025-12-16T13:17:42.130077487Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7jsz2,Uid:f4f96a7b-1e66-4e29-b3f0-890217e05473,Namespace:kube-system,Attempt:0,}" Dec 16 13:17:42.259601 kubelet[3038]: E1216 13:17:42.259552 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:17:42.260501 kubelet[3038]: E1216 13:17:42.260476 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:17:42.260618 kubelet[3038]: E1216 
13:17:42.260598 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:17:42.268083 systemd-networkd[1596]: cali0f7a267bc95: Link UP Dec 16 13:17:42.268229 systemd-networkd[1596]: cali0f7a267bc95: Gained carrier Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.202 [INFO][5202] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-eth0 coredns-668d6bf9bc- kube-system f4f96a7b-1e66-4e29-b3f0-890217e05473 788 0 2025-12-16 13:17:04 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-0-839c7337fa coredns-668d6bf9bc-7jsz2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0f7a267bc95 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" Namespace="kube-system" Pod="coredns-668d6bf9bc-7jsz2" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-" Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.202 [INFO][5202] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" Namespace="kube-system" Pod="coredns-668d6bf9bc-7jsz2" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-eth0" Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.226 [INFO][5221] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" HandleID="k8s-pod-network.421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" Workload="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-eth0" Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.226 [INFO][5221] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" HandleID="k8s-pod-network.421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" Workload="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5b20), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-0-839c7337fa", "pod":"coredns-668d6bf9bc-7jsz2", "timestamp":"2025-12-16 13:17:42.226685745 +0000 UTC"}, Hostname:"ci-4459-2-2-0-839c7337fa", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.226 [INFO][5221] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
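The repeated "Back-off pulling image" entries above are kubelet's image-pull backoff at work: each failed pull roughly doubles the wait before the next attempt, up to a cap, which is why the same ErrImagePull keeps resurfacing at growing intervals in this log. A minimal sketch of that schedule, assuming kubelet's commonly cited defaults (10 s initial delay, 2x factor, 5 min cap) rather than values taken from this log:

    # Toy reproduction of an exponential image-pull backoff schedule.
    # The constants are assumptions based on kubelet's usual defaults,
    # not values read out of this journal.
    def backoff_delays(initial=10.0, factor=2.0, cap=300.0, attempts=8):
        delay = initial
        for _ in range(attempts):
            yield delay                      # wait this long before retrying
            delay = min(delay * factor, cap) # double, but never exceed the cap

    print(list(backoff_delays()))
    # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0, 300.0]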
Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.227 [INFO][5221] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.227 [INFO][5221] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-0-839c7337fa' Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.235 [INFO][5221] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.242 [INFO][5221] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.246 [INFO][5221] ipam/ipam.go 511: Trying affinity for 192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.248 [INFO][5221] ipam/ipam.go 158: Attempting to load block cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.250 [INFO][5221] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.250 [INFO][5221] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.251 [INFO][5221] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980 Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.255 [INFO][5221] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.264 [INFO][5221] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.96.136/26] block=192.168.96.128/26 handle="k8s-pod-network.421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.264 [INFO][5221] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.96.136/26] handle="k8s-pod-network.421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" host="ci-4459-2-2-0-839c7337fa" Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.264 [INFO][5221] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
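The IPAM sequence above hands out addresses from this node's affine block 192.168.96.128/26: 192.168.96.135 went to coredns-668d6bf9bc-xhgqj earlier, and 192.168.96.136 is claimed here. A quick, illustrative check of the block arithmetic with Python's ipaddress module confirms both assignments fall inside the 64-address block:

    import ipaddress

    block = ipaddress.ip_network("192.168.96.128/26")
    print(block.num_addresses)   # 64
    print(block[0], block[-1])   # 192.168.96.128 192.168.96.191
    for ip in ("192.168.96.135", "192.168.96.136"):
        # membership test: both addresses lie inside the node's block
        print(ip, ipaddress.ip_address(ip) in block)   # True, True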
Dec 16 13:17:42.281557 containerd[1785]: 2025-12-16 13:17:42.264 [INFO][5221] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.96.136/26] IPv6=[] ContainerID="421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" HandleID="k8s-pod-network.421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" Workload="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-eth0" Dec 16 13:17:42.282271 containerd[1785]: 2025-12-16 13:17:42.266 [INFO][5202] cni-plugin/k8s.go 418: Populated endpoint ContainerID="421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" Namespace="kube-system" Pod="coredns-668d6bf9bc-7jsz2" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f4f96a7b-1e66-4e29-b3f0-890217e05473", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"", Pod:"coredns-668d6bf9bc-7jsz2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0f7a267bc95", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:42.282271 containerd[1785]: 2025-12-16 13:17:42.266 [INFO][5202] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.96.136/32] ContainerID="421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" Namespace="kube-system" Pod="coredns-668d6bf9bc-7jsz2" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-eth0" Dec 16 13:17:42.282271 containerd[1785]: 2025-12-16 13:17:42.266 [INFO][5202] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0f7a267bc95 ContainerID="421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" Namespace="kube-system" Pod="coredns-668d6bf9bc-7jsz2" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-eth0" Dec 16 13:17:42.282271 containerd[1785]: 2025-12-16 13:17:42.268 [INFO][5202] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-7jsz2" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-eth0" Dec 16 13:17:42.282271 containerd[1785]: 2025-12-16 13:17:42.268 [INFO][5202] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" Namespace="kube-system" Pod="coredns-668d6bf9bc-7jsz2" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f4f96a7b-1e66-4e29-b3f0-890217e05473", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 17, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-0-839c7337fa", ContainerID:"421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980", Pod:"coredns-668d6bf9bc-7jsz2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0f7a267bc95", MAC:"8a:79:60:16:46:3e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:17:42.282271 containerd[1785]: 2025-12-16 13:17:42.278 [INFO][5202] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" Namespace="kube-system" Pod="coredns-668d6bf9bc-7jsz2" WorkloadEndpoint="ci--4459--2--2--0--839c7337fa-k8s-coredns--668d6bf9bc--7jsz2-eth0" Dec 16 13:17:42.300586 kubelet[3038]: I1216 13:17:42.300522 3038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-xhgqj" podStartSLOduration=38.300504993 podStartE2EDuration="38.300504993s" podCreationTimestamp="2025-12-16 13:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:17:42.299843077 +0000 UTC m=+43.251530160" watchObservedRunningTime="2025-12-16 13:17:42.300504993 +0000 UTC m=+43.252192075" Dec 16 13:17:42.311839 containerd[1785]: time="2025-12-16T13:17:42.311774283Z" level=info msg="connecting to shim 421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980" 
address="unix:///run/containerd/s/f852c5201bf4446fb813eecfa08afd071038878ffb5c8b076bc72a3cc12968e4" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:17:42.336623 systemd[1]: Started cri-containerd-421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980.scope - libcontainer container 421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980. Dec 16 13:17:42.381393 containerd[1785]: time="2025-12-16T13:17:42.381339197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7jsz2,Uid:f4f96a7b-1e66-4e29-b3f0-890217e05473,Namespace:kube-system,Attempt:0,} returns sandbox id \"421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980\"" Dec 16 13:17:42.383823 containerd[1785]: time="2025-12-16T13:17:42.383568257Z" level=info msg="CreateContainer within sandbox \"421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 13:17:42.400340 containerd[1785]: time="2025-12-16T13:17:42.400281455Z" level=info msg="Container f5e60eb7c03724d3ae2b167fed4c9713f458302b651a20dd73108ec54a360d2f: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:17:42.408500 containerd[1785]: time="2025-12-16T13:17:42.408441094Z" level=info msg="CreateContainer within sandbox \"421cfb2ae6f49db3e5c97193a6a48b4ba17a076229ac197387c2766387548980\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f5e60eb7c03724d3ae2b167fed4c9713f458302b651a20dd73108ec54a360d2f\"" Dec 16 13:17:42.408980 containerd[1785]: time="2025-12-16T13:17:42.408956750Z" level=info msg="StartContainer for \"f5e60eb7c03724d3ae2b167fed4c9713f458302b651a20dd73108ec54a360d2f\"" Dec 16 13:17:42.409747 containerd[1785]: time="2025-12-16T13:17:42.409635525Z" level=info msg="connecting to shim f5e60eb7c03724d3ae2b167fed4c9713f458302b651a20dd73108ec54a360d2f" address="unix:///run/containerd/s/f852c5201bf4446fb813eecfa08afd071038878ffb5c8b076bc72a3cc12968e4" protocol=ttrpc version=3 Dec 16 13:17:42.431654 systemd[1]: Started cri-containerd-f5e60eb7c03724d3ae2b167fed4c9713f458302b651a20dd73108ec54a360d2f.scope - libcontainer container f5e60eb7c03724d3ae2b167fed4c9713f458302b651a20dd73108ec54a360d2f. 
Dec 16 13:17:42.457965 containerd[1785]: time="2025-12-16T13:17:42.457926495Z" level=info msg="StartContainer for \"f5e60eb7c03724d3ae2b167fed4c9713f458302b651a20dd73108ec54a360d2f\" returns successfully" Dec 16 13:17:42.814001 systemd-networkd[1596]: caliba718b7b1b3: Gained IPv6LL Dec 16 13:17:43.282371 kubelet[3038]: I1216 13:17:43.282287 3038 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-7jsz2" podStartSLOduration=39.282264167 podStartE2EDuration="39.282264167s" podCreationTimestamp="2025-12-16 13:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:17:43.28222643 +0000 UTC m=+44.233913533" watchObservedRunningTime="2025-12-16 13:17:43.282264167 +0000 UTC m=+44.233951313" Dec 16 13:17:43.389730 systemd-networkd[1596]: cali0f7a267bc95: Gained IPv6LL Dec 16 13:17:51.130697 containerd[1785]: time="2025-12-16T13:17:51.130432533Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:17:51.674407 containerd[1785]: time="2025-12-16T13:17:51.674283842Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:51.676113 containerd[1785]: time="2025-12-16T13:17:51.676075207Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:17:51.676189 containerd[1785]: time="2025-12-16T13:17:51.676158123Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:17:51.676396 kubelet[3038]: E1216 13:17:51.676346 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:17:51.676883 kubelet[3038]: E1216 13:17:51.676416 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:17:51.676883 kubelet[3038]: E1216 13:17:51.676781 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5k7p2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76768d65dd-q8jbl_calico-apiserver(8d366b0a-e187-4483-8bcf-46758c32eaee): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:51.677164 containerd[1785]: time="2025-12-16T13:17:51.677137618Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:17:51.678407 kubelet[3038]: E1216 13:17:51.678369 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:17:52.011745 containerd[1785]: time="2025-12-16T13:17:52.011646055Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:52.014180 containerd[1785]: time="2025-12-16T13:17:52.014097056Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:17:52.014492 containerd[1785]: time="2025-12-16T13:17:52.014212556Z" level=info msg="stop 
pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:17:52.014578 kubelet[3038]: E1216 13:17:52.014391 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:17:52.014578 kubelet[3038]: E1216 13:17:52.014500 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:17:52.014819 kubelet[3038]: E1216 13:17:52.014746 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zgzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-4vgpv_calico-system(2c2ad34d-d82b-4624-87cd-76ece6a8970b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:52.015059 containerd[1785]: time="2025-12-16T13:17:52.014861040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:17:52.339071 containerd[1785]: time="2025-12-16T13:17:52.338913738Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:52.341014 containerd[1785]: time="2025-12-16T13:17:52.340937314Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:17:52.341102 containerd[1785]: time="2025-12-16T13:17:52.341001362Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:17:52.341337 kubelet[3038]: E1216 13:17:52.341245 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:17:52.341337 kubelet[3038]: E1216 13:17:52.341327 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:17:52.342072 kubelet[3038]: E1216 13:17:52.341594 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:dda5a456290f47049c1dcb0ab4e8d120,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tkx8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f659d6578-trd8w_calico-system(cb2a00f3-8bc2-4b68-a631-62b5470c7b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:52.342186 containerd[1785]: time="2025-12-16T13:17:52.341740815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:17:52.686748 containerd[1785]: time="2025-12-16T13:17:52.686596803Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
13:17:52.690483 containerd[1785]: time="2025-12-16T13:17:52.690324640Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:17:52.690483 containerd[1785]: time="2025-12-16T13:17:52.690400087Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:17:52.690690 kubelet[3038]: E1216 13:17:52.690647 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:17:52.691120 kubelet[3038]: E1216 13:17:52.690706 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:17:52.691120 kubelet[3038]: E1216 13:17:52.691001 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zgzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-4vgpv_calico-system(2c2ad34d-d82b-4624-87cd-76ece6a8970b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:52.691319 containerd[1785]: time="2025-12-16T13:17:52.691247842Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:17:52.692418 kubelet[3038]: E1216 13:17:52.692346 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:17:53.024142 containerd[1785]: time="2025-12-16T13:17:53.024073493Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:53.025750 containerd[1785]: time="2025-12-16T13:17:53.025693351Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:17:53.025809 containerd[1785]: time="2025-12-16T13:17:53.025772944Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:17:53.025941 kubelet[3038]: E1216 13:17:53.025902 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:17:53.026011 kubelet[3038]: E1216 13:17:53.025952 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:17:53.026132 kubelet[3038]: E1216 13:17:53.026088 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkx8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f659d6578-trd8w_calico-system(cb2a00f3-8bc2-4b68-a631-62b5470c7b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:53.027410 kubelet[3038]: E1216 13:17:53.027338 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:17:53.130630 containerd[1785]: time="2025-12-16T13:17:53.130541800Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:17:53.479759 containerd[1785]: time="2025-12-16T13:17:53.479556072Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:53.482254 containerd[1785]: time="2025-12-16T13:17:53.482158452Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:17:53.482416 containerd[1785]: time="2025-12-16T13:17:53.482277598Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:17:53.482708 kubelet[3038]: E1216 13:17:53.482620 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:17:53.482788 kubelet[3038]: E1216 13:17:53.482724 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:17:53.483233 kubelet[3038]: E1216 13:17:53.483122 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mkfg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d8cb67fb9-kbt5r_calico-system(14065d61-1c00-4d68-849b-4eefd74615de): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:53.484545 kubelet[3038]: E1216 13:17:53.484435 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:17:54.130180 containerd[1785]: time="2025-12-16T13:17:54.130106542Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:17:54.496699 containerd[1785]: time="2025-12-16T13:17:54.496606795Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:54.498599 containerd[1785]: time="2025-12-16T13:17:54.498518519Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:17:54.498809 containerd[1785]: time="2025-12-16T13:17:54.498615221Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:17:54.498994 kubelet[3038]: E1216 13:17:54.498885 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:17:54.500613 kubelet[3038]: E1216 13:17:54.499005 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:17:54.500613 kubelet[3038]: 
E1216 13:17:54.499299 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn2rm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76768d65dd-ncz5v_calico-apiserver(fe36e307-b3ce-4532-b1d5-65732a7edfab): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:54.500990 kubelet[3038]: E1216 13:17:54.500617 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:17:57.131268 containerd[1785]: time="2025-12-16T13:17:57.131195620Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:17:57.474628 containerd[1785]: time="2025-12-16T13:17:57.474423544Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:17:57.476465 containerd[1785]: time="2025-12-16T13:17:57.476386897Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:17:57.476578 containerd[1785]: time="2025-12-16T13:17:57.476480556Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:17:57.476642 kubelet[3038]: E1216 13:17:57.476606 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:17:57.477078 kubelet[3038]: E1216 13:17:57.476652 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:17:57.477078 kubelet[3038]: E1216 13:17:57.476816 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x8hl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-nz8db_calico-system(4fb5337e-3d19-4a26-9e2e-58b08d0a6154): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:17:57.478303 kubelet[3038]: E1216 13:17:57.478262 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:18:03.131006 kubelet[3038]: E1216 13:18:03.130912 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:18:04.131171 kubelet[3038]: E1216 13:18:04.131074 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:18:06.129780 kubelet[3038]: E1216 13:18:06.129625 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:18:07.131883 kubelet[3038]: E1216 13:18:07.131806 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:18:08.130030 kubelet[3038]: E1216 13:18:08.129945 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:18:10.129937 kubelet[3038]: E1216 13:18:10.129829 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:18:14.131140 containerd[1785]: time="2025-12-16T13:18:14.131086914Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:18:14.483648 containerd[1785]: time="2025-12-16T13:18:14.483488342Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:18:14.485177 containerd[1785]: time="2025-12-16T13:18:14.485132352Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 
13:18:14.485232 containerd[1785]: time="2025-12-16T13:18:14.485194828Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:18:14.485416 kubelet[3038]: E1216 13:18:14.485368 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:18:14.485736 kubelet[3038]: E1216 13:18:14.485428 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:18:14.485736 kubelet[3038]: E1216 13:18:14.485580 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zgzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-4vgpv_calico-system(2c2ad34d-d82b-4624-87cd-76ece6a8970b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:18:14.487986 containerd[1785]: time="2025-12-16T13:18:14.487950235Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:18:14.840615 containerd[1785]: time="2025-12-16T13:18:14.840510157Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
13:18:14.842784 containerd[1785]: time="2025-12-16T13:18:14.842699550Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:18:14.842905 containerd[1785]: time="2025-12-16T13:18:14.842773295Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:18:14.843084 kubelet[3038]: E1216 13:18:14.843007 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:18:14.843169 kubelet[3038]: E1216 13:18:14.843093 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:18:14.843334 kubelet[3038]: E1216 13:18:14.843265 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zgzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-4vgpv_calico-system(2c2ad34d-d82b-4624-87cd-76ece6a8970b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:18:14.844918 kubelet[3038]: E1216 13:18:14.844811 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:18:19.130579 containerd[1785]: time="2025-12-16T13:18:19.130502715Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:18:19.475673 containerd[1785]: time="2025-12-16T13:18:19.475383609Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:18:19.478239 containerd[1785]: time="2025-12-16T13:18:19.478005607Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:18:19.478239 containerd[1785]: time="2025-12-16T13:18:19.478097834Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:18:19.478520 kubelet[3038]: E1216 13:18:19.478331 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:18:19.478520 kubelet[3038]: E1216 13:18:19.478401 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:18:19.479178 kubelet[3038]: E1216 13:18:19.478610 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:dda5a456290f47049c1dcb0ab4e8d120,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tkx8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f659d6578-trd8w_calico-system(cb2a00f3-8bc2-4b68-a631-62b5470c7b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:18:19.483015 containerd[1785]: time="2025-12-16T13:18:19.482597590Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:18:20.018845 containerd[1785]: time="2025-12-16T13:18:20.018787747Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:18:20.020604 containerd[1785]: time="2025-12-16T13:18:20.020557694Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:18:20.020656 containerd[1785]: time="2025-12-16T13:18:20.020596181Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:18:20.020825 kubelet[3038]: E1216 13:18:20.020775 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:18:20.020881 kubelet[3038]: E1216 13:18:20.020833 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:18:20.021001 kubelet[3038]: E1216 13:18:20.020958 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkx8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f659d6578-trd8w_calico-system(cb2a00f3-8bc2-4b68-a631-62b5470c7b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:18:20.022248 kubelet[3038]: E1216 13:18:20.022186 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:18:20.131641 containerd[1785]: time="2025-12-16T13:18:20.131562022Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:18:20.613158 containerd[1785]: time="2025-12-16T13:18:20.613068842Z" level=info msg="fetch failed after status: 404 Not Found" 
host=ghcr.io Dec 16 13:18:20.616528 containerd[1785]: time="2025-12-16T13:18:20.616402162Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:18:20.616698 containerd[1785]: time="2025-12-16T13:18:20.616460399Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:18:20.616787 kubelet[3038]: E1216 13:18:20.616733 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:18:20.617172 kubelet[3038]: E1216 13:18:20.616801 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:18:20.617172 kubelet[3038]: E1216 13:18:20.616960 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mkfg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d8cb67fb9-kbt5r_calico-system(14065d61-1c00-4d68-849b-4eefd74615de): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:18:20.618192 kubelet[3038]: E1216 13:18:20.618164 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:18:21.130900 containerd[1785]: time="2025-12-16T13:18:21.130692410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:18:21.472768 containerd[1785]: time="2025-12-16T13:18:21.472607409Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:18:21.474817 containerd[1785]: time="2025-12-16T13:18:21.474761600Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:18:21.474911 containerd[1785]: time="2025-12-16T13:18:21.474810336Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:18:21.475299 kubelet[3038]: E1216 13:18:21.474996 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:18:21.475299 kubelet[3038]: E1216 13:18:21.475058 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:18:21.475299 kubelet[3038]: 
E1216 13:18:21.475186 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5k7p2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76768d65dd-q8jbl_calico-apiserver(8d366b0a-e187-4483-8bcf-46758c32eaee): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:18:21.476417 kubelet[3038]: E1216 13:18:21.476376 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:18:22.129727 containerd[1785]: time="2025-12-16T13:18:22.129664568Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:18:22.497462 containerd[1785]: time="2025-12-16T13:18:22.497396151Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:18:22.501011 containerd[1785]: time="2025-12-16T13:18:22.500961135Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:18:22.501096 containerd[1785]: time="2025-12-16T13:18:22.501060068Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:18:22.501335 kubelet[3038]: E1216 13:18:22.501240 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:18:22.501702 kubelet[3038]: E1216 13:18:22.501341 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:18:22.501702 kubelet[3038]: E1216 13:18:22.501632 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn2rm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76768d65dd-ncz5v_calico-apiserver(fe36e307-b3ce-4532-b1d5-65732a7edfab): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:18:22.501965 containerd[1785]: time="2025-12-16T13:18:22.501898226Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:18:22.503018 kubelet[3038]: E1216 13:18:22.502963 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:18:22.860781 containerd[1785]: time="2025-12-16T13:18:22.860513073Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:18:22.862820 containerd[1785]: time="2025-12-16T13:18:22.862729418Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:18:22.862820 containerd[1785]: time="2025-12-16T13:18:22.862781044Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:18:22.863139 kubelet[3038]: E1216 13:18:22.863041 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:18:22.863139 kubelet[3038]: E1216 13:18:22.863111 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:18:22.863412 kubelet[3038]: E1216 13:18:22.863312 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x8hl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-nz8db_calico-system(4fb5337e-3d19-4a26-9e2e-58b08d0a6154): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:18:22.864683 kubelet[3038]: E1216 13:18:22.864576 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:18:29.131208 kubelet[3038]: E1216 
13:18:29.131109 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:18:32.130367 kubelet[3038]: E1216 13:18:32.130282 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:18:32.130973 kubelet[3038]: E1216 13:18:32.130797 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:18:33.131946 kubelet[3038]: E1216 13:18:33.131898 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:18:35.130960 kubelet[3038]: E1216 13:18:35.130898 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:18:37.131441 kubelet[3038]: E1216 13:18:37.131300 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:18:41.130165 kubelet[3038]: E1216 13:18:41.130083 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:18:43.130786 kubelet[3038]: E1216 13:18:43.130741 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:18:46.129940 kubelet[3038]: E1216 13:18:46.129868 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:18:47.129871 kubelet[3038]: E1216 13:18:47.129600 3038 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:18:47.130241 kubelet[3038]: E1216 13:18:47.130197 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:18:52.130307 kubelet[3038]: E1216 13:18:52.130249 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:18:52.131102 kubelet[3038]: E1216 13:18:52.130652 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:18:54.130822 kubelet[3038]: E1216 13:18:54.130731 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:18:57.130432 kubelet[3038]: E1216 13:18:57.130299 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:19:01.130701 containerd[1785]: time="2025-12-16T13:19:01.130547617Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:19:01.474322 containerd[1785]: time="2025-12-16T13:19:01.474098760Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:19:01.476594 containerd[1785]: time="2025-12-16T13:19:01.476504107Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:19:01.476820 containerd[1785]: time="2025-12-16T13:19:01.476547820Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:19:01.477062 kubelet[3038]: E1216 13:19:01.476954 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:19:01.477800 kubelet[3038]: E1216 13:19:01.477077 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:19:01.477800 kubelet[3038]: E1216 13:19:01.477335 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:dda5a456290f47049c1dcb0ab4e8d120,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tkx8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f659d6578-trd8w_calico-system(cb2a00f3-8bc2-4b68-a631-62b5470c7b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:19:01.480020 containerd[1785]: time="2025-12-16T13:19:01.479930966Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:19:01.816944 containerd[1785]: time="2025-12-16T13:19:01.816872661Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:19:01.818700 containerd[1785]: time="2025-12-16T13:19:01.818639575Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:19:01.818858 containerd[1785]: time="2025-12-16T13:19:01.818711202Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:19:01.818992 kubelet[3038]: E1216 13:19:01.818925 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:19:01.819076 kubelet[3038]: E1216 13:19:01.819007 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:19:01.819979 kubelet[3038]: E1216 13:19:01.819160 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkx8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f659d6578-trd8w_calico-system(cb2a00f3-8bc2-4b68-a631-62b5470c7b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:19:01.821297 kubelet[3038]: E1216 13:19:01.821232 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:19:02.131436 containerd[1785]: time="2025-12-16T13:19:02.131174175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:19:02.483290 containerd[1785]: time="2025-12-16T13:19:02.483096855Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 
16 13:19:02.485327 containerd[1785]: time="2025-12-16T13:19:02.485208033Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:19:02.485559 containerd[1785]: time="2025-12-16T13:19:02.485295717Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:19:02.485766 kubelet[3038]: E1216 13:19:02.485690 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:19:02.485766 kubelet[3038]: E1216 13:19:02.485764 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:19:02.486421 kubelet[3038]: E1216 13:19:02.485924 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5k7p2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed 
in pod calico-apiserver-76768d65dd-q8jbl_calico-apiserver(8d366b0a-e187-4483-8bcf-46758c32eaee): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:19:02.488054 kubelet[3038]: E1216 13:19:02.487964 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:19:05.131737 containerd[1785]: time="2025-12-16T13:19:05.131639061Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:19:05.461253 containerd[1785]: time="2025-12-16T13:19:05.461030903Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:19:05.463465 containerd[1785]: time="2025-12-16T13:19:05.463389890Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:19:05.463596 containerd[1785]: time="2025-12-16T13:19:05.463466405Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:19:05.463782 kubelet[3038]: E1216 13:19:05.463711 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:19:05.464620 kubelet[3038]: E1216 13:19:05.463783 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:19:05.464620 kubelet[3038]: E1216 13:19:05.464004 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zgzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-4vgpv_calico-system(2c2ad34d-d82b-4624-87cd-76ece6a8970b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:19:05.466668 containerd[1785]: time="2025-12-16T13:19:05.466591733Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:19:05.806755 containerd[1785]: time="2025-12-16T13:19:05.806655331Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:19:05.809040 containerd[1785]: time="2025-12-16T13:19:05.808943470Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:19:05.809183 containerd[1785]: time="2025-12-16T13:19:05.809072385Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:19:05.809730 kubelet[3038]: E1216 13:19:05.809660 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:19:05.809878 kubelet[3038]: E1216 13:19:05.809744 3038 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:19:05.809967 kubelet[3038]: E1216 13:19:05.809908 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zgzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-4vgpv_calico-system(2c2ad34d-d82b-4624-87cd-76ece6a8970b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:19:05.811223 kubelet[3038]: E1216 13:19:05.811136 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: 
not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:19:07.130402 containerd[1785]: time="2025-12-16T13:19:07.130341574Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:19:07.477749 containerd[1785]: time="2025-12-16T13:19:07.477533880Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:19:07.479750 containerd[1785]: time="2025-12-16T13:19:07.479666119Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:19:07.480510 containerd[1785]: time="2025-12-16T13:19:07.479745813Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:19:07.480585 kubelet[3038]: E1216 13:19:07.479923 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:19:07.480585 kubelet[3038]: E1216 13:19:07.480002 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:19:07.480585 kubelet[3038]: E1216 13:19:07.480233 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x8hl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-nz8db_calico-system(4fb5337e-3d19-4a26-9e2e-58b08d0a6154): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:19:07.481485 kubelet[3038]: E1216 13:19:07.481402 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:19:08.129637 containerd[1785]: time="2025-12-16T13:19:08.129541444Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:19:08.463004 containerd[1785]: time="2025-12-16T13:19:08.462740560Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:19:08.465525 containerd[1785]: time="2025-12-16T13:19:08.465076620Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:19:08.465525 containerd[1785]: time="2025-12-16T13:19:08.465145450Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:19:08.465745 kubelet[3038]: E1216 13:19:08.465391 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:19:08.465745 kubelet[3038]: E1216 13:19:08.465506 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:19:08.465937 kubelet[3038]: E1216 13:19:08.465794 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mkfg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d8cb67fb9-kbt5r_calico-system(14065d61-1c00-4d68-849b-4eefd74615de): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:19:08.467097 kubelet[3038]: E1216 13:19:08.467038 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:19:09.130945 containerd[1785]: time="2025-12-16T13:19:09.130836918Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:19:09.473729 containerd[1785]: time="2025-12-16T13:19:09.473521558Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:19:09.475826 containerd[1785]: time="2025-12-16T13:19:09.475738857Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:19:09.475955 containerd[1785]: time="2025-12-16T13:19:09.475863436Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:19:09.476202 kubelet[3038]: E1216 13:19:09.476122 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:19:09.476580 kubelet[3038]: E1216 13:19:09.476213 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:19:09.476580 kubelet[3038]: E1216 13:19:09.476471 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn2rm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76768d65dd-ncz5v_calico-apiserver(fe36e307-b3ce-4532-b1d5-65732a7edfab): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:19:09.478089 kubelet[3038]: E1216 13:19:09.478002 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:19:15.135567 kubelet[3038]: E1216 13:19:15.132628 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:19:15.583578 systemd[1]: Started sshd@7-10.0.23.154:22-147.75.109.163:48746.service - OpenSSH per-connection server daemon (147.75.109.163:48746). 
Dec 16 13:19:16.133400 kubelet[3038]: E1216 13:19:16.133315 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:19:16.660387 sshd[5502]: Accepted publickey for core from 147.75.109.163 port 48746 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:19:16.662872 sshd-session[5502]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:19:16.669917 systemd-logind[1763]: New session 8 of user core. Dec 16 13:19:16.680674 systemd[1]: Started session-8.scope - Session 8 of User core. Dec 16 13:19:17.451772 sshd[5505]: Connection closed by 147.75.109.163 port 48746 Dec 16 13:19:17.452524 sshd-session[5502]: pam_unix(sshd:session): session closed for user core Dec 16 13:19:17.460770 systemd[1]: sshd@7-10.0.23.154:22-147.75.109.163:48746.service: Deactivated successfully. Dec 16 13:19:17.465085 systemd[1]: session-8.scope: Deactivated successfully. Dec 16 13:19:17.466744 systemd-logind[1763]: Session 8 logged out. Waiting for processes to exit. Dec 16 13:19:17.468837 systemd-logind[1763]: Removed session 8. 
Dec 16 13:19:19.130918 kubelet[3038]: E1216 13:19:19.130799 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:19:21.129787 kubelet[3038]: E1216 13:19:21.129724 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:19:22.129350 kubelet[3038]: E1216 13:19:22.129294 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:19:22.129873 kubelet[3038]: E1216 13:19:22.129715 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:19:22.620635 systemd[1]: Started sshd@8-10.0.23.154:22-147.75.109.163:50496.service - OpenSSH per-connection server daemon (147.75.109.163:50496). Dec 16 13:19:23.685829 sshd[5531]: Accepted publickey for core from 147.75.109.163 port 50496 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:19:23.687337 sshd-session[5531]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:19:23.694702 systemd-logind[1763]: New session 9 of user core. 
Dec 16 13:19:23.703909 systemd[1]: Started session-9.scope - Session 9 of User core. Dec 16 13:19:24.465767 sshd[5534]: Connection closed by 147.75.109.163 port 50496 Dec 16 13:19:24.466608 sshd-session[5531]: pam_unix(sshd:session): session closed for user core Dec 16 13:19:24.470858 systemd[1]: sshd@8-10.0.23.154:22-147.75.109.163:50496.service: Deactivated successfully. Dec 16 13:19:24.472853 systemd[1]: session-9.scope: Deactivated successfully. Dec 16 13:19:24.473707 systemd-logind[1763]: Session 9 logged out. Waiting for processes to exit. Dec 16 13:19:24.474602 systemd-logind[1763]: Removed session 9. Dec 16 13:19:24.634625 systemd[1]: Started sshd@9-10.0.23.154:22-147.75.109.163:50502.service - OpenSSH per-connection server daemon (147.75.109.163:50502). Dec 16 13:19:25.625699 sshd[5552]: Accepted publickey for core from 147.75.109.163 port 50502 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:19:25.627361 sshd-session[5552]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:19:25.633404 systemd-logind[1763]: New session 10 of user core. Dec 16 13:19:25.653751 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 16 13:19:26.410725 sshd[5555]: Connection closed by 147.75.109.163 port 50502 Dec 16 13:19:26.411125 sshd-session[5552]: pam_unix(sshd:session): session closed for user core Dec 16 13:19:26.414849 systemd[1]: sshd@9-10.0.23.154:22-147.75.109.163:50502.service: Deactivated successfully. Dec 16 13:19:26.416502 systemd[1]: session-10.scope: Deactivated successfully. Dec 16 13:19:26.417141 systemd-logind[1763]: Session 10 logged out. Waiting for processes to exit. Dec 16 13:19:26.417897 systemd-logind[1763]: Removed session 10. Dec 16 13:19:26.594246 systemd[1]: Started sshd@10-10.0.23.154:22-147.75.109.163:50518.service - OpenSSH per-connection server daemon (147.75.109.163:50518). Dec 16 13:19:27.130506 kubelet[3038]: E1216 13:19:27.130365 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:19:27.614537 sshd[5570]: Accepted publickey for core from 147.75.109.163 port 50518 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:19:27.615890 sshd-session[5570]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:19:27.628089 systemd-logind[1763]: New session 11 of user core. Dec 16 13:19:27.646932 systemd[1]: Started session-11.scope - Session 11 of User core. 
Dec 16 13:19:28.402576 sshd[5573]: Connection closed by 147.75.109.163 port 50518 Dec 16 13:19:28.403377 sshd-session[5570]: pam_unix(sshd:session): session closed for user core Dec 16 13:19:28.410190 systemd[1]: sshd@10-10.0.23.154:22-147.75.109.163:50518.service: Deactivated successfully. Dec 16 13:19:28.412245 systemd[1]: session-11.scope: Deactivated successfully. Dec 16 13:19:28.413461 systemd-logind[1763]: Session 11 logged out. Waiting for processes to exit. Dec 16 13:19:28.414935 systemd-logind[1763]: Removed session 11. Dec 16 13:19:29.131868 kubelet[3038]: E1216 13:19:29.131355 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:19:33.130747 kubelet[3038]: E1216 13:19:33.130693 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:19:33.605903 systemd[1]: Started sshd@11-10.0.23.154:22-147.75.109.163:40220.service - OpenSSH per-connection server daemon (147.75.109.163:40220). Dec 16 13:19:34.131479 kubelet[3038]: E1216 13:19:34.131102 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:19:34.686002 sshd[5594]: Accepted publickey for core from 147.75.109.163 port 40220 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:19:34.687351 sshd-session[5594]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:19:34.692258 systemd-logind[1763]: New session 12 of user core. Dec 16 13:19:34.699775 systemd[1]: Started session-12.scope - Session 12 of User core. 
Dec 16 13:19:35.129676 kubelet[3038]: E1216 13:19:35.129601 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:19:35.517481 sshd[5597]: Connection closed by 147.75.109.163 port 40220 Dec 16 13:19:35.517673 sshd-session[5594]: pam_unix(sshd:session): session closed for user core Dec 16 13:19:35.523564 systemd[1]: sshd@11-10.0.23.154:22-147.75.109.163:40220.service: Deactivated successfully. Dec 16 13:19:35.526188 systemd[1]: session-12.scope: Deactivated successfully. Dec 16 13:19:35.528409 systemd-logind[1763]: Session 12 logged out. Waiting for processes to exit. Dec 16 13:19:35.529897 systemd-logind[1763]: Removed session 12. Dec 16 13:19:36.129986 kubelet[3038]: E1216 13:19:36.129895 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:19:40.684314 systemd[1]: Started sshd@12-10.0.23.154:22-147.75.109.163:40228.service - OpenSSH per-connection server daemon (147.75.109.163:40228). Dec 16 13:19:41.136018 kubelet[3038]: E1216 13:19:41.135968 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:19:41.691408 sshd[5644]: Accepted publickey for core from 147.75.109.163 port 40228 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:19:41.693192 sshd-session[5644]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:19:41.698997 systemd-logind[1763]: New session 13 of user core. Dec 16 13:19:41.710729 systemd[1]: Started session-13.scope - Session 13 of User core. 
Dec 16 13:19:42.129862 kubelet[3038]: E1216 13:19:42.129782 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:19:42.418170 sshd[5647]: Connection closed by 147.75.109.163 port 40228 Dec 16 13:19:42.418883 sshd-session[5644]: pam_unix(sshd:session): session closed for user core Dec 16 13:19:42.425275 systemd[1]: sshd@12-10.0.23.154:22-147.75.109.163:40228.service: Deactivated successfully. Dec 16 13:19:42.427994 systemd[1]: session-13.scope: Deactivated successfully. Dec 16 13:19:42.429026 systemd-logind[1763]: Session 13 logged out. Waiting for processes to exit. Dec 16 13:19:42.430153 systemd-logind[1763]: Removed session 13. Dec 16 13:19:42.593091 systemd[1]: Started sshd@13-10.0.23.154:22-147.75.109.163:57146.service - OpenSSH per-connection server daemon (147.75.109.163:57146). Dec 16 13:19:43.592119 sshd[5664]: Accepted publickey for core from 147.75.109.163 port 57146 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:19:43.594036 sshd-session[5664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:19:43.606752 systemd-logind[1763]: New session 14 of user core. Dec 16 13:19:43.617646 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 16 13:19:44.371796 sshd[5667]: Connection closed by 147.75.109.163 port 57146 Dec 16 13:19:44.372640 sshd-session[5664]: pam_unix(sshd:session): session closed for user core Dec 16 13:19:44.379499 systemd[1]: sshd@13-10.0.23.154:22-147.75.109.163:57146.service: Deactivated successfully. Dec 16 13:19:44.382931 systemd[1]: session-14.scope: Deactivated successfully. Dec 16 13:19:44.384687 systemd-logind[1763]: Session 14 logged out. Waiting for processes to exit. Dec 16 13:19:44.387105 systemd-logind[1763]: Removed session 14. Dec 16 13:19:44.542910 systemd[1]: Started sshd@14-10.0.23.154:22-147.75.109.163:57160.service - OpenSSH per-connection server daemon (147.75.109.163:57160). 
Dec 16 13:19:45.130090 kubelet[3038]: E1216 13:19:45.130034 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:19:45.525660 sshd[5682]: Accepted publickey for core from 147.75.109.163 port 57160 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:19:45.527582 sshd-session[5682]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:19:45.537829 systemd-logind[1763]: New session 15 of user core. Dec 16 13:19:45.553784 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 16 13:19:46.130324 kubelet[3038]: E1216 13:19:46.130249 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:19:46.984629 sshd[5685]: Connection closed by 147.75.109.163 port 57160 Dec 16 13:19:46.985076 sshd-session[5682]: pam_unix(sshd:session): session closed for user core Dec 16 13:19:46.993662 systemd[1]: sshd@14-10.0.23.154:22-147.75.109.163:57160.service: Deactivated successfully. Dec 16 13:19:46.996262 systemd[1]: session-15.scope: Deactivated successfully. Dec 16 13:19:46.997784 systemd-logind[1763]: Session 15 logged out. Waiting for processes to exit. Dec 16 13:19:46.999865 systemd-logind[1763]: Removed session 15. Dec 16 13:19:47.734518 systemd[1]: Started sshd@15-10.0.23.154:22-147.75.109.163:57170.service - OpenSSH per-connection server daemon (147.75.109.163:57170). 
Dec 16 13:19:48.130005 kubelet[3038]: E1216 13:19:48.129653 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:19:48.736853 sshd[5708]: Accepted publickey for core from 147.75.109.163 port 57170 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:19:48.738153 sshd-session[5708]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:19:48.743148 systemd-logind[1763]: New session 16 of user core. Dec 16 13:19:48.752663 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 16 13:19:49.669721 sshd[5711]: Connection closed by 147.75.109.163 port 57170 Dec 16 13:19:49.670594 sshd-session[5708]: pam_unix(sshd:session): session closed for user core Dec 16 13:19:49.677688 systemd[1]: sshd@15-10.0.23.154:22-147.75.109.163:57170.service: Deactivated successfully. Dec 16 13:19:49.680703 systemd[1]: session-16.scope: Deactivated successfully. Dec 16 13:19:49.682259 systemd-logind[1763]: Session 16 logged out. Waiting for processes to exit. Dec 16 13:19:49.683738 systemd-logind[1763]: Removed session 16. Dec 16 13:19:49.848184 systemd[1]: Started sshd@16-10.0.23.154:22-147.75.109.163:57184.service - OpenSSH per-connection server daemon (147.75.109.163:57184). Dec 16 13:19:50.129771 kubelet[3038]: E1216 13:19:50.129687 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:19:50.831279 sshd[5726]: Accepted publickey for core from 147.75.109.163 port 57184 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:19:50.834540 sshd-session[5726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:19:50.844105 systemd-logind[1763]: New session 17 of user core. Dec 16 13:19:50.858757 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 16 13:19:51.611382 sshd[5729]: Connection closed by 147.75.109.163 port 57184 Dec 16 13:19:51.611537 sshd-session[5726]: pam_unix(sshd:session): session closed for user core Dec 16 13:19:51.615942 systemd[1]: sshd@16-10.0.23.154:22-147.75.109.163:57184.service: Deactivated successfully. Dec 16 13:19:51.618398 systemd[1]: session-17.scope: Deactivated successfully. Dec 16 13:19:51.620579 systemd-logind[1763]: Session 17 logged out. Waiting for processes to exit. Dec 16 13:19:51.622018 systemd-logind[1763]: Removed session 17. 
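Interleaved with the pull errors, sshd and systemd-logind record a steady churn of short-lived sessions for user core from 147.75.109.163 (sessions 8 through 17 so far, each opened and closed within seconds — consistent with a health check or automation loop). When auditing a capture like this, a throwaway parser can pair the open/close events; the two regexes below are written against the exact logind phrases in these lines, everything else is illustrative:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches the systemd-logind lines seen above, e.g.
//   "New session 14 of user core." and "Removed session 14."
var (
	openRe  = regexp.MustCompile(`New session (\d+) of user (\w+)`)
	closeRe = regexp.MustCompile(`Removed session (\d+)`)
)

func main() {
	open := map[string]string{} // session id -> user
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines run long
	for sc.Scan() {
		line := sc.Text()
		if m := openRe.FindStringSubmatch(line); m != nil {
			open[m[1]] = m[2]
		}
		if m := closeRe.FindStringSubmatch(line); m != nil {
			fmt.Printf("session %s (user %q) opened and closed\n", m[1], open[m[1]])
			delete(open, m[1])
		}
	}
}
```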
Dec 16 13:19:52.129790 kubelet[3038]: E1216 13:19:52.129733 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:19:56.130814 kubelet[3038]: E1216 13:19:56.130721 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:19:56.792551 systemd[1]: Started sshd@17-10.0.23.154:22-147.75.109.163:36954.service - OpenSSH per-connection server daemon (147.75.109.163:36954). Dec 16 13:19:57.132298 kubelet[3038]: E1216 13:19:57.131916 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:19:57.798716 sshd[5748]: Accepted publickey for core from 147.75.109.163 port 36954 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:19:57.801828 sshd-session[5748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:19:57.813915 systemd-logind[1763]: New session 18 of user core. Dec 16 13:19:57.824866 systemd[1]: Started session-18.scope - Session 18 of User core. 
Dec 16 13:19:58.565335 sshd[5751]: Connection closed by 147.75.109.163 port 36954 Dec 16 13:19:58.565783 sshd-session[5748]: pam_unix(sshd:session): session closed for user core Dec 16 13:19:58.573170 systemd[1]: sshd@17-10.0.23.154:22-147.75.109.163:36954.service: Deactivated successfully. Dec 16 13:19:58.576713 systemd[1]: session-18.scope: Deactivated successfully. Dec 16 13:19:58.580334 systemd-logind[1763]: Session 18 logged out. Waiting for processes to exit. Dec 16 13:19:58.582387 systemd-logind[1763]: Removed session 18. Dec 16 13:20:00.130482 kubelet[3038]: E1216 13:20:00.130386 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:20:01.129327 kubelet[3038]: E1216 13:20:01.129274 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:20:03.745076 systemd[1]: Started sshd@18-10.0.23.154:22-147.75.109.163:52216.service - OpenSSH per-connection server daemon (147.75.109.163:52216). Dec 16 13:20:04.735422 sshd[5770]: Accepted publickey for core from 147.75.109.163 port 52216 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:20:04.738011 sshd-session[5770]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:20:04.747243 systemd-logind[1763]: New session 19 of user core. Dec 16 13:20:04.764848 systemd[1]: Started session-19.scope - Session 19 of User core. Dec 16 13:20:05.130713 kubelet[3038]: E1216 13:20:05.130631 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:20:05.481617 sshd[5773]: Connection closed by 147.75.109.163 port 52216 Dec 16 13:20:05.482064 sshd-session[5770]: pam_unix(sshd:session): session closed for user core Dec 16 13:20:05.490109 systemd[1]: sshd@18-10.0.23.154:22-147.75.109.163:52216.service: Deactivated successfully. Dec 16 13:20:05.496360 systemd[1]: session-19.scope: Deactivated successfully. Dec 16 13:20:05.498033 systemd-logind[1763]: Session 19 logged out. Waiting for processes to exit. 
Dec 16 13:20:05.499888 systemd-logind[1763]: Removed session 19. Dec 16 13:20:07.139417 kubelet[3038]: E1216 13:20:07.139307 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:20:08.129727 kubelet[3038]: E1216 13:20:08.129661 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:20:10.131323 kubelet[3038]: E1216 13:20:10.131216 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:20:10.655882 systemd[1]: Started sshd@19-10.0.23.154:22-147.75.109.163:52218.service - OpenSSH per-connection server daemon (147.75.109.163:52218). 
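The recurring text "rpc error: code = NotFound" is the gRPC status kubelet receives over CRI from containerd's image service. A caller that wants to distinguish a permanently missing tag from a transient pull failure would branch on that status code rather than on the message string, roughly as follows (only the status handling is sketched; the CRI client wiring is omitted):

```go
package main

import (
	"errors"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// classify shows how a CRI caller can branch on the gRPC status code
// carried by errors like the "rpc error: code = NotFound ..." lines above.
func classify(err error) string {
	if err == nil {
		return "pulled"
	}
	switch status.Code(err) {
	case codes.NotFound:
		return "image reference does not exist; retrying will not help"
	case codes.Unavailable, codes.DeadlineExceeded:
		return "transient; retry with backoff"
	default:
		return "other pull failure"
	}
}

func main() {
	err := status.Error(codes.NotFound,
		`failed to pull and unpack image "ghcr.io/flatcar/calico/apiserver:v3.30.4"`)
	fmt.Println(classify(err))
	fmt.Println(classify(errors.New("plain error"))) // status.Code -> Unknown
}
```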
Dec 16 13:20:11.131734 kubelet[3038]: E1216 13:20:11.131635 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:20:11.636510 sshd[5820]: Accepted publickey for core from 147.75.109.163 port 52218 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:20:11.639501 sshd-session[5820]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:20:11.650220 systemd-logind[1763]: New session 20 of user core. Dec 16 13:20:11.661747 systemd[1]: Started session-20.scope - Session 20 of User core. Dec 16 13:20:12.370091 sshd[5823]: Connection closed by 147.75.109.163 port 52218 Dec 16 13:20:12.371171 sshd-session[5820]: pam_unix(sshd:session): session closed for user core Dec 16 13:20:12.379670 systemd[1]: sshd@19-10.0.23.154:22-147.75.109.163:52218.service: Deactivated successfully. Dec 16 13:20:12.383933 systemd[1]: session-20.scope: Deactivated successfully. Dec 16 13:20:12.385617 systemd-logind[1763]: Session 20 logged out. Waiting for processes to exit. Dec 16 13:20:12.388614 systemd-logind[1763]: Removed session 20. Dec 16 13:20:16.130662 kubelet[3038]: E1216 13:20:16.130575 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:20:17.137133 kubelet[3038]: E1216 13:20:17.137037 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:20:17.548955 systemd[1]: Started sshd@20-10.0.23.154:22-147.75.109.163:56782.service - OpenSSH per-connection server daemon (147.75.109.163:56782). 
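The long "container &Container{...}" blobs in the UnhandledError lines are kubelet stringifying the full v1.Container object of the failing container. For orientation, the readiness probe embedded in the calico-apiserver dump (HTTP GET /readyz on port 5443 over HTTPS, 5s timeout, 60s period, failure threshold 3) corresponds to this Go value:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// The readiness probe from the calico-apiserver container spec dumped
	// in the surrounding "Unhandled Error" lines, rebuilt as the v1 object
	// kubelet is printing there.
	probe := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path:   "/readyz",
				Port:   intstr.FromInt(5443),
				Scheme: corev1.URISchemeHTTPS,
			},
		},
		TimeoutSeconds:   5,
		PeriodSeconds:    60,
		SuccessThreshold: 1,
		FailureThreshold: 3,
	}
	fmt.Printf("%+v\n", probe)
}
```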
Dec 16 13:20:18.130823 kubelet[3038]: E1216 13:20:18.130737 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:20:18.556830 sshd[5847]: Accepted publickey for core from 147.75.109.163 port 56782 ssh2: RSA SHA256:cQMxipPJJowRbk5dGSaUREuCPMqg33hAu2Zl+Athpig Dec 16 13:20:18.559430 sshd-session[5847]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:20:18.574568 systemd-logind[1763]: New session 21 of user core. Dec 16 13:20:18.586863 systemd[1]: Started session-21.scope - Session 21 of User core. Dec 16 13:20:19.328602 sshd[5850]: Connection closed by 147.75.109.163 port 56782 Dec 16 13:20:19.329289 sshd-session[5847]: pam_unix(sshd:session): session closed for user core Dec 16 13:20:19.336962 systemd[1]: sshd@20-10.0.23.154:22-147.75.109.163:56782.service: Deactivated successfully. Dec 16 13:20:19.339081 systemd[1]: session-21.scope: Deactivated successfully. Dec 16 13:20:19.339887 systemd-logind[1763]: Session 21 logged out. Waiting for processes to exit. Dec 16 13:20:19.340919 systemd-logind[1763]: Removed session 21. 
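The containerd entries from 13:20:29 onward show one full retry from the runtime's side: an info-level PullImage, "fetch failed after status: 404 Not Found", the "PullImage ... failed" error returned over CRI, then "stop pulling". The same resolve-and-unpack step can be reproduced directly against the node's containerd socket with the containerd Go client; the socket path and the k8s.io namespace are the conventional kubelet values, assumed here:

```go
package main

import (
	"context"
	"fmt"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Conventional CRI socket and namespace; assumptions, not from the log.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		panic(err)
	}
	defer client.Close()

	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	ref := "ghcr.io/flatcar/calico/kube-controllers:v3.30.4"

	// Resolve, fetch, and unpack in one call -- the step that fails
	// with "404 Not Found" in the log above.
	img, err := client.Pull(ctx, ref, containerd.WithPullUnpack)
	if err != nil {
		fmt.Println("pull failed:", err) // expect "not found" for this tag
		return
	}
	fmt.Println("pulled", img.Name())
}
```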
Dec 16 13:20:22.132575 kubelet[3038]: E1216 13:20:22.131904 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:20:22.134319 kubelet[3038]: E1216 13:20:22.133915 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:20:26.130076 kubelet[3038]: E1216 13:20:26.130020 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:20:28.129626 kubelet[3038]: E1216 13:20:28.129546 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:20:29.131493 containerd[1785]: time="2025-12-16T13:20:29.131232311Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:20:29.474788 containerd[1785]: time="2025-12-16T13:20:29.474567814Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:20:29.477549 containerd[1785]: time="2025-12-16T13:20:29.477501967Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:20:29.477659 containerd[1785]: time="2025-12-16T13:20:29.477606172Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:20:29.478121 kubelet[3038]: E1216 13:20:29.477819 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:20:29.478121 kubelet[3038]: E1216 13:20:29.477891 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:20:29.478121 kubelet[3038]: E1216 13:20:29.478055 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mkfg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d8cb67fb9-kbt5r_calico-system(14065d61-1c00-4d68-849b-4eefd74615de): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:20:29.479586 kubelet[3038]: E1216 13:20:29.479535 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:20:31.129806 containerd[1785]: time="2025-12-16T13:20:31.129740690Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:20:31.465612 containerd[1785]: time="2025-12-16T13:20:31.465425410Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:20:31.467594 containerd[1785]: time="2025-12-16T13:20:31.467565174Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:20:31.467726 containerd[1785]: time="2025-12-16T13:20:31.467655884Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:20:31.468173 kubelet[3038]: E1216 13:20:31.467905 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:20:31.468173 kubelet[3038]: E1216 13:20:31.467972 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:20:31.468173 kubelet[3038]: E1216 13:20:31.468114 3038 
kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:dda5a456290f47049c1dcb0ab4e8d120,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tkx8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f659d6578-trd8w_calico-system(cb2a00f3-8bc2-4b68-a631-62b5470c7b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:20:31.470491 containerd[1785]: time="2025-12-16T13:20:31.470470425Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:20:31.840665 containerd[1785]: time="2025-12-16T13:20:31.840598355Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:20:31.842891 containerd[1785]: time="2025-12-16T13:20:31.842846606Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:20:31.843052 containerd[1785]: time="2025-12-16T13:20:31.842929008Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:20:31.843225 kubelet[3038]: E1216 13:20:31.843161 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:20:31.843305 kubelet[3038]: E1216 13:20:31.843240 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:20:31.843544 kubelet[3038]: E1216 13:20:31.843440 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkx8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7f659d6578-trd8w_calico-system(cb2a00f3-8bc2-4b68-a631-62b5470c7b77): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:20:31.844855 kubelet[3038]: E1216 13:20:31.844763 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:20:33.132646 containerd[1785]: time="2025-12-16T13:20:33.132598283Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:20:33.483755 containerd[1785]: 
time="2025-12-16T13:20:33.483537999Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:20:33.485733 containerd[1785]: time="2025-12-16T13:20:33.485653718Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:20:33.485884 containerd[1785]: time="2025-12-16T13:20:33.485747181Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:20:33.486224 kubelet[3038]: E1216 13:20:33.486093 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:20:33.486707 kubelet[3038]: E1216 13:20:33.486229 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:20:33.487297 kubelet[3038]: E1216 13:20:33.487181 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5k7p2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76768d65dd-q8jbl_calico-apiserver(8d366b0a-e187-4483-8bcf-46758c32eaee): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:20:33.488792 kubelet[3038]: E1216 13:20:33.488710 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:20:36.130357 containerd[1785]: time="2025-12-16T13:20:36.130279596Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:20:36.458005 containerd[1785]: time="2025-12-16T13:20:36.457816412Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:20:36.459948 containerd[1785]: time="2025-12-16T13:20:36.459869855Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:20:36.460141 containerd[1785]: time="2025-12-16T13:20:36.460051352Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:20:36.460507 kubelet[3038]: E1216 13:20:36.460381 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:20:36.461000 kubelet[3038]: E1216 13:20:36.460525 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:20:36.461000 kubelet[3038]: E1216 13:20:36.460888 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zgzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-4vgpv_calico-system(2c2ad34d-d82b-4624-87cd-76ece6a8970b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:20:36.463418 containerd[1785]: time="2025-12-16T13:20:36.463364747Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:20:36.801034 containerd[1785]: time="2025-12-16T13:20:36.800924897Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:20:36.803469 containerd[1785]: time="2025-12-16T13:20:36.803248394Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:20:36.803599 containerd[1785]: time="2025-12-16T13:20:36.803374355Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:20:36.803884 kubelet[3038]: E1216 13:20:36.803795 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:20:36.804018 kubelet[3038]: E1216 13:20:36.803903 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:20:36.804285 kubelet[3038]: E1216 13:20:36.804167 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zgzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-4vgpv_calico-system(2c2ad34d-d82b-4624-87cd-76ece6a8970b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:20:36.805751 kubelet[3038]: E1216 13:20:36.805598 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve 
reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:20:38.131234 containerd[1785]: time="2025-12-16T13:20:38.131112487Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:20:38.493933 containerd[1785]: time="2025-12-16T13:20:38.493854134Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:20:38.496532 containerd[1785]: time="2025-12-16T13:20:38.496412924Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:20:38.496721 containerd[1785]: time="2025-12-16T13:20:38.496502333Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:20:38.496925 kubelet[3038]: E1216 13:20:38.496861 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:20:38.497582 kubelet[3038]: E1216 13:20:38.496939 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:20:38.497582 kubelet[3038]: E1216 13:20:38.497182 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x8hl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-nz8db_calico-system(4fb5337e-3d19-4a26-9e2e-58b08d0a6154): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:20:38.498637 kubelet[3038]: E1216 13:20:38.498552 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:20:41.130622 containerd[1785]: 
time="2025-12-16T13:20:41.130541035Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:20:41.482698 containerd[1785]: time="2025-12-16T13:20:41.482419766Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:20:41.484309 containerd[1785]: time="2025-12-16T13:20:41.484236918Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:20:41.484475 containerd[1785]: time="2025-12-16T13:20:41.484289825Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:20:41.484560 kubelet[3038]: E1216 13:20:41.484513 3038 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:20:41.485248 kubelet[3038]: E1216 13:20:41.484575 3038 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:20:41.485248 kubelet[3038]: E1216 13:20:41.484732 3038 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn2rm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-76768d65dd-ncz5v_calico-apiserver(fe36e307-b3ce-4532-b1d5-65732a7edfab): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:20:41.486008 kubelet[3038]: E1216 13:20:41.485913 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:20:44.130557 kubelet[3038]: E1216 13:20:44.130420 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:20:46.130608 kubelet[3038]: E1216 13:20:46.130497 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:20:47.131399 kubelet[3038]: 
E1216 13:20:47.131307 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee" Dec 16 13:20:47.522906 kubelet[3038]: E1216 13:20:47.522805 3038 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.23.154:55820->10.0.23.160:2379: read: connection timed out" Dec 16 13:20:47.526805 systemd[1]: cri-containerd-2aadfad751c96ed258b6c96f21aa44ff7b11f3ed3904f2cd8a8a8c8e0ff2f34a.scope: Deactivated successfully. Dec 16 13:20:47.527547 systemd[1]: cri-containerd-2aadfad751c96ed258b6c96f21aa44ff7b11f3ed3904f2cd8a8a8c8e0ff2f34a.scope: Consumed 3.145s CPU time, 26.8M memory peak. Dec 16 13:20:47.531173 containerd[1785]: time="2025-12-16T13:20:47.531066526Z" level=info msg="received container exit event container_id:\"2aadfad751c96ed258b6c96f21aa44ff7b11f3ed3904f2cd8a8a8c8e0ff2f34a\" id:\"2aadfad751c96ed258b6c96f21aa44ff7b11f3ed3904f2cd8a8a8c8e0ff2f34a\" pid:2860 exit_status:1 exited_at:{seconds:1765891247 nanos:529398444}" Dec 16 13:20:47.548860 systemd[1]: cri-containerd-e686cfb173f2bfeea9cc1f8b91e1880892d6a1d8dc868d71a281ea9e9b81320e.scope: Deactivated successfully. Dec 16 13:20:47.549572 systemd[1]: cri-containerd-e686cfb173f2bfeea9cc1f8b91e1880892d6a1d8dc868d71a281ea9e9b81320e.scope: Consumed 5.373s CPU time, 60.9M memory peak. Dec 16 13:20:47.556955 containerd[1785]: time="2025-12-16T13:20:47.556854846Z" level=info msg="received container exit event container_id:\"e686cfb173f2bfeea9cc1f8b91e1880892d6a1d8dc868d71a281ea9e9b81320e\" id:\"e686cfb173f2bfeea9cc1f8b91e1880892d6a1d8dc868d71a281ea9e9b81320e\" pid:2862 exit_status:1 exited_at:{seconds:1765891247 nanos:554743388}" Dec 16 13:20:47.585611 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2aadfad751c96ed258b6c96f21aa44ff7b11f3ed3904f2cd8a8a8c8e0ff2f34a-rootfs.mount: Deactivated successfully. Dec 16 13:20:47.600444 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e686cfb173f2bfeea9cc1f8b91e1880892d6a1d8dc868d71a281ea9e9b81320e-rootfs.mount: Deactivated successfully. Dec 16 13:20:47.619285 systemd[1]: cri-containerd-1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b.scope: Deactivated successfully. Dec 16 13:20:47.619960 systemd[1]: cri-containerd-1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b.scope: Consumed 40.579s CPU time, 105.9M memory peak. Dec 16 13:20:47.620154 containerd[1785]: time="2025-12-16T13:20:47.619945910Z" level=info msg="received container exit event container_id:\"1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b\" id:\"1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b\" pid:3373 exit_status:1 exited_at:{seconds:1765891247 nanos:619529463}" Dec 16 13:20:47.648458 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b-rootfs.mount: Deactivated successfully. 
Dec 16 13:20:47.769594 kubelet[3038]: I1216 13:20:47.769483 3038 scope.go:117] "RemoveContainer" containerID="e686cfb173f2bfeea9cc1f8b91e1880892d6a1d8dc868d71a281ea9e9b81320e" Dec 16 13:20:47.770566 kubelet[3038]: I1216 13:20:47.770487 3038 scope.go:117] "RemoveContainer" containerID="2aadfad751c96ed258b6c96f21aa44ff7b11f3ed3904f2cd8a8a8c8e0ff2f34a" Dec 16 13:20:47.772694 containerd[1785]: time="2025-12-16T13:20:47.772640565Z" level=info msg="CreateContainer within sandbox \"19a1a07f133317e4fa3379d926bdd9ac36dc6b96e5272ff63196405b8ce0d6e7\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Dec 16 13:20:47.773135 kubelet[3038]: I1216 13:20:47.773026 3038 scope.go:117] "RemoveContainer" containerID="1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b" Dec 16 13:20:47.773568 containerd[1785]: time="2025-12-16T13:20:47.773510248Z" level=info msg="CreateContainer within sandbox \"050870786c83145f5293642d475ac449b3a8d49bbaa94303b0806d679bf40804\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Dec 16 13:20:47.774539 containerd[1785]: time="2025-12-16T13:20:47.774504634Z" level=info msg="CreateContainer within sandbox \"8754c5370505f7dd0617808a2ed70af3ae81b288a984f36a1753ca67bf36e14d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Dec 16 13:20:47.801240 containerd[1785]: time="2025-12-16T13:20:47.801129036Z" level=info msg="Container 4f6ed5c8252933670bc888206d2d7c169a6327125ae477ce232020a4d0098b6f: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:20:47.804491 containerd[1785]: time="2025-12-16T13:20:47.804421809Z" level=info msg="Container 024194ce066fa272e4fe35a2db006929fcb1fe735b31505f9eae8636ec912592: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:20:47.807545 containerd[1785]: time="2025-12-16T13:20:47.807485803Z" level=info msg="Container c6bd50b56079a6e30ae7803f6b83e6c3df1a7e9efc8a57dc59f0d4c7bb27fb2b: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:20:47.821572 containerd[1785]: time="2025-12-16T13:20:47.821502842Z" level=info msg="CreateContainer within sandbox \"19a1a07f133317e4fa3379d926bdd9ac36dc6b96e5272ff63196405b8ce0d6e7\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"024194ce066fa272e4fe35a2db006929fcb1fe735b31505f9eae8636ec912592\"" Dec 16 13:20:47.822247 containerd[1785]: time="2025-12-16T13:20:47.822181414Z" level=info msg="StartContainer for \"024194ce066fa272e4fe35a2db006929fcb1fe735b31505f9eae8636ec912592\"" Dec 16 13:20:47.823867 containerd[1785]: time="2025-12-16T13:20:47.823807283Z" level=info msg="connecting to shim 024194ce066fa272e4fe35a2db006929fcb1fe735b31505f9eae8636ec912592" address="unix:///run/containerd/s/88e9592e2cd695c0fce9e98f49b26a69f1214de7f315c0cef6e3714f3a72f296" protocol=ttrpc version=3 Dec 16 13:20:47.827317 containerd[1785]: time="2025-12-16T13:20:47.827255181Z" level=info msg="CreateContainer within sandbox \"050870786c83145f5293642d475ac449b3a8d49bbaa94303b0806d679bf40804\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"4f6ed5c8252933670bc888206d2d7c169a6327125ae477ce232020a4d0098b6f\"" Dec 16 13:20:47.827971 containerd[1785]: time="2025-12-16T13:20:47.827936107Z" level=info msg="CreateContainer within sandbox \"8754c5370505f7dd0617808a2ed70af3ae81b288a984f36a1753ca67bf36e14d\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"c6bd50b56079a6e30ae7803f6b83e6c3df1a7e9efc8a57dc59f0d4c7bb27fb2b\"" Dec 16 13:20:47.828091 containerd[1785]: 
time="2025-12-16T13:20:47.828069006Z" level=info msg="StartContainer for \"4f6ed5c8252933670bc888206d2d7c169a6327125ae477ce232020a4d0098b6f\"" Dec 16 13:20:47.828349 containerd[1785]: time="2025-12-16T13:20:47.828323624Z" level=info msg="StartContainer for \"c6bd50b56079a6e30ae7803f6b83e6c3df1a7e9efc8a57dc59f0d4c7bb27fb2b\"" Dec 16 13:20:47.828980 containerd[1785]: time="2025-12-16T13:20:47.828952877Z" level=info msg="connecting to shim 4f6ed5c8252933670bc888206d2d7c169a6327125ae477ce232020a4d0098b6f" address="unix:///run/containerd/s/ba5d5c180b9daf558038cc3c17135656cfab956b2373d7e34ea5e1b0de01967d" protocol=ttrpc version=3 Dec 16 13:20:47.829634 containerd[1785]: time="2025-12-16T13:20:47.829601858Z" level=info msg="connecting to shim c6bd50b56079a6e30ae7803f6b83e6c3df1a7e9efc8a57dc59f0d4c7bb27fb2b" address="unix:///run/containerd/s/1e76db58d781fd4bd48e647f9e35c5b7d1c80957877de751470468fec46ac6e9" protocol=ttrpc version=3 Dec 16 13:20:47.850794 systemd[1]: Started cri-containerd-024194ce066fa272e4fe35a2db006929fcb1fe735b31505f9eae8636ec912592.scope - libcontainer container 024194ce066fa272e4fe35a2db006929fcb1fe735b31505f9eae8636ec912592. Dec 16 13:20:47.855858 systemd[1]: Started cri-containerd-4f6ed5c8252933670bc888206d2d7c169a6327125ae477ce232020a4d0098b6f.scope - libcontainer container 4f6ed5c8252933670bc888206d2d7c169a6327125ae477ce232020a4d0098b6f. Dec 16 13:20:47.857927 systemd[1]: Started cri-containerd-c6bd50b56079a6e30ae7803f6b83e6c3df1a7e9efc8a57dc59f0d4c7bb27fb2b.scope - libcontainer container c6bd50b56079a6e30ae7803f6b83e6c3df1a7e9efc8a57dc59f0d4c7bb27fb2b. Dec 16 13:20:47.903485 containerd[1785]: time="2025-12-16T13:20:47.902362317Z" level=info msg="StartContainer for \"c6bd50b56079a6e30ae7803f6b83e6c3df1a7e9efc8a57dc59f0d4c7bb27fb2b\" returns successfully" Dec 16 13:20:47.910080 containerd[1785]: time="2025-12-16T13:20:47.910021206Z" level=info msg="StartContainer for \"024194ce066fa272e4fe35a2db006929fcb1fe735b31505f9eae8636ec912592\" returns successfully" Dec 16 13:20:47.934062 containerd[1785]: time="2025-12-16T13:20:47.934010127Z" level=info msg="StartContainer for \"4f6ed5c8252933670bc888206d2d7c169a6327125ae477ce232020a4d0098b6f\" returns successfully" Dec 16 13:20:48.608163 kubelet[3038]: E1216 13:20:48.607924 3038 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.23.154:55626->10.0.23.160:2379: read: connection timed out" event="&Event{ObjectMeta:{goldmane-666569f655-nz8db.1881b4903c634617 calico-system 1323 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-system,Name:goldmane-666569f655-nz8db,UID:4fb5337e-3d19-4a26-9e2e-58b08d0a6154,APIVersion:v1,ResourceVersion:775,FieldPath:spec.containers{goldmane},},Reason:Pulling,Message:Pulling image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4459-2-2-0-839c7337fa,},FirstTimestamp:2025-12-16 13:17:40 +0000 UTC,LastTimestamp:2025-12-16 13:20:38.130508027 +0000 UTC m=+219.082195154,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-0-839c7337fa,}" Dec 16 13:20:51.131013 kubelet[3038]: E1216 13:20:51.130935 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-nz8db" podUID="4fb5337e-3d19-4a26-9e2e-58b08d0a6154" Dec 16 13:20:51.132113 kubelet[3038]: E1216 13:20:51.131966 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-4vgpv" podUID="2c2ad34d-d82b-4624-87cd-76ece6a8970b" Dec 16 13:20:53.130561 kubelet[3038]: E1216 13:20:53.130379 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-ncz5v" podUID="fe36e307-b3ce-4532-b1d5-65732a7edfab" Dec 16 13:20:54.560927 kubelet[3038]: I1216 13:20:54.560692 3038 status_manager.go:890] "Failed to get status for pod" podUID="14065d61-1c00-4d68-849b-4eefd74615de" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.23.154:55742->10.0.23.160:2379: read: connection timed out" Dec 16 13:20:55.130495 kubelet[3038]: E1216 13:20:55.130352 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8cb67fb9-kbt5r" podUID="14065d61-1c00-4d68-849b-4eefd74615de" Dec 16 13:20:57.524292 kubelet[3038]: E1216 13:20:57.524002 3038 controller.go:195] "Failed to update lease" err="the server was unable to return a response in the time allotted, but may still be processing the request (put leases.coordination.k8s.io ci-4459-2-2-0-839c7337fa)" Dec 16 13:20:58.130717 kubelet[3038]: E1216 13:20:58.130622 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7f659d6578-trd8w" podUID="cb2a00f3-8bc2-4b68-a631-62b5470c7b77" Dec 16 13:20:59.181224 systemd[1]: cri-containerd-c6bd50b56079a6e30ae7803f6b83e6c3df1a7e9efc8a57dc59f0d4c7bb27fb2b.scope: Deactivated successfully. Dec 16 13:20:59.182355 containerd[1785]: time="2025-12-16T13:20:59.182306507Z" level=info msg="received container exit event container_id:\"c6bd50b56079a6e30ae7803f6b83e6c3df1a7e9efc8a57dc59f0d4c7bb27fb2b\" id:\"c6bd50b56079a6e30ae7803f6b83e6c3df1a7e9efc8a57dc59f0d4c7bb27fb2b\" pid:6011 exit_status:1 exited_at:{seconds:1765891259 nanos:181942702}" Dec 16 13:20:59.225740 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c6bd50b56079a6e30ae7803f6b83e6c3df1a7e9efc8a57dc59f0d4c7bb27fb2b-rootfs.mount: Deactivated successfully. Dec 16 13:20:59.825302 kubelet[3038]: I1216 13:20:59.825222 3038 scope.go:117] "RemoveContainer" containerID="1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b" Dec 16 13:20:59.826056 kubelet[3038]: I1216 13:20:59.825842 3038 scope.go:117] "RemoveContainer" containerID="c6bd50b56079a6e30ae7803f6b83e6c3df1a7e9efc8a57dc59f0d4c7bb27fb2b" Dec 16 13:20:59.826244 kubelet[3038]: E1216 13:20:59.826194 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-c67th_tigera-operator(1b0d8679-fb83-49d2-afd3-fb2459b560f9)\"" pod="tigera-operator/tigera-operator-7dcd859c48-c67th" podUID="1b0d8679-fb83-49d2-afd3-fb2459b560f9" Dec 16 13:20:59.827109 containerd[1785]: time="2025-12-16T13:20:59.827072785Z" level=info msg="RemoveContainer for \"1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b\"" Dec 16 13:20:59.835587 containerd[1785]: time="2025-12-16T13:20:59.835534420Z" level=info msg="RemoveContainer for \"1b5e2e85a70cfbbf4548f0587c73c154555c195b5b101d46bdfc8648e36b7b7b\" returns successfully" Dec 16 13:21:01.130982 kubelet[3038]: E1216 13:21:01.130857 3038 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-76768d65dd-q8jbl" podUID="8d366b0a-e187-4483-8bcf-46758c32eaee"