Dec 16 13:38:57.792277 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Dec 12 15:21:28 -00 2025 Dec 16 13:38:57.792304 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022 Dec 16 13:38:57.792316 kernel: BIOS-provided physical RAM map: Dec 16 13:38:57.792323 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Dec 16 13:38:57.792328 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable Dec 16 13:38:57.792334 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Dec 16 13:38:57.792341 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable Dec 16 13:38:57.792347 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Dec 16 13:38:57.792352 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable Dec 16 13:38:57.792360 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Dec 16 13:38:57.792365 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e73efff] usable Dec 16 13:38:57.792371 kernel: BIOS-e820: [mem 0x000000007e73f000-0x000000007e7fffff] reserved Dec 16 13:38:57.792376 kernel: BIOS-e820: [mem 0x000000007e800000-0x000000007ea70fff] usable Dec 16 13:38:57.792382 kernel: BIOS-e820: [mem 0x000000007ea71000-0x000000007eb84fff] reserved Dec 16 13:38:57.792389 kernel: BIOS-e820: [mem 0x000000007eb85000-0x000000007f6ecfff] usable Dec 16 13:38:57.792397 kernel: BIOS-e820: [mem 0x000000007f6ed000-0x000000007f96cfff] reserved Dec 16 13:38:57.792403 kernel: BIOS-e820: [mem 0x000000007f96d000-0x000000007f97efff] ACPI data Dec 16 13:38:57.792409 kernel: BIOS-e820: [mem 0x000000007f97f000-0x000000007f9fefff] ACPI NVS Dec 16 13:38:57.792415 kernel: BIOS-e820: [mem 0x000000007f9ff000-0x000000007fe4efff] usable Dec 16 13:38:57.792421 kernel: BIOS-e820: [mem 0x000000007fe4f000-0x000000007fe52fff] reserved Dec 16 13:38:57.792429 kernel: BIOS-e820: [mem 0x000000007fe53000-0x000000007fe54fff] ACPI NVS Dec 16 13:38:57.792436 kernel: BIOS-e820: [mem 0x000000007fe55000-0x000000007febbfff] usable Dec 16 13:38:57.792442 kernel: BIOS-e820: [mem 0x000000007febc000-0x000000007ff3ffff] reserved Dec 16 13:38:57.792448 kernel: BIOS-e820: [mem 0x000000007ff40000-0x000000007fffffff] ACPI NVS Dec 16 13:38:57.792454 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Dec 16 13:38:57.792468 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Dec 16 13:38:57.792478 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Dec 16 13:38:57.792490 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000047fffffff] usable Dec 16 13:38:57.792500 kernel: NX (Execute Disable) protection: active Dec 16 13:38:57.792506 kernel: APIC: Static calls initialized Dec 16 13:38:57.792512 kernel: e820: update [mem 0x7dd4e018-0x7dd57a57] usable ==> usable Dec 16 13:38:57.792519 kernel: e820: update [mem 0x7dd26018-0x7dd4d457] usable ==> usable Dec 16 13:38:57.792525 kernel: extended physical RAM map: Dec 16 13:38:57.792531 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Dec 16 13:38:57.792537 
kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable Dec 16 13:38:57.792543 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Dec 16 13:38:57.792551 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable Dec 16 13:38:57.792557 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Dec 16 13:38:57.792563 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable Dec 16 13:38:57.792569 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Dec 16 13:38:57.792578 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007dd26017] usable Dec 16 13:38:57.792584 kernel: reserve setup_data: [mem 0x000000007dd26018-0x000000007dd4d457] usable Dec 16 13:38:57.792590 kernel: reserve setup_data: [mem 0x000000007dd4d458-0x000000007dd4e017] usable Dec 16 13:38:57.792599 kernel: reserve setup_data: [mem 0x000000007dd4e018-0x000000007dd57a57] usable Dec 16 13:38:57.792605 kernel: reserve setup_data: [mem 0x000000007dd57a58-0x000000007e73efff] usable Dec 16 13:38:57.792611 kernel: reserve setup_data: [mem 0x000000007e73f000-0x000000007e7fffff] reserved Dec 16 13:38:57.792618 kernel: reserve setup_data: [mem 0x000000007e800000-0x000000007ea70fff] usable Dec 16 13:38:57.792624 kernel: reserve setup_data: [mem 0x000000007ea71000-0x000000007eb84fff] reserved Dec 16 13:38:57.792630 kernel: reserve setup_data: [mem 0x000000007eb85000-0x000000007f6ecfff] usable Dec 16 13:38:57.792636 kernel: reserve setup_data: [mem 0x000000007f6ed000-0x000000007f96cfff] reserved Dec 16 13:38:57.792643 kernel: reserve setup_data: [mem 0x000000007f96d000-0x000000007f97efff] ACPI data Dec 16 13:38:57.792649 kernel: reserve setup_data: [mem 0x000000007f97f000-0x000000007f9fefff] ACPI NVS Dec 16 13:38:57.792657 kernel: reserve setup_data: [mem 0x000000007f9ff000-0x000000007fe4efff] usable Dec 16 13:38:57.792663 kernel: reserve setup_data: [mem 0x000000007fe4f000-0x000000007fe52fff] reserved Dec 16 13:38:57.792670 kernel: reserve setup_data: [mem 0x000000007fe53000-0x000000007fe54fff] ACPI NVS Dec 16 13:38:57.792676 kernel: reserve setup_data: [mem 0x000000007fe55000-0x000000007febbfff] usable Dec 16 13:38:57.792682 kernel: reserve setup_data: [mem 0x000000007febc000-0x000000007ff3ffff] reserved Dec 16 13:38:57.792689 kernel: reserve setup_data: [mem 0x000000007ff40000-0x000000007fffffff] ACPI NVS Dec 16 13:38:57.792695 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Dec 16 13:38:57.792701 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Dec 16 13:38:57.792707 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Dec 16 13:38:57.792714 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000047fffffff] usable Dec 16 13:38:57.792720 kernel: efi: EFI v2.7 by EDK II Dec 16 13:38:57.792728 kernel: efi: SMBIOS=0x7f772000 ACPI=0x7f97e000 ACPI 2.0=0x7f97e014 MEMATTR=0x7e282018 RNG=0x7f972018 Dec 16 13:38:57.792734 kernel: random: crng init done Dec 16 13:38:57.792741 kernel: efi: Remove mem152: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Dec 16 13:38:57.792747 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Dec 16 13:38:57.792754 kernel: secureboot: Secure boot disabled Dec 16 13:38:57.792760 kernel: SMBIOS 2.8 present. 
Dec 16 13:38:57.792766 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022 Dec 16 13:38:57.792772 kernel: DMI: Memory slots populated: 1/1 Dec 16 13:38:57.792779 kernel: Hypervisor detected: KVM Dec 16 13:38:57.792785 kernel: last_pfn = 0x7febc max_arch_pfn = 0x10000000000 Dec 16 13:38:57.792791 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Dec 16 13:38:57.792798 kernel: kvm-clock: using sched offset of 6368234092 cycles Dec 16 13:38:57.792806 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Dec 16 13:38:57.792813 kernel: tsc: Detected 2294.578 MHz processor Dec 16 13:38:57.792820 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Dec 16 13:38:57.792826 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Dec 16 13:38:57.792833 kernel: last_pfn = 0x480000 max_arch_pfn = 0x10000000000 Dec 16 13:38:57.792840 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Dec 16 13:38:57.792847 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Dec 16 13:38:57.792853 kernel: last_pfn = 0x7febc max_arch_pfn = 0x10000000000 Dec 16 13:38:57.792860 kernel: Using GB pages for direct mapping Dec 16 13:38:57.792868 kernel: ACPI: Early table checksum verification disabled Dec 16 13:38:57.792875 kernel: ACPI: RSDP 0x000000007F97E014 000024 (v02 BOCHS ) Dec 16 13:38:57.792890 kernel: ACPI: XSDT 0x000000007F97D0E8 00004C (v01 BOCHS BXPC 00000001 01000013) Dec 16 13:38:57.792897 kernel: ACPI: FACP 0x000000007F977000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 13:38:57.792904 kernel: ACPI: DSDT 0x000000007F978000 004441 (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 13:38:57.792910 kernel: ACPI: FACS 0x000000007F9DD000 000040 Dec 16 13:38:57.792917 kernel: ACPI: APIC 0x000000007F976000 0000B0 (v03 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 13:38:57.792924 kernel: ACPI: MCFG 0x000000007F975000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 13:38:57.792930 kernel: ACPI: WAET 0x000000007F974000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 13:38:57.792938 kernel: ACPI: BGRT 0x000000007F973000 000038 (v01 INTEL EDK2 00000002 01000013) Dec 16 13:38:57.792945 kernel: ACPI: Reserving FACP table memory at [mem 0x7f977000-0x7f9770f3] Dec 16 13:38:57.792952 kernel: ACPI: Reserving DSDT table memory at [mem 0x7f978000-0x7f97c440] Dec 16 13:38:57.792958 kernel: ACPI: Reserving FACS table memory at [mem 0x7f9dd000-0x7f9dd03f] Dec 16 13:38:57.792964 kernel: ACPI: Reserving APIC table memory at [mem 0x7f976000-0x7f9760af] Dec 16 13:38:57.792971 kernel: ACPI: Reserving MCFG table memory at [mem 0x7f975000-0x7f97503b] Dec 16 13:38:57.792977 kernel: ACPI: Reserving WAET table memory at [mem 0x7f974000-0x7f974027] Dec 16 13:38:57.792984 kernel: ACPI: Reserving BGRT table memory at [mem 0x7f973000-0x7f973037] Dec 16 13:38:57.792991 kernel: No NUMA configuration found Dec 16 13:38:57.792999 kernel: Faking a node at [mem 0x0000000000000000-0x000000047fffffff] Dec 16 13:38:57.793006 kernel: NODE_DATA(0) allocated [mem 0x47fff8dc0-0x47fffffff] Dec 16 13:38:57.793013 kernel: Zone ranges: Dec 16 13:38:57.793020 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Dec 16 13:38:57.793026 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Dec 16 13:38:57.793033 kernel: Normal [mem 0x0000000100000000-0x000000047fffffff] Dec 16 13:38:57.793039 kernel: Device empty Dec 16 13:38:57.793046 kernel: Movable zone start for each node Dec 
16 13:38:57.793052 kernel: Early memory node ranges Dec 16 13:38:57.793059 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Dec 16 13:38:57.793067 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff] Dec 16 13:38:57.793074 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff] Dec 16 13:38:57.793080 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff] Dec 16 13:38:57.793087 kernel: node 0: [mem 0x0000000000900000-0x000000007e73efff] Dec 16 13:38:57.793093 kernel: node 0: [mem 0x000000007e800000-0x000000007ea70fff] Dec 16 13:38:57.793100 kernel: node 0: [mem 0x000000007eb85000-0x000000007f6ecfff] Dec 16 13:38:57.793113 kernel: node 0: [mem 0x000000007f9ff000-0x000000007fe4efff] Dec 16 13:38:57.793122 kernel: node 0: [mem 0x000000007fe55000-0x000000007febbfff] Dec 16 13:38:57.793129 kernel: node 0: [mem 0x0000000100000000-0x000000047fffffff] Dec 16 13:38:57.793136 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000047fffffff] Dec 16 13:38:57.793143 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Dec 16 13:38:57.793150 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Dec 16 13:38:57.793159 kernel: On node 0, zone DMA: 8 pages in unavailable ranges Dec 16 13:38:57.793166 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Dec 16 13:38:57.793174 kernel: On node 0, zone DMA: 239 pages in unavailable ranges Dec 16 13:38:57.793181 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges Dec 16 13:38:57.793188 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges Dec 16 13:38:57.793197 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Dec 16 13:38:57.793206 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges Dec 16 13:38:57.793213 kernel: On node 0, zone Normal: 324 pages in unavailable ranges Dec 16 13:38:57.793223 kernel: ACPI: PM-Timer IO Port: 0x608 Dec 16 13:38:57.793235 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Dec 16 13:38:57.793245 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Dec 16 13:38:57.793254 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Dec 16 13:38:57.793261 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Dec 16 13:38:57.793269 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Dec 16 13:38:57.793278 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Dec 16 13:38:57.793285 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Dec 16 13:38:57.793292 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Dec 16 13:38:57.793299 kernel: TSC deadline timer available Dec 16 13:38:57.793306 kernel: CPU topo: Max. logical packages: 8 Dec 16 13:38:57.793314 kernel: CPU topo: Max. logical dies: 8 Dec 16 13:38:57.793321 kernel: CPU topo: Max. dies per package: 1 Dec 16 13:38:57.793328 kernel: CPU topo: Max. threads per core: 1 Dec 16 13:38:57.793335 kernel: CPU topo: Num. cores per package: 1 Dec 16 13:38:57.793344 kernel: CPU topo: Num. 
threads per package: 1 Dec 16 13:38:57.793351 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs Dec 16 13:38:57.793358 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Dec 16 13:38:57.793365 kernel: kvm-guest: KVM setup pv remote TLB flush Dec 16 13:38:57.793373 kernel: kvm-guest: setup PV sched yield Dec 16 13:38:57.793380 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices Dec 16 13:38:57.793391 kernel: Booting paravirtualized kernel on KVM Dec 16 13:38:57.793399 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Dec 16 13:38:57.793410 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1 Dec 16 13:38:57.793419 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Dec 16 13:38:57.793426 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Dec 16 13:38:57.793434 kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 Dec 16 13:38:57.793441 kernel: kvm-guest: PV spinlocks enabled Dec 16 13:38:57.793448 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Dec 16 13:38:57.793456 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022 Dec 16 13:38:57.793464 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Dec 16 13:38:57.793471 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear) Dec 16 13:38:57.793480 kernel: Fallback order for Node 0: 0 Dec 16 13:38:57.793487 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4192374 Dec 16 13:38:57.793494 kernel: Policy zone: Normal Dec 16 13:38:57.793502 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Dec 16 13:38:57.793509 kernel: software IO TLB: area num 8. Dec 16 13:38:57.793516 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1 Dec 16 13:38:57.793524 kernel: ftrace: allocating 40103 entries in 157 pages Dec 16 13:38:57.793531 kernel: ftrace: allocated 157 pages with 5 groups Dec 16 13:38:57.793539 kernel: Dynamic Preempt: voluntary Dec 16 13:38:57.793547 kernel: rcu: Preemptible hierarchical RCU implementation. Dec 16 13:38:57.793555 kernel: rcu: RCU event tracing is enabled. Dec 16 13:38:57.793563 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=8. Dec 16 13:38:57.793570 kernel: Trampoline variant of Tasks RCU enabled. Dec 16 13:38:57.793577 kernel: Rude variant of Tasks RCU enabled. Dec 16 13:38:57.793584 kernel: Tracing variant of Tasks RCU enabled. Dec 16 13:38:57.793592 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Dec 16 13:38:57.793599 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8 Dec 16 13:38:57.793607 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8. Dec 16 13:38:57.793616 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8. Dec 16 13:38:57.793623 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8. 
Dec 16 13:38:57.793630 kernel: NR_IRQS: 33024, nr_irqs: 488, preallocated irqs: 16 Dec 16 13:38:57.793637 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Dec 16 13:38:57.793645 kernel: Console: colour dummy device 80x25 Dec 16 13:38:57.793652 kernel: printk: legacy console [tty0] enabled Dec 16 13:38:57.793659 kernel: printk: legacy console [ttyS0] enabled Dec 16 13:38:57.793666 kernel: ACPI: Core revision 20240827 Dec 16 13:38:57.793674 kernel: APIC: Switch to symmetric I/O mode setup Dec 16 13:38:57.793682 kernel: x2apic enabled Dec 16 13:38:57.793690 kernel: APIC: Switched APIC routing to: physical x2apic Dec 16 13:38:57.793697 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Dec 16 13:38:57.793704 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Dec 16 13:38:57.793712 kernel: kvm-guest: setup PV IPIs Dec 16 13:38:57.793719 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2113334dc36, max_idle_ns: 440795272915 ns Dec 16 13:38:57.793726 kernel: Calibrating delay loop (skipped) preset value.. 4589.15 BogoMIPS (lpj=2294578) Dec 16 13:38:57.793733 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Dec 16 13:38:57.793741 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Dec 16 13:38:57.793749 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Dec 16 13:38:57.793756 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Dec 16 13:38:57.793763 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Dec 16 13:38:57.793798 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Dec 16 13:38:57.793805 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Dec 16 13:38:57.793812 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Dec 16 13:38:57.793820 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Dec 16 13:38:57.793826 kernel: TAA: Mitigation: Clear CPU buffers Dec 16 13:38:57.793833 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers Dec 16 13:38:57.793840 kernel: active return thunk: its_return_thunk Dec 16 13:38:57.793847 kernel: ITS: Mitigation: Aligned branch/return thunks Dec 16 13:38:57.793854 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Dec 16 13:38:57.793863 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Dec 16 13:38:57.793870 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Dec 16 13:38:57.793877 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Dec 16 13:38:57.793890 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Dec 16 13:38:57.793897 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Dec 16 13:38:57.793904 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers' Dec 16 13:38:57.793911 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Dec 16 13:38:57.793918 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Dec 16 13:38:57.793924 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Dec 16 13:38:57.793931 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Dec 16 13:38:57.793939 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8 Dec 16 13:38:57.793946 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format. 
Dec 16 13:38:57.793953 kernel: Freeing SMP alternatives memory: 32K Dec 16 13:38:57.793960 kernel: pid_max: default: 32768 minimum: 301 Dec 16 13:38:57.793966 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Dec 16 13:38:57.793973 kernel: landlock: Up and running. Dec 16 13:38:57.793980 kernel: SELinux: Initializing. Dec 16 13:38:57.793987 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear) Dec 16 13:38:57.793994 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear) Dec 16 13:38:57.794001 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6) Dec 16 13:38:57.794008 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver. Dec 16 13:38:57.794017 kernel: ... version: 2 Dec 16 13:38:57.794024 kernel: ... bit width: 48 Dec 16 13:38:57.794031 kernel: ... generic registers: 8 Dec 16 13:38:57.794038 kernel: ... value mask: 0000ffffffffffff Dec 16 13:38:57.794045 kernel: ... max period: 00007fffffffffff Dec 16 13:38:57.794052 kernel: ... fixed-purpose events: 3 Dec 16 13:38:57.794059 kernel: ... event mask: 00000007000000ff Dec 16 13:38:57.794066 kernel: signal: max sigframe size: 3632 Dec 16 13:38:57.794073 kernel: rcu: Hierarchical SRCU implementation. Dec 16 13:38:57.794080 kernel: rcu: Max phase no-delay instances is 400. Dec 16 13:38:57.794089 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Dec 16 13:38:57.794096 kernel: smp: Bringing up secondary CPUs ... Dec 16 13:38:57.794103 kernel: smpboot: x86: Booting SMP configuration: Dec 16 13:38:57.794110 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 Dec 16 13:38:57.794117 kernel: smp: Brought up 1 node, 8 CPUs Dec 16 13:38:57.794124 kernel: smpboot: Total of 8 processors activated (36713.24 BogoMIPS) Dec 16 13:38:57.794131 kernel: Memory: 16308696K/16769496K available (14336K kernel code, 2444K rwdata, 26064K rodata, 46188K init, 2572K bss, 453240K reserved, 0K cma-reserved) Dec 16 13:38:57.794138 kernel: devtmpfs: initialized Dec 16 13:38:57.794145 kernel: x86/mm: Memory block size: 128MB Dec 16 13:38:57.794154 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes) Dec 16 13:38:57.794161 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes) Dec 16 13:38:57.794168 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes) Dec 16 13:38:57.794175 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7f97f000-0x7f9fefff] (524288 bytes) Dec 16 13:38:57.794182 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fe53000-0x7fe54fff] (8192 bytes) Dec 16 13:38:57.794189 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff40000-0x7fffffff] (786432 bytes) Dec 16 13:38:57.794196 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Dec 16 13:38:57.794204 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear) Dec 16 13:38:57.794212 kernel: pinctrl core: initialized pinctrl subsystem Dec 16 13:38:57.794219 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Dec 16 13:38:57.794227 kernel: audit: initializing netlink subsys (disabled) Dec 16 13:38:57.794234 kernel: audit: type=2000 audit(1765892335.347:1): state=initialized audit_enabled=0 res=1 Dec 16 13:38:57.794241 kernel: thermal_sys: Registered thermal governor 'step_wise' Dec 16 13:38:57.794248 kernel: thermal_sys: Registered 
thermal governor 'user_space' Dec 16 13:38:57.794255 kernel: cpuidle: using governor menu Dec 16 13:38:57.794262 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Dec 16 13:38:57.794269 kernel: dca service started, version 1.12.1 Dec 16 13:38:57.794277 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Dec 16 13:38:57.794284 kernel: PCI: Using configuration type 1 for base access Dec 16 13:38:57.794292 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Dec 16 13:38:57.794299 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Dec 16 13:38:57.794306 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Dec 16 13:38:57.794313 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Dec 16 13:38:57.794320 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Dec 16 13:38:57.794327 kernel: ACPI: Added _OSI(Module Device) Dec 16 13:38:57.794334 kernel: ACPI: Added _OSI(Processor Device) Dec 16 13:38:57.794343 kernel: ACPI: Added _OSI(Processor Aggregator Device) Dec 16 13:38:57.794350 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Dec 16 13:38:57.794357 kernel: ACPI: Interpreter enabled Dec 16 13:38:57.794364 kernel: ACPI: PM: (supports S0 S3 S5) Dec 16 13:38:57.794371 kernel: ACPI: Using IOAPIC for interrupt routing Dec 16 13:38:57.794378 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Dec 16 13:38:57.794385 kernel: PCI: Using E820 reservations for host bridge windows Dec 16 13:38:57.794392 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Dec 16 13:38:57.794399 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Dec 16 13:38:57.794528 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Dec 16 13:38:57.794604 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Dec 16 13:38:57.794694 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Dec 16 13:38:57.794707 kernel: PCI host bridge to bus 0000:00 Dec 16 13:38:57.794805 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Dec 16 13:38:57.794869 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Dec 16 13:38:57.794940 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Dec 16 13:38:57.795003 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window] Dec 16 13:38:57.795060 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Dec 16 13:38:57.795120 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window] Dec 16 13:38:57.795180 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Dec 16 13:38:57.795263 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Dec 16 13:38:57.795343 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint Dec 16 13:38:57.795418 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref] Dec 16 13:38:57.795486 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref] Dec 16 13:38:57.795559 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff] Dec 16 13:38:57.795628 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Dec 16 13:38:57.795696 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Dec 16 13:38:57.795771 kernel: pci 0000:00:02.0: 
[1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.795841 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff] Dec 16 13:38:57.795926 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Dec 16 13:38:57.796000 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Dec 16 13:38:57.796068 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Dec 16 13:38:57.796135 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:38:57.796207 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.796275 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff] Dec 16 13:38:57.796344 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Dec 16 13:38:57.796411 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Dec 16 13:38:57.796480 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Dec 16 13:38:57.796552 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.796620 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff] Dec 16 13:38:57.796687 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Dec 16 13:38:57.796753 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Dec 16 13:38:57.796820 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Dec 16 13:38:57.796957 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.797047 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff] Dec 16 13:38:57.797116 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Dec 16 13:38:57.797183 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Dec 16 13:38:57.797249 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Dec 16 13:38:57.797321 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.797389 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff] Dec 16 13:38:57.797459 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Dec 16 13:38:57.797525 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Dec 16 13:38:57.797593 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Dec 16 13:38:57.797666 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.797736 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff] Dec 16 13:38:57.797814 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Dec 16 13:38:57.797890 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Dec 16 13:38:57.797962 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Dec 16 13:38:57.798039 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.798107 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff] Dec 16 13:38:57.798174 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Dec 16 13:38:57.798242 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Dec 16 13:38:57.798309 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Dec 16 13:38:57.798382 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.798448 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff] Dec 16 13:38:57.798512 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Dec 16 13:38:57.798576 kernel: pci 0000:00:02.7: bridge window [mem 
0x83200000-0x833fffff] Dec 16 13:38:57.798640 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Dec 16 13:38:57.798712 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.798777 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff] Dec 16 13:38:57.798844 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Dec 16 13:38:57.798916 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Dec 16 13:38:57.798981 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Dec 16 13:38:57.799051 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.799138 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff] Dec 16 13:38:57.799208 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Dec 16 13:38:57.799274 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Dec 16 13:38:57.799344 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Dec 16 13:38:57.799432 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.799501 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff] Dec 16 13:38:57.799569 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Dec 16 13:38:57.799635 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Dec 16 13:38:57.799703 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Dec 16 13:38:57.799774 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.799845 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff] Dec 16 13:38:57.799928 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Dec 16 13:38:57.799996 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Dec 16 13:38:57.800063 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Dec 16 13:38:57.800137 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.800209 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff] Dec 16 13:38:57.800277 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Dec 16 13:38:57.800344 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Dec 16 13:38:57.800412 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Dec 16 13:38:57.800484 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.800552 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff] Dec 16 13:38:57.800618 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Dec 16 13:38:57.800689 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Dec 16 13:38:57.800755 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Dec 16 13:38:57.800827 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.800902 kernel: pci 0000:00:03.6: BAR 0 [mem 0x8438f000-0x8438ffff] Dec 16 13:38:57.800970 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Dec 16 13:38:57.801037 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Dec 16 13:38:57.801105 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Dec 16 13:38:57.801185 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.801253 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff] Dec 16 13:38:57.801320 kernel: pci 0000:00:03.7: PCI 
bridge to [bus 11] Dec 16 13:38:57.801387 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Dec 16 13:38:57.801453 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Dec 16 13:38:57.801524 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.801592 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff] Dec 16 13:38:57.801661 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Dec 16 13:38:57.801732 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Dec 16 13:38:57.801812 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Dec 16 13:38:57.801891 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.801960 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff] Dec 16 13:38:57.802027 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Dec 16 13:38:57.802093 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Dec 16 13:38:57.802173 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Dec 16 13:38:57.802249 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.802317 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff] Dec 16 13:38:57.802385 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Dec 16 13:38:57.802449 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Dec 16 13:38:57.802514 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Dec 16 13:38:57.802585 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.802654 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff] Dec 16 13:38:57.802726 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Dec 16 13:38:57.802794 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Dec 16 13:38:57.802861 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Dec 16 13:38:57.802943 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.803013 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff] Dec 16 13:38:57.803081 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Dec 16 13:38:57.803148 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Dec 16 13:38:57.803219 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Dec 16 13:38:57.803290 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.803358 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff] Dec 16 13:38:57.803426 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Dec 16 13:38:57.803493 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Dec 16 13:38:57.803559 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Dec 16 13:38:57.803632 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.803702 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff] Dec 16 13:38:57.803769 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Dec 16 13:38:57.803836 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Dec 16 13:38:57.803910 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Dec 16 13:38:57.803986 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.804054 kernel: pci 0000:00:04.7: 
BAR 0 [mem 0x84386000-0x84386fff] Dec 16 13:38:57.804120 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Dec 16 13:38:57.804187 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Dec 16 13:38:57.804253 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Dec 16 13:38:57.804328 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.804396 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff] Dec 16 13:38:57.804470 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Dec 16 13:38:57.804537 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Dec 16 13:38:57.804604 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Dec 16 13:38:57.804675 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.804743 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff] Dec 16 13:38:57.804810 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Dec 16 13:38:57.804878 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Dec 16 13:38:57.804974 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Dec 16 13:38:57.805047 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.805115 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff] Dec 16 13:38:57.805183 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Dec 16 13:38:57.805250 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Dec 16 13:38:57.805317 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Dec 16 13:38:57.805389 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.805461 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff] Dec 16 13:38:57.805528 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Dec 16 13:38:57.805606 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Dec 16 13:38:57.805674 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Dec 16 13:38:57.805746 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 13:38:57.805822 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff] Dec 16 13:38:57.805897 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Dec 16 13:38:57.805968 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Dec 16 13:38:57.806035 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Dec 16 13:38:57.806107 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Dec 16 13:38:57.806175 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Dec 16 13:38:57.806247 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Dec 16 13:38:57.806315 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f] Dec 16 13:38:57.806382 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff] Dec 16 13:38:57.806457 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Dec 16 13:38:57.806525 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f] Dec 16 13:38:57.806608 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Dec 16 13:38:57.806678 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit] Dec 16 13:38:57.806748 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Dec 16 13:38:57.806817 kernel: pci 
0000:01:00.0: bridge window [io 0x6000-0x6fff] Dec 16 13:38:57.806897 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Dec 16 13:38:57.806974 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:38:57.807044 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Dec 16 13:38:57.807128 kernel: pci_bus 0000:02: extended config space not accessible Dec 16 13:38:57.807139 kernel: acpiphp: Slot [1] registered Dec 16 13:38:57.807147 kernel: acpiphp: Slot [0] registered Dec 16 13:38:57.807154 kernel: acpiphp: Slot [2] registered Dec 16 13:38:57.807162 kernel: acpiphp: Slot [3] registered Dec 16 13:38:57.807172 kernel: acpiphp: Slot [4] registered Dec 16 13:38:57.807181 kernel: acpiphp: Slot [5] registered Dec 16 13:38:57.807188 kernel: acpiphp: Slot [6] registered Dec 16 13:38:57.807197 kernel: acpiphp: Slot [7] registered Dec 16 13:38:57.807205 kernel: acpiphp: Slot [8] registered Dec 16 13:38:57.807212 kernel: acpiphp: Slot [9] registered Dec 16 13:38:57.807220 kernel: acpiphp: Slot [10] registered Dec 16 13:38:57.807227 kernel: acpiphp: Slot [11] registered Dec 16 13:38:57.807235 kernel: acpiphp: Slot [12] registered Dec 16 13:38:57.807242 kernel: acpiphp: Slot [13] registered Dec 16 13:38:57.807250 kernel: acpiphp: Slot [14] registered Dec 16 13:38:57.807259 kernel: acpiphp: Slot [15] registered Dec 16 13:38:57.807266 kernel: acpiphp: Slot [16] registered Dec 16 13:38:57.807274 kernel: acpiphp: Slot [17] registered Dec 16 13:38:57.807281 kernel: acpiphp: Slot [18] registered Dec 16 13:38:57.807289 kernel: acpiphp: Slot [19] registered Dec 16 13:38:57.807296 kernel: acpiphp: Slot [20] registered Dec 16 13:38:57.807303 kernel: acpiphp: Slot [21] registered Dec 16 13:38:57.807311 kernel: acpiphp: Slot [22] registered Dec 16 13:38:57.807319 kernel: acpiphp: Slot [23] registered Dec 16 13:38:57.807328 kernel: acpiphp: Slot [24] registered Dec 16 13:38:57.807335 kernel: acpiphp: Slot [25] registered Dec 16 13:38:57.807343 kernel: acpiphp: Slot [26] registered Dec 16 13:38:57.807350 kernel: acpiphp: Slot [27] registered Dec 16 13:38:57.807358 kernel: acpiphp: Slot [28] registered Dec 16 13:38:57.807365 kernel: acpiphp: Slot [29] registered Dec 16 13:38:57.807372 kernel: acpiphp: Slot [30] registered Dec 16 13:38:57.807379 kernel: acpiphp: Slot [31] registered Dec 16 13:38:57.807453 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint Dec 16 13:38:57.807527 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f] Dec 16 13:38:57.807594 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Dec 16 13:38:57.807603 kernel: acpiphp: Slot [0-2] registered Dec 16 13:38:57.807678 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Dec 16 13:38:57.807748 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff] Dec 16 13:38:57.807820 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref] Dec 16 13:38:57.807898 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref] Dec 16 13:38:57.807969 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Dec 16 13:38:57.807982 kernel: acpiphp: Slot [0-3] registered Dec 16 13:38:57.808057 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Dec 16 13:38:57.808128 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff] Dec 16 13:38:57.808197 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref] Dec 16 13:38:57.808266 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Dec 16 13:38:57.808277 
kernel: acpiphp: Slot [0-4] registered Dec 16 13:38:57.808351 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Dec 16 13:38:57.808425 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref] Dec 16 13:38:57.808494 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Dec 16 13:38:57.808504 kernel: acpiphp: Slot [0-5] registered Dec 16 13:38:57.808577 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Dec 16 13:38:57.808648 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff] Dec 16 13:38:57.808719 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref] Dec 16 13:38:57.808789 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Dec 16 13:38:57.808801 kernel: acpiphp: Slot [0-6] registered Dec 16 13:38:57.808868 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Dec 16 13:38:57.808878 kernel: acpiphp: Slot [0-7] registered Dec 16 13:38:57.808969 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Dec 16 13:38:57.808980 kernel: acpiphp: Slot [0-8] registered Dec 16 13:38:57.809048 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Dec 16 13:38:57.809058 kernel: acpiphp: Slot [0-9] registered Dec 16 13:38:57.809125 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Dec 16 13:38:57.809137 kernel: acpiphp: Slot [0-10] registered Dec 16 13:38:57.809204 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Dec 16 13:38:57.809214 kernel: acpiphp: Slot [0-11] registered Dec 16 13:38:57.809281 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Dec 16 13:38:57.809292 kernel: acpiphp: Slot [0-12] registered Dec 16 13:38:57.809359 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Dec 16 13:38:57.809369 kernel: acpiphp: Slot [0-13] registered Dec 16 13:38:57.809439 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Dec 16 13:38:57.809449 kernel: acpiphp: Slot [0-14] registered Dec 16 13:38:57.809516 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Dec 16 13:38:57.809526 kernel: acpiphp: Slot [0-15] registered Dec 16 13:38:57.809593 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Dec 16 13:38:57.809602 kernel: acpiphp: Slot [0-16] registered Dec 16 13:38:57.809669 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Dec 16 13:38:57.809679 kernel: acpiphp: Slot [0-17] registered Dec 16 13:38:57.809748 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Dec 16 13:38:57.809758 kernel: acpiphp: Slot [0-18] registered Dec 16 13:38:57.809834 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Dec 16 13:38:57.809844 kernel: acpiphp: Slot [0-19] registered Dec 16 13:38:57.809917 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Dec 16 13:38:57.809927 kernel: acpiphp: Slot [0-20] registered Dec 16 13:38:57.809992 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Dec 16 13:38:57.810002 kernel: acpiphp: Slot [0-21] registered Dec 16 13:38:57.810071 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Dec 16 13:38:57.810085 kernel: acpiphp: Slot [0-22] registered Dec 16 13:38:57.810150 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Dec 16 13:38:57.810161 kernel: acpiphp: Slot [0-23] registered Dec 16 13:38:57.810227 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Dec 16 13:38:57.810237 kernel: acpiphp: Slot [0-24] registered Dec 16 13:38:57.810309 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Dec 16 13:38:57.810319 kernel: acpiphp: Slot [0-25] registered Dec 16 13:38:57.810387 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Dec 16 13:38:57.810397 kernel: acpiphp: Slot [0-26] registered Dec 16 13:38:57.810461 kernel: pci 0000:00:05.1: PCI bridge to [bus 
1b] Dec 16 13:38:57.810472 kernel: acpiphp: Slot [0-27] registered Dec 16 13:38:57.810538 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Dec 16 13:38:57.810548 kernel: acpiphp: Slot [0-28] registered Dec 16 13:38:57.810614 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Dec 16 13:38:57.810623 kernel: acpiphp: Slot [0-29] registered Dec 16 13:38:57.810692 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Dec 16 13:38:57.810702 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Dec 16 13:38:57.810710 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Dec 16 13:38:57.810718 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Dec 16 13:38:57.810726 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Dec 16 13:38:57.810733 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Dec 16 13:38:57.810741 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Dec 16 13:38:57.810748 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Dec 16 13:38:57.810758 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Dec 16 13:38:57.810765 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Dec 16 13:38:57.810773 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Dec 16 13:38:57.810780 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Dec 16 13:38:57.810788 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Dec 16 13:38:57.810796 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Dec 16 13:38:57.810803 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Dec 16 13:38:57.810811 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Dec 16 13:38:57.810818 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Dec 16 13:38:57.810828 kernel: iommu: Default domain type: Translated Dec 16 13:38:57.810835 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Dec 16 13:38:57.810843 kernel: efivars: Registered efivars operations Dec 16 13:38:57.810850 kernel: PCI: Using ACPI for IRQ routing Dec 16 13:38:57.810863 kernel: PCI: pci_cache_line_size set to 64 bytes Dec 16 13:38:57.810871 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Dec 16 13:38:57.810878 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Dec 16 13:38:57.810896 kernel: e820: reserve RAM buffer [mem 0x7dd26018-0x7fffffff] Dec 16 13:38:57.810907 kernel: e820: reserve RAM buffer [mem 0x7dd4e018-0x7fffffff] Dec 16 13:38:57.810918 kernel: e820: reserve RAM buffer [mem 0x7e73f000-0x7fffffff] Dec 16 13:38:57.810935 kernel: e820: reserve RAM buffer [mem 0x7ea71000-0x7fffffff] Dec 16 13:38:57.810949 kernel: e820: reserve RAM buffer [mem 0x7f6ed000-0x7fffffff] Dec 16 13:38:57.810960 kernel: e820: reserve RAM buffer [mem 0x7fe4f000-0x7fffffff] Dec 16 13:38:57.810971 kernel: e820: reserve RAM buffer [mem 0x7febc000-0x7fffffff] Dec 16 13:38:57.811073 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Dec 16 13:38:57.811142 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Dec 16 13:38:57.811210 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Dec 16 13:38:57.811220 kernel: vgaarb: loaded Dec 16 13:38:57.811231 kernel: clocksource: Switched to clocksource kvm-clock Dec 16 13:38:57.811238 kernel: VFS: Disk quotas dquot_6.6.0 Dec 16 13:38:57.811246 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 16 13:38:57.811254 kernel: pnp: PnP ACPI init Dec 16 13:38:57.811333 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] 
has been reserved Dec 16 13:38:57.811344 kernel: pnp: PnP ACPI: found 5 devices Dec 16 13:38:57.811352 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Dec 16 13:38:57.811359 kernel: NET: Registered PF_INET protocol family Dec 16 13:38:57.811370 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Dec 16 13:38:57.811377 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Dec 16 13:38:57.811385 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 16 13:38:57.811393 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Dec 16 13:38:57.811401 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Dec 16 13:38:57.811408 kernel: TCP: Hash tables configured (established 131072 bind 65536) Dec 16 13:38:57.811416 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 16 13:38:57.811423 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 16 13:38:57.811441 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 16 13:38:57.811451 kernel: NET: Registered PF_XDP protocol family Dec 16 13:38:57.811528 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Dec 16 13:38:57.811598 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Dec 16 13:38:57.811668 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Dec 16 13:38:57.811738 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Dec 16 13:38:57.811809 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Dec 16 13:38:57.811878 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Dec 16 13:38:57.811953 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Dec 16 13:38:57.812025 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Dec 16 13:38:57.812094 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Dec 16 13:38:57.812162 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Dec 16 13:38:57.812235 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Dec 16 13:38:57.812303 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Dec 16 13:38:57.812370 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Dec 16 13:38:57.812438 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Dec 16 13:38:57.812507 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Dec 16 13:38:57.812578 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Dec 16 13:38:57.812645 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Dec 16 13:38:57.812712 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Dec 16 13:38:57.812781 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Dec 16 13:38:57.812848 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Dec 16 13:38:57.812929 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Dec 16 13:38:57.813000 kernel: pci 
0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Dec 16 13:38:57.813094 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Dec 16 13:38:57.813163 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Dec 16 13:38:57.813231 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Dec 16 13:38:57.813299 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Dec 16 13:38:57.813366 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Dec 16 13:38:57.813434 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Dec 16 13:38:57.813501 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Dec 16 13:38:57.813571 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned Dec 16 13:38:57.813641 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned Dec 16 13:38:57.813708 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned Dec 16 13:38:57.813784 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned Dec 16 13:38:57.813853 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned Dec 16 13:38:57.813930 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned Dec 16 13:38:57.813998 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned Dec 16 13:38:57.814066 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned Dec 16 13:38:57.814133 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned Dec 16 13:38:57.814203 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned Dec 16 13:38:57.814271 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned Dec 16 13:38:57.814338 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned Dec 16 13:38:57.814404 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned Dec 16 13:38:57.814472 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.814539 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.814606 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.814673 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.814743 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.814809 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.814886 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.814956 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.815024 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.815091 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.815158 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.815225 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.815295 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.815361 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.815426 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 
13:38:57.815491 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.815556 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.815622 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.815689 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.815754 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.815821 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.815897 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.815992 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.816085 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.816178 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.816270 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.816362 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.816452 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.816551 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.816633 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.816699 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned Dec 16 13:38:57.816800 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned Dec 16 13:38:57.816868 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned Dec 16 13:38:57.816944 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned Dec 16 13:38:57.817012 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned Dec 16 13:38:57.817082 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned Dec 16 13:38:57.817149 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff]: assigned Dec 16 13:38:57.817217 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned Dec 16 13:38:57.817285 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned Dec 16 13:38:57.817352 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned Dec 16 13:38:57.817419 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned Dec 16 13:38:57.817485 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned Dec 16 13:38:57.817552 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned Dec 16 13:38:57.817621 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.817688 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.817754 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.817832 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.817907 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.817975 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.818041 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.818106 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.818171 kernel: pci 
0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.818240 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.818304 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.818369 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.818433 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.818498 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.818563 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.818627 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.818692 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.818760 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.818825 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.818900 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.818968 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.819039 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.819106 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.819178 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.819245 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.819326 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.819395 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.819463 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.819531 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 13:38:57.819602 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Dec 16 13:38:57.819673 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Dec 16 13:38:57.819744 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Dec 16 13:38:57.819814 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Dec 16 13:38:57.819893 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:38:57.819962 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Dec 16 13:38:57.820030 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Dec 16 13:38:57.820096 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Dec 16 13:38:57.820163 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:38:57.820234 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned Dec 16 13:38:57.820302 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Dec 16 13:38:57.820369 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Dec 16 13:38:57.820439 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Dec 16 13:38:57.820509 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Dec 16 13:38:57.820576 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Dec 16 13:38:57.820647 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Dec 16 
13:38:57.820715 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Dec 16 13:38:57.820782 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Dec 16 13:38:57.820856 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Dec 16 13:38:57.820941 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Dec 16 13:38:57.821042 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Dec 16 13:38:57.821125 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Dec 16 13:38:57.821204 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Dec 16 13:38:57.821289 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Dec 16 13:38:57.821357 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Dec 16 13:38:57.821427 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Dec 16 13:38:57.821495 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Dec 16 13:38:57.821562 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Dec 16 13:38:57.821628 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Dec 16 13:38:57.821697 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Dec 16 13:38:57.821763 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Dec 16 13:38:57.821847 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Dec 16 13:38:57.821922 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Dec 16 13:38:57.821991 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Dec 16 13:38:57.822058 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Dec 16 13:38:57.822124 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Dec 16 13:38:57.822189 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Dec 16 13:38:57.822253 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Dec 16 13:38:57.822318 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Dec 16 13:38:57.822385 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Dec 16 13:38:57.822448 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Dec 16 13:38:57.822512 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Dec 16 13:38:57.822576 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Dec 16 13:38:57.822648 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Dec 16 13:38:57.822711 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Dec 16 13:38:57.822774 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Dec 16 13:38:57.822839 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Dec 16 13:38:57.822911 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Dec 16 13:38:57.822976 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Dec 16 13:38:57.823043 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Dec 16 13:38:57.823107 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Dec 16 13:38:57.823172 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Dec 16 13:38:57.823237 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Dec 16 13:38:57.823304 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Dec 16 13:38:57.823370 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Dec 16 
13:38:57.823438 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Dec 16 13:38:57.823503 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Dec 16 13:38:57.823572 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Dec 16 13:38:57.823636 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Dec 16 13:38:57.823702 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Dec 16 13:38:57.823768 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Dec 16 13:38:57.823836 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Dec 16 13:38:57.823911 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Dec 16 13:38:57.823987 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Dec 16 13:38:57.824060 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Dec 16 13:38:57.824127 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Dec 16 13:38:57.824194 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Dec 16 13:38:57.824261 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Dec 16 13:38:57.824328 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Dec 16 13:38:57.824394 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Dec 16 13:38:57.824459 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Dec 16 13:38:57.824527 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Dec 16 13:38:57.824592 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Dec 16 13:38:57.824662 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Dec 16 13:38:57.824727 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Dec 16 13:38:57.824794 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Dec 16 13:38:57.824861 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff] Dec 16 13:38:57.824941 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Dec 16 13:38:57.825008 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Dec 16 13:38:57.825080 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Dec 16 13:38:57.825148 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Dec 16 13:38:57.825216 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Dec 16 13:38:57.825283 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Dec 16 13:38:57.825352 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Dec 16 13:38:57.825421 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Dec 16 13:38:57.825489 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Dec 16 13:38:57.825556 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Dec 16 13:38:57.825627 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Dec 16 13:38:57.825698 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Dec 16 13:38:57.825765 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Dec 16 13:38:57.825842 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Dec 16 13:38:57.825919 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Dec 16 13:38:57.825988 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Dec 16 13:38:57.826054 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Dec 16 13:38:57.826122 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Dec 16 
13:38:57.826190 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Dec 16 13:38:57.826255 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Dec 16 13:38:57.826321 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Dec 16 13:38:57.826386 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Dec 16 13:38:57.826453 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Dec 16 13:38:57.826518 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Dec 16 13:38:57.826586 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Dec 16 13:38:57.826651 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Dec 16 13:38:57.826717 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Dec 16 13:38:57.826782 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Dec 16 13:38:57.826848 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Dec 16 13:38:57.826924 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Dec 16 13:38:57.826991 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Dec 16 13:38:57.827055 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Dec 16 13:38:57.827113 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Dec 16 13:38:57.827172 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Dec 16 13:38:57.827235 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Dec 16 13:38:57.827293 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Dec 16 13:38:57.827361 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Dec 16 13:38:57.827422 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Dec 16 13:38:57.827485 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:38:57.827553 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Dec 16 13:38:57.827616 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Dec 16 13:38:57.827683 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Dec 16 13:38:57.827751 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Dec 16 13:38:57.827814 kernel: pci_bus 0000:03: resource 2 [mem 0x380800000000-0x380fffffffff 64bit pref] Dec 16 13:38:57.827891 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Dec 16 13:38:57.827958 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Dec 16 13:38:57.828030 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Dec 16 13:38:57.828097 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Dec 16 13:38:57.828175 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Dec 16 13:38:57.828237 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Dec 16 13:38:57.828303 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Dec 16 13:38:57.828368 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Dec 16 13:38:57.828434 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Dec 16 13:38:57.828497 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Dec 16 13:38:57.828564 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Dec 16 13:38:57.828627 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Dec 16 13:38:57.828698 kernel: pci_bus 0000:0a: 
resource 1 [mem 0x83000000-0x831fffff] Dec 16 13:38:57.828761 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Dec 16 13:38:57.828831 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Dec 16 13:38:57.828922 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit pref] Dec 16 13:38:57.829000 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Dec 16 13:38:57.829064 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Dec 16 13:38:57.829131 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Dec 16 13:38:57.829197 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Dec 16 13:38:57.829264 kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Dec 16 13:38:57.829328 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Dec 16 13:38:57.829394 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Dec 16 13:38:57.829458 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Dec 16 13:38:57.829525 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Dec 16 13:38:57.829591 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Dec 16 13:38:57.829658 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Dec 16 13:38:57.829721 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Dec 16 13:38:57.829799 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Dec 16 13:38:57.829863 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Dec 16 13:38:57.829933 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Dec 16 13:38:57.830009 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Dec 16 13:38:57.830077 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Dec 16 13:38:57.830140 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Dec 16 13:38:57.830205 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Dec 16 13:38:57.830269 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Dec 16 13:38:57.830331 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Dec 16 13:38:57.830397 kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Dec 16 13:38:57.830463 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Dec 16 13:38:57.830526 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Dec 16 13:38:57.830593 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Dec 16 13:38:57.830656 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] Dec 16 13:38:57.830719 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Dec 16 13:38:57.830784 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Dec 16 13:38:57.830848 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Dec 16 13:38:57.830922 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Dec 16 13:38:57.830991 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Dec 16 13:38:57.831053 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Dec 16 13:38:57.831116 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Dec 16 13:38:57.831182 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Dec 16 13:38:57.831245 kernel: pci_bus 0000:19: resource 1 [mem 
0x81200000-0x813fffff] Dec 16 13:38:57.831308 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Dec 16 13:38:57.831377 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Dec 16 13:38:57.831440 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Dec 16 13:38:57.831503 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Dec 16 13:38:57.831574 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Dec 16 13:38:57.831637 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Dec 16 13:38:57.831699 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Dec 16 13:38:57.831769 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Dec 16 13:38:57.831831 kernel: pci_bus 0000:1c: resource 1 [mem 0x80c00000-0x80dfffff] Dec 16 13:38:57.831901 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Dec 16 13:38:57.831968 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Dec 16 13:38:57.832031 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Dec 16 13:38:57.832092 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit pref] Dec 16 13:38:57.832157 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Dec 16 13:38:57.832221 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Dec 16 13:38:57.832281 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Dec 16 13:38:57.832291 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Dec 16 13:38:57.832299 kernel: PCI: CLS 0 bytes, default 64 Dec 16 13:38:57.832307 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Dec 16 13:38:57.832314 kernel: software IO TLB: mapped [mem 0x0000000077e7e000-0x000000007be7e000] (64MB) Dec 16 13:38:57.832322 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Dec 16 13:38:57.832330 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2113334dc36, max_idle_ns: 440795272915 ns Dec 16 13:38:57.832340 kernel: Initialise system trusted keyrings Dec 16 13:38:57.832348 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Dec 16 13:38:57.832355 kernel: Key type asymmetric registered Dec 16 13:38:57.832363 kernel: Asymmetric key parser 'x509' registered Dec 16 13:38:57.832370 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Dec 16 13:38:57.832378 kernel: io scheduler mq-deadline registered Dec 16 13:38:57.832385 kernel: io scheduler kyber registered Dec 16 13:38:57.832393 kernel: io scheduler bfq registered Dec 16 13:38:57.832462 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Dec 16 13:38:57.832532 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Dec 16 13:38:57.832600 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Dec 16 13:38:57.832667 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Dec 16 13:38:57.832734 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Dec 16 13:38:57.832800 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Dec 16 13:38:57.832868 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Dec 16 13:38:57.832951 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Dec 16 13:38:57.833019 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Dec 16 13:38:57.833085 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Dec 16 13:38:57.833151 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Dec 16 13:38:57.833217 kernel: pcieport 
0000:00:02.5: AER: enabled with IRQ 29 Dec 16 13:38:57.833283 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Dec 16 13:38:57.833351 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Dec 16 13:38:57.833417 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Dec 16 13:38:57.833483 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Dec 16 13:38:57.833493 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Dec 16 13:38:57.833558 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Dec 16 13:38:57.833626 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Dec 16 13:38:57.833697 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Dec 16 13:38:57.833766 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Dec 16 13:38:57.833843 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Dec 16 13:38:57.833924 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Dec 16 13:38:57.833994 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Dec 16 13:38:57.834063 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Dec 16 13:38:57.834127 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Dec 16 13:38:57.834190 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Dec 16 13:38:57.834256 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Dec 16 13:38:57.834322 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Dec 16 13:38:57.834389 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Dec 16 13:38:57.834457 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Dec 16 13:38:57.834524 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Dec 16 13:38:57.834589 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39 Dec 16 13:38:57.834599 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Dec 16 13:38:57.834663 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Dec 16 13:38:57.834728 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Dec 16 13:38:57.834795 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Dec 16 13:38:57.834862 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Dec 16 13:38:57.834941 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Dec 16 13:38:57.835009 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Dec 16 13:38:57.835078 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Dec 16 13:38:57.835146 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Dec 16 13:38:57.835214 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Dec 16 13:38:57.835282 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Dec 16 13:38:57.835349 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Dec 16 13:38:57.835414 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Dec 16 13:38:57.835479 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Dec 16 13:38:57.835549 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Dec 16 13:38:57.835617 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Dec 16 13:38:57.835684 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Dec 16 13:38:57.835694 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Dec 16 13:38:57.835760 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Dec 16 13:38:57.835828 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 48 Dec 16 13:38:57.835903 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Dec 16 13:38:57.835972 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Dec 16 13:38:57.836043 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Dec 16 13:38:57.836109 kernel: 
pcieport 0000:00:05.2: AER: enabled with IRQ 50 Dec 16 13:38:57.836176 kernel: pcieport 0000:00:05.3: PME: Signaling with IRQ 51 Dec 16 13:38:57.836246 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Dec 16 13:38:57.836316 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Dec 16 13:38:57.836385 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Dec 16 13:38:57.836394 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Dec 16 13:38:57.836402 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 16 13:38:57.836412 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Dec 16 13:38:57.836420 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Dec 16 13:38:57.836428 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Dec 16 13:38:57.836435 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Dec 16 13:38:57.836508 kernel: rtc_cmos 00:03: RTC can wake from S4 Dec 16 13:38:57.836519 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Dec 16 13:38:57.836579 kernel: rtc_cmos 00:03: registered as rtc0 Dec 16 13:38:57.836640 kernel: rtc_cmos 00:03: setting system clock to 2025-12-16T13:38:57 UTC (1765892337) Dec 16 13:38:57.836703 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Dec 16 13:38:57.836712 kernel: intel_pstate: CPU model not supported Dec 16 13:38:57.836720 kernel: efifb: probing for efifb Dec 16 13:38:57.836727 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Dec 16 13:38:57.836735 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Dec 16 13:38:57.836743 kernel: efifb: scrolling: redraw Dec 16 13:38:57.836750 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Dec 16 13:38:57.836757 kernel: Console: switching to colour frame buffer device 160x50 Dec 16 13:38:57.836765 kernel: fb0: EFI VGA frame buffer device Dec 16 13:38:57.836774 kernel: pstore: Using crash dump compression: deflate Dec 16 13:38:57.836782 kernel: pstore: Registered efi_pstore as persistent store backend Dec 16 13:38:57.836790 kernel: NET: Registered PF_INET6 protocol family Dec 16 13:38:57.836797 kernel: Segment Routing with IPv6 Dec 16 13:38:57.836805 kernel: In-situ OAM (IOAM) with IPv6 Dec 16 13:38:57.836813 kernel: NET: Registered PF_PACKET protocol family Dec 16 13:38:57.836821 kernel: Key type dns_resolver registered Dec 16 13:38:57.836828 kernel: IPI shorthand broadcast: enabled Dec 16 13:38:57.836836 kernel: sched_clock: Marking stable (3939001517, 164211648)->(4338793756, -235580591) Dec 16 13:38:57.836845 kernel: registered taskstats version 1 Dec 16 13:38:57.836853 kernel: Loading compiled-in X.509 certificates Dec 16 13:38:57.836861 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 0d0c78e6590cb40d27f1cef749ef9f2f3425f38d' Dec 16 13:38:57.836868 kernel: Demotion targets for Node 0: null Dec 16 13:38:57.836876 kernel: Key type .fscrypt registered Dec 16 13:38:57.836897 kernel: Key type fscrypt-provisioning registered Dec 16 13:38:57.836905 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 16 13:38:57.836912 kernel: ima: Allocated hash algorithm: sha1 Dec 16 13:38:57.836920 kernel: ima: No architecture policies found Dec 16 13:38:57.836927 kernel: clk: Disabling unused clocks Dec 16 13:38:57.836937 kernel: Warning: unable to open an initial console. 
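A note on the window arithmetic in the bridge-assignment pass above: resource ranges in these messages are inclusive, so a range [start-end] spans end - start + 1 bytes. Each hot-plug bridge requests a 4 KiB I/O window, but the root bus only exposes ports 0x0d00-0xffff, which holds at most fifteen aligned 4 KiB windows; that is why the later bridges report "can't assign; no space" while every 2 MiB MMIO window and 32 GiB 64-bit prefetchable window succeeds. A quick sanity check of those numbers (illustrative Python, not kernel code):

    # Resource ranges printed in the log are inclusive on both ends.
    def window_size(start: int, end: int) -> int:
        return end - start + 1

    io_ports = window_size(0x0d00, 0xffff)              # root-bus "resource 5" I/O window
    print(io_ports // 0x1000)                           # -> 15 aligned 4 KiB windows max
    print(window_size(0x83e00000, 0x83ffffff))          # -> 2097152 (2 MiB MMIO window)
    print(window_size(0x380800000000, 0x380fffffffff))  # -> 34359738368 (32 GiB pref window)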
Dec 16 13:38:57.836945 kernel: Freeing unused kernel image (initmem) memory: 46188K Dec 16 13:38:57.836953 kernel: Write protecting the kernel read-only data: 40960k Dec 16 13:38:57.836961 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Dec 16 13:38:57.836968 kernel: Run /init as init process Dec 16 13:38:57.836975 kernel: with arguments: Dec 16 13:38:57.836983 kernel: /init Dec 16 13:38:57.836990 kernel: with environment: Dec 16 13:38:57.836998 kernel: HOME=/ Dec 16 13:38:57.837007 kernel: TERM=linux Dec 16 13:38:57.837016 systemd[1]: Successfully made /usr/ read-only. Dec 16 13:38:57.837027 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 13:38:57.837036 systemd[1]: Detected virtualization kvm. Dec 16 13:38:57.837044 systemd[1]: Detected architecture x86-64. Dec 16 13:38:57.837052 systemd[1]: Running in initrd. Dec 16 13:38:57.837060 systemd[1]: No hostname configured, using default hostname. Dec 16 13:38:57.837070 systemd[1]: Hostname set to <localhost>. Dec 16 13:38:57.837078 systemd[1]: Initializing machine ID from VM UUID. Dec 16 13:38:57.837095 systemd[1]: Queued start job for default target initrd.target. Dec 16 13:38:57.837105 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 13:38:57.837114 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 13:38:57.837122 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 16 13:38:57.837131 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 13:38:57.837139 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 16 13:38:57.837148 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 16 13:38:57.837159 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Dec 16 13:38:57.837167 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Dec 16 13:38:57.837175 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 13:38:57.837183 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 13:38:57.837191 systemd[1]: Reached target paths.target - Path Units. Dec 16 13:38:57.837200 systemd[1]: Reached target slices.target - Slice Units. Dec 16 13:38:57.837208 systemd[1]: Reached target swap.target - Swaps. Dec 16 13:38:57.837216 systemd[1]: Reached target timers.target - Timer Units. Dec 16 13:38:57.837226 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 13:38:57.837234 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 13:38:57.837242 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 16 13:38:57.837251 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Dec 16 13:38:57.837259 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
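The odd-looking unit names above (dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device and friends) are systemd's path escaping: "/" becomes "-", and characters outside a small safe set, including "-" itself, become \xXX. A minimal sketch of that rule, assuming the simplified character set below (the real implementation behind systemd-escape also special-cases leading dots and empty components):

    # Toy version of systemd's path-to-unit-name escaping (cf. systemd-escape --path).
    def escape_path(path: str) -> str:
        allowed = set("abcdefghijklmnopqrstuvwxyz"
                      "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789:_.")
        out = []
        for ch in path.strip("/"):
            if ch == "/":
                out.append("-")                 # path separators become dashes
            elif ch in allowed:
                out.append(ch)
            else:
                out.append(f"\\x{ord(ch):02x}") # everything else becomes \xXX
        return "".join(out)

    print(escape_path("/dev/disk/by-label/EFI-SYSTEM") + ".device")
    # -> dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device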
Dec 16 13:38:57.837267 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 13:38:57.837275 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 13:38:57.837283 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 13:38:57.837292 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 16 13:38:57.837302 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 13:38:57.837310 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 16 13:38:57.837319 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Dec 16 13:38:57.837327 systemd[1]: Starting systemd-fsck-usr.service... Dec 16 13:38:57.837335 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 13:38:57.837343 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 13:38:57.837351 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:38:57.837359 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Dec 16 13:38:57.837370 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 13:38:57.837378 systemd[1]: Finished systemd-fsck-usr.service. Dec 16 13:38:57.837386 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 13:38:57.837420 systemd-journald[276]: Collecting audit messages is disabled. Dec 16 13:38:57.837446 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:38:57.837456 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 13:38:57.837465 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 16 13:38:57.837475 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 13:38:57.837483 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 13:38:57.837492 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 16 13:38:57.837501 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 13:38:57.837509 kernel: Bridge firewalling registered Dec 16 13:38:57.837517 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 13:38:57.837526 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Dec 16 13:38:57.837534 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 13:38:57.837546 systemd-journald[276]: Journal started Dec 16 13:38:57.837565 systemd-journald[276]: Runtime Journal (/run/log/journal/3c8b0c8f08bd4e89814d798fa4527cae) is 8M, max 319.5M, 311.5M free. Dec 16 13:38:57.793425 systemd-modules-load[279]: Inserted module 'overlay' Dec 16 13:38:57.821299 systemd-modules-load[279]: Inserted module 'br_netfilter' Dec 16 13:38:57.844476 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 13:38:57.845077 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 13:38:57.848438 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
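The bridge warning above means bridged frames silently bypass arp/ip/ip6tables until br_netfilter is loaded; here systemd-modules-load inserts it a moment later, as the "Inserted module 'br_netfilter'" line shows. A runtime check along these lines confirms the state (a sketch assuming Linux's /proc/modules format; on a systemd host the durable fix is a modules-load.d entry rather than a manual modprobe):

    # Report whether a module is resident, per /proc/modules (first field is the name).
    def module_loaded(name: str) -> bool:
        with open("/proc/modules") as f:
            return any(line.split()[0] == name for line in f)

    if not module_loaded("br_netfilter"):
        print("br_netfilter not loaded; bridged traffic bypasses ip/ip6/arptables")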
Dec 16 13:38:57.853027 systemd-tmpfiles[315]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Dec 16 13:38:57.853826 dracut-cmdline[308]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022 Dec 16 13:38:57.856304 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 13:38:57.858030 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 13:38:57.887504 systemd-resolved[341]: Positive Trust Anchors: Dec 16 13:38:57.887518 systemd-resolved[341]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 13:38:57.887549 systemd-resolved[341]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 13:38:57.889758 systemd-resolved[341]: Defaulting to hostname 'linux'. Dec 16 13:38:57.890620 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 13:38:57.891351 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 13:38:57.935913 kernel: SCSI subsystem initialized Dec 16 13:38:57.947915 kernel: Loading iSCSI transport class v2.0-870. Dec 16 13:38:57.958918 kernel: iscsi: registered transport (tcp) Dec 16 13:38:57.982194 kernel: iscsi: registered transport (qla4xxx) Dec 16 13:38:57.982266 kernel: QLogic iSCSI HBA Driver Dec 16 13:38:58.000163 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 13:38:58.024466 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 13:38:58.026170 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 13:38:58.072974 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Dec 16 13:38:58.075387 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Dec 16 13:38:58.136928 kernel: raid6: avx512x4 gen() 38345 MB/s Dec 16 13:38:58.154915 kernel: raid6: avx512x2 gen() 37757 MB/s Dec 16 13:38:58.172919 kernel: raid6: avx512x1 gen() 37913 MB/s Dec 16 13:38:58.190923 kernel: raid6: avx2x4 gen() 29701 MB/s Dec 16 13:38:58.208926 kernel: raid6: avx2x2 gen() 29582 MB/s Dec 16 13:38:58.226380 kernel: raid6: avx2x1 gen() 20011 MB/s Dec 16 13:38:58.226448 kernel: raid6: using algorithm avx512x4 gen() 38345 MB/s Dec 16 13:38:58.245985 kernel: raid6: .... 
xor() 8805 MB/s, rmw enabled Dec 16 13:38:58.246056 kernel: raid6: using avx512x2 recovery algorithm Dec 16 13:38:58.265908 kernel: xor: automatically using best checksumming function avx Dec 16 13:38:58.398010 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 16 13:38:58.405507 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 16 13:38:58.407463 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 13:38:58.441841 systemd-udevd[534]: Using default interface naming scheme 'v255'. Dec 16 13:38:58.446392 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 13:38:58.447754 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Dec 16 13:38:58.475457 dracut-pre-trigger[540]: rd.md=0: removing MD RAID activation Dec 16 13:38:58.499191 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 13:38:58.500777 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 13:38:58.597353 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 13:38:58.603121 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 16 13:38:58.620896 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues Dec 16 13:38:58.626710 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Dec 16 13:38:58.644031 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Dec 16 13:38:58.644121 kernel: GPT:17805311 != 104857599 Dec 16 13:38:58.644132 kernel: GPT:Alternate GPT header not at the end of the disk. Dec 16 13:38:58.645206 kernel: GPT:17805311 != 104857599 Dec 16 13:38:58.646104 kernel: GPT: Use GNU Parted to correct GPT errors. Dec 16 13:38:58.647127 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 13:38:58.652950 kernel: ACPI: bus type USB registered Dec 16 13:38:58.653017 kernel: cryptd: max_cpu_qlen set to 1000 Dec 16 13:38:58.667915 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Dec 16 13:38:58.671900 kernel: usbcore: registered new interface driver usbfs Dec 16 13:38:58.672905 kernel: libata version 3.00 loaded. Dec 16 13:38:58.672949 kernel: usbcore: registered new interface driver hub Dec 16 13:38:58.677496 kernel: usbcore: registered new device driver usb Dec 16 13:38:58.684529 kernel: ahci 0000:00:1f.2: version 3.0 Dec 16 13:38:58.684729 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Dec 16 13:38:58.687674 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Dec 16 13:38:58.687861 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Dec 16 13:38:58.687975 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Dec 16 13:38:58.690898 kernel: AES CTR mode by8 optimization enabled Dec 16 13:38:58.692513 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 13:38:58.693634 kernel: scsi host0: ahci Dec 16 13:38:58.692633 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:38:58.696537 kernel: scsi host1: ahci Dec 16 13:38:58.696715 kernel: scsi host2: ahci Dec 16 13:38:58.700020 kernel: scsi host3: ahci Dec 16 13:38:58.694957 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
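The GPT complaints above are the usual cloud-image pattern: the backup GPT header must sit on the last LBA of the disk, but this image was authored for a smaller disk than the 50 GiB volume it now occupies, so the kernel finds the header at LBA 17805311 instead of 104857599. Rough arithmetic (illustration only; the disk-uuid messages below show the headers actually being rewritten):

    SECTOR = 512
    disk_sectors = 104857600        # [vda] size reported above
    image_last_lba = 17805311       # where the backup GPT header was found

    print(disk_sectors * SECTOR / 2**30)          # -> 50.0 (GiB volume)
    print((image_last_lba + 1) * SECTOR / 2**30)  # -> ~8.49 (GiB original image)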
Dec 16 13:38:58.706903 kernel: scsi host4: ahci Dec 16 13:38:58.709893 kernel: scsi host5: ahci Dec 16 13:38:58.710060 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller Dec 16 13:38:58.713637 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 67 lpm-pol 1 Dec 16 13:38:58.713700 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1 Dec 16 13:38:58.713918 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 67 lpm-pol 1 Dec 16 13:38:58.713931 kernel: uhci_hcd 0000:02:01.0: detected 2 ports Dec 16 13:38:58.714033 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 67 lpm-pol 1 Dec 16 13:38:58.714043 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000 Dec 16 13:38:58.714131 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 67 lpm-pol 1 Dec 16 13:38:58.710922 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:38:58.731300 kernel: hub 1-0:1.0: USB hub found Dec 16 13:38:58.731493 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 67 lpm-pol 1 Dec 16 13:38:58.731505 kernel: hub 1-0:1.0: 2 ports detected Dec 16 13:38:58.731612 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 67 lpm-pol 1 Dec 16 13:38:58.761455 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Dec 16 13:38:58.769796 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Dec 16 13:38:58.777588 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 16 13:38:58.784771 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Dec 16 13:38:58.785294 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Dec 16 13:38:58.787635 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 16 13:38:58.788084 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 13:38:58.788132 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:38:58.788972 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:38:58.790212 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:38:58.790939 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 16 13:38:58.815641 disk-uuid[749]: Primary Header is updated. Dec 16 13:38:58.815641 disk-uuid[749]: Secondary Entries is updated. Dec 16 13:38:58.815641 disk-uuid[749]: Secondary Header is updated. Dec 16 13:38:58.822620 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Dec 16 13:38:58.824719 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 13:38:58.937025 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Dec 16 13:38:59.040201 kernel: ata2: SATA link down (SStatus 0 SControl 300) Dec 16 13:38:59.040272 kernel: ata6: SATA link down (SStatus 0 SControl 300) Dec 16 13:38:59.040283 kernel: ata3: SATA link down (SStatus 0 SControl 300) Dec 16 13:38:59.040306 kernel: ata5: SATA link down (SStatus 0 SControl 300) Dec 16 13:38:59.041912 kernel: ata4: SATA link down (SStatus 0 SControl 300) Dec 16 13:38:59.042910 kernel: ata1: SATA link down (SStatus 0 SControl 300) Dec 16 13:38:59.055306 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 16 13:38:59.056295 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 13:38:59.056776 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 13:38:59.057529 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 13:38:59.058993 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 16 13:38:59.085386 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 16 13:38:59.118954 kernel: hid: raw HID events driver (C) Jiri Kosina Dec 16 13:38:59.127312 kernel: usbcore: registered new interface driver usbhid Dec 16 13:38:59.127369 kernel: usbhid: USB HID core driver Dec 16 13:38:59.134102 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Dec 16 13:38:59.134163 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0 Dec 16 13:38:59.837903 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 13:38:59.838216 disk-uuid[752]: The operation has completed successfully. Dec 16 13:38:59.883436 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 16 13:38:59.883531 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 16 13:38:59.917099 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Dec 16 13:38:59.943511 sh[785]: Success Dec 16 13:38:59.961052 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 16 13:38:59.961115 kernel: device-mapper: uevent: version 1.0.3 Dec 16 13:38:59.962218 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 16 13:38:59.972922 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Dec 16 13:39:00.059578 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Dec 16 13:39:00.061890 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Dec 16 13:39:00.088086 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Dec 16 13:39:00.116907 kernel: BTRFS: device fsid a6ae7f96-a076-4d3c-81ed-46dd341492f8 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (797) Dec 16 13:39:00.121070 kernel: BTRFS info (device dm-0): first mount of filesystem a6ae7f96-a076-4d3c-81ed-46dd341492f8 Dec 16 13:39:00.121127 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Dec 16 13:39:00.146960 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 16 13:39:00.147045 kernel: BTRFS info (device dm-0): enabling free space tree Dec 16 13:39:00.151378 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. 
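verity-setup above assembles /dev/mapper/usr by checking the read-only /usr partition against the verity.usrhash root hash from the kernel command line. The sketch below shows only the underlying hash-tree idea; the real dm-verity on-disk format adds a superblock, a salt, and fixed-arity tree levels, none of which are modelled here:

    import hashlib

    # Toy dm-verity-style root hash: hash 4 KiB data blocks, then hash groups of
    # digests upward until a single root remains. Any block that later re-hashes
    # differently makes verification (and thus the read) fail.
    def root_hash(data: bytes, block: int = 4096, arity: int = 128) -> str:
        level = [hashlib.sha256(data[i:i + block]).digest()
                 for i in range(0, len(data), block)]
        while len(level) > 1:
            level = [hashlib.sha256(b"".join(level[i:i + arity])).digest()
                     for i in range(0, len(level), arity)]
        return level[0].hex()

    print(root_hash(b"\0" * 16384))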
Dec 16 13:39:00.152551 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 16 13:39:00.153117 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 16 13:39:00.153983 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 16 13:39:00.155720 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Dec 16 13:39:00.203907 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (828) Dec 16 13:39:00.207440 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:39:00.207493 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 16 13:39:00.216595 kernel: BTRFS info (device vda6): turning on async discard Dec 16 13:39:00.216933 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 13:39:00.221926 kernel: BTRFS info (device vda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:39:00.223240 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 16 13:39:00.224799 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Dec 16 13:39:00.272460 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 13:39:00.274978 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 13:39:00.313135 systemd-networkd[966]: lo: Link UP Dec 16 13:39:00.313143 systemd-networkd[966]: lo: Gained carrier Dec 16 13:39:00.314225 systemd-networkd[966]: Enumeration completed Dec 16 13:39:00.314506 systemd-networkd[966]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:39:00.314511 systemd-networkd[966]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 13:39:00.314805 systemd-networkd[966]: eth0: Link UP Dec 16 13:39:00.315047 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 13:39:00.315256 systemd-networkd[966]: eth0: Gained carrier Dec 16 13:39:00.315267 systemd-networkd[966]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:39:00.317028 systemd[1]: Reached target network.target - Network. Dec 16 13:39:00.346122 systemd-networkd[966]: eth0: DHCPv4 address 10.0.21.93/25, gateway 10.0.21.1 acquired from 10.0.21.1 Dec 16 13:39:00.370811 ignition[902]: Ignition 2.22.0 Dec 16 13:39:00.370825 ignition[902]: Stage: fetch-offline Dec 16 13:39:00.370859 ignition[902]: no configs at "/usr/lib/ignition/base.d" Dec 16 13:39:00.372573 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 13:39:00.370867 ignition[902]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:39:00.370958 ignition[902]: parsed url from cmdline: "" Dec 16 13:39:00.370961 ignition[902]: no config URL provided Dec 16 13:39:00.370966 ignition[902]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 13:39:00.370972 ignition[902]: no config at "/usr/lib/ignition/user.ign" Dec 16 13:39:00.374670 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
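For the DHCPv4 lease logged above, 10.0.21.93/25 leaves seven host bits: a 128-address subnet (126 usable hosts) that contains the 10.0.21.1 gateway. The standard library confirms the arithmetic:

    import ipaddress

    iface = ipaddress.ip_interface("10.0.21.93/25")
    print(iface.network)                                       # 10.0.21.0/25
    print(iface.network.num_addresses)                         # 128
    print(ipaddress.ip_address("10.0.21.1") in iface.network)  # True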
Dec 16 13:39:00.370977 ignition[902]: failed to fetch config: resource requires networking Dec 16 13:39:00.371118 ignition[902]: Ignition finished successfully Dec 16 13:39:00.402663 ignition[984]: Ignition 2.22.0 Dec 16 13:39:00.402678 ignition[984]: Stage: fetch Dec 16 13:39:00.402836 ignition[984]: no configs at "/usr/lib/ignition/base.d" Dec 16 13:39:00.402845 ignition[984]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:39:00.402948 ignition[984]: parsed url from cmdline: "" Dec 16 13:39:00.402951 ignition[984]: no config URL provided Dec 16 13:39:00.402957 ignition[984]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 13:39:00.402968 ignition[984]: no config at "/usr/lib/ignition/user.ign" Dec 16 13:39:00.403081 ignition[984]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Dec 16 13:39:00.403668 ignition[984]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Dec 16 13:39:00.403769 ignition[984]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Dec 16 13:39:00.676846 ignition[984]: GET result: OK Dec 16 13:39:00.676960 ignition[984]: parsing config with SHA512: fa38bcd91b072b23c7e3611a2fd5f143861f3c43225e6d1647e4ff77f0d4eb7b0bb1c44df6995d7a2e77e617934db1242f58ff8aef00bad4874071a33c8ec939 Dec 16 13:39:00.682077 unknown[984]: fetched base config from "system" Dec 16 13:39:00.682088 unknown[984]: fetched base config from "system" Dec 16 13:39:00.682407 ignition[984]: fetch: fetch complete Dec 16 13:39:00.682093 unknown[984]: fetched user config from "openstack" Dec 16 13:39:00.682412 ignition[984]: fetch: fetch passed Dec 16 13:39:00.682447 ignition[984]: Ignition finished successfully Dec 16 13:39:00.684441 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 16 13:39:00.686570 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 16 13:39:00.725795 ignition[996]: Ignition 2.22.0 Dec 16 13:39:00.725808 ignition[996]: Stage: kargs Dec 16 13:39:00.725996 ignition[996]: no configs at "/usr/lib/ignition/base.d" Dec 16 13:39:00.726007 ignition[996]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:39:00.726742 ignition[996]: kargs: kargs passed Dec 16 13:39:00.726785 ignition[996]: Ignition finished successfully Dec 16 13:39:00.728081 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Dec 16 13:39:00.729964 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 16 13:39:00.764873 ignition[1009]: Ignition 2.22.0 Dec 16 13:39:00.764896 ignition[1009]: Stage: disks Dec 16 13:39:00.765071 ignition[1009]: no configs at "/usr/lib/ignition/base.d" Dec 16 13:39:00.765080 ignition[1009]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:39:00.765838 ignition[1009]: disks: disks passed Dec 16 13:39:00.765887 ignition[1009]: Ignition finished successfully Dec 16 13:39:00.767503 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 16 13:39:00.768679 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 16 13:39:00.769571 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 16 13:39:00.770023 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 13:39:00.770867 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 13:39:00.771804 systemd[1]: Reached target basic.target - Basic System. 
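With eth0 configured, the fetch stage above boils down to two logged steps: GET the user data from OpenStack's link-local metadata service, then fingerprint it with SHA512. Ignition itself is a Go binary; this Python sketch merely mirrors those two steps:

    import hashlib
    import urllib.request

    URL = "http://169.254.169.254/openstack/latest/user_data"  # endpoint from the log

    with urllib.request.urlopen(URL, timeout=10) as resp:
        config = resp.read()

    # Corresponds to Ignition's "parsing config with SHA512: ..." line.
    print(hashlib.sha512(config).hexdigest())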
Dec 16 13:39:00.773603 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Dec 16 13:39:00.824443 systemd-fsck[1022]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Dec 16 13:39:00.827866 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 16 13:39:00.829277 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 16 13:39:01.016911 kernel: EXT4-fs (vda9): mounted filesystem e48ca59c-1206-4abd-b121-5e9b35e49852 r/w with ordered data mode. Quota mode: none. Dec 16 13:39:01.017220 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 16 13:39:01.018208 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 16 13:39:01.024953 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 13:39:01.026939 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 16 13:39:01.027662 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Dec 16 13:39:01.028374 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Dec 16 13:39:01.028816 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 16 13:39:01.028843 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 13:39:01.052036 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 16 13:39:01.053784 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Dec 16 13:39:01.069925 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1030) Dec 16 13:39:01.074060 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:39:01.074121 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 16 13:39:01.082974 kernel: BTRFS info (device vda6): turning on async discard Dec 16 13:39:01.083041 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 13:39:01.085851 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 16 13:39:01.112910 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:39:01.128584 initrd-setup-root[1060]: cut: /sysroot/etc/passwd: No such file or directory Dec 16 13:39:01.135179 initrd-setup-root[1067]: cut: /sysroot/etc/group: No such file or directory Dec 16 13:39:01.138889 initrd-setup-root[1074]: cut: /sysroot/etc/shadow: No such file or directory Dec 16 13:39:01.141934 initrd-setup-root[1081]: cut: /sysroot/etc/gshadow: No such file or directory Dec 16 13:39:01.255005 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 16 13:39:01.256758 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 16 13:39:01.257951 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 16 13:39:01.283579 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Dec 16 13:39:01.285421 kernel: BTRFS info (device vda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:39:01.311157 ignition[1150]: INFO : Ignition 2.22.0 Dec 16 13:39:01.311157 ignition[1150]: INFO : Stage: mount Dec 16 13:39:01.313067 ignition[1150]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 13:39:01.313067 ignition[1150]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:39:01.313067 ignition[1150]: INFO : mount: mount passed Dec 16 13:39:01.313067 ignition[1150]: INFO : Ignition finished successfully Dec 16 13:39:01.311255 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 16 13:39:01.313789 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 16 13:39:01.614143 systemd-networkd[966]: eth0: Gained IPv6LL Dec 16 13:39:02.159935 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:39:04.165945 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:39:08.174953 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:39:08.183327 coreos-metadata[1032]: Dec 16 13:39:08.183 WARN failed to locate config-drive, using the metadata service API instead Dec 16 13:39:08.196451 coreos-metadata[1032]: Dec 16 13:39:08.196 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 16 13:39:08.309653 coreos-metadata[1032]: Dec 16 13:39:08.309 INFO Fetch successful Dec 16 13:39:08.310491 coreos-metadata[1032]: Dec 16 13:39:08.309 INFO wrote hostname ci-4459-2-2-a-7f096d1947 to /sysroot/etc/hostname Dec 16 13:39:08.311936 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Dec 16 13:39:08.312061 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Dec 16 13:39:08.313091 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 16 13:39:08.332930 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 13:39:08.376934 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1172) Dec 16 13:39:08.380923 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:39:08.380988 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 16 13:39:08.389728 kernel: BTRFS info (device vda6): turning on async discard Dec 16 13:39:08.389793 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 13:39:08.391741 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
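
The hostname agent above polls for a config-2 drive (hence the repeated "Can't lookup blockdev" kernel lines), gives up, and falls back to the metadata service. As a sketch, the equivalent manual query against the endpoint named in the log, persisted the same way the agent persisted it, is:

    # Fetch the instance hostname from the metadata service and persist it;
    # the agent wrote the same value to /sysroot/etc/hostname from the initrd.
    curl -s http://169.254.169.254/latest/meta-data/hostname > /etc/hostname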
Dec 16 13:39:08.421679 ignition[1190]: INFO : Ignition 2.22.0 Dec 16 13:39:08.421679 ignition[1190]: INFO : Stage: files Dec 16 13:39:08.422964 ignition[1190]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 13:39:08.422964 ignition[1190]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:39:08.422964 ignition[1190]: DEBUG : files: compiled without relabeling support, skipping Dec 16 13:39:08.425956 ignition[1190]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 16 13:39:08.425956 ignition[1190]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 16 13:39:08.432268 ignition[1190]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 16 13:39:08.432707 ignition[1190]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 16 13:39:08.433087 ignition[1190]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 16 13:39:08.432956 unknown[1190]: wrote ssh authorized keys file for user: core Dec 16 13:39:08.436634 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Dec 16 13:39:08.436634 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Dec 16 13:39:08.498198 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 16 13:39:08.669582 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Dec 16 13:39:08.670652 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 16 13:39:08.670652 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 16 13:39:08.670652 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 16 13:39:08.670652 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 16 13:39:08.670652 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 13:39:08.670652 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 13:39:08.670652 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 13:39:08.670652 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 13:39:08.673453 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 13:39:08.673453 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 13:39:08.673453 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Dec 16 13:39:08.675296 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Dec 16 13:39:08.675296 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Dec 16 13:39:08.676585 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Dec 16 13:39:08.932988 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 16 13:39:09.545473 ignition[1190]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Dec 16 13:39:09.545473 ignition[1190]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 16 13:39:09.547486 ignition[1190]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 13:39:09.551168 ignition[1190]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 13:39:09.551168 ignition[1190]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 16 13:39:09.551168 ignition[1190]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Dec 16 13:39:09.552944 ignition[1190]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Dec 16 13:39:09.552944 ignition[1190]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 16 13:39:09.552944 ignition[1190]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 16 13:39:09.552944 ignition[1190]: INFO : files: files passed Dec 16 13:39:09.552944 ignition[1190]: INFO : Ignition finished successfully Dec 16 13:39:09.554261 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 16 13:39:09.555909 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 16 13:39:09.557692 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 16 13:39:09.576628 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 16 13:39:09.576728 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 16 13:39:09.580977 initrd-setup-root-after-ignition[1225]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 13:39:09.580977 initrd-setup-root-after-ignition[1225]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 16 13:39:09.582180 initrd-setup-root-after-ignition[1229]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 13:39:09.583486 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 13:39:09.584163 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 16 13:39:09.585521 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 16 13:39:09.608411 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 16 13:39:09.608523 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
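
The body of the prepare-helm.service unit that Ignition writes and presets to enabled is not included in the log. Based on its description when it later starts ("Unpack helm to /opt/bin") and the tarball staged under /opt, a plausible oneshot unit would be the sketch below; the ExecStart line in particular is an assumption, not the recorded unit:

    [Unit]
    Description=Unpack helm to /opt/bin

    [Service]
    Type=oneshot
    RemainAfterExit=true
    # Assumed: extract the helm binary from the tarball Ignition downloaded.
    ExecStart=/usr/bin/tar --strip-components=1 -C /opt/bin -xzf /opt/helm-v3.17.3-linux-amd64.tar.gz linux-amd64/helm

    [Install]
    WantedBy=multi-user.target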
Dec 16 13:39:09.609976 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 16 13:39:09.610539 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 16 13:39:09.611464 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 16 13:39:09.612209 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 16 13:39:09.626267 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 13:39:09.628048 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 16 13:39:09.651628 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 16 13:39:09.652374 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 13:39:09.653738 systemd[1]: Stopped target timers.target - Timer Units. Dec 16 13:39:09.654623 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 16 13:39:09.654766 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 13:39:09.655969 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 16 13:39:09.656832 systemd[1]: Stopped target basic.target - Basic System. Dec 16 13:39:09.657621 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 16 13:39:09.658365 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 13:39:09.659179 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 16 13:39:09.659949 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 16 13:39:09.660657 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 16 13:39:09.661388 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 13:39:09.662179 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 16 13:39:09.662912 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 16 13:39:09.663629 systemd[1]: Stopped target swap.target - Swaps. Dec 16 13:39:09.664360 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 16 13:39:09.664479 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 16 13:39:09.665552 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 16 13:39:09.666385 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 13:39:09.667079 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 16 13:39:09.667175 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 13:39:09.667813 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 16 13:39:09.667919 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 16 13:39:09.669088 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 16 13:39:09.669208 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 13:39:09.670036 systemd[1]: ignition-files.service: Deactivated successfully. Dec 16 13:39:09.670127 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 16 13:39:09.671623 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 16 13:39:09.672864 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 16 13:39:09.673327 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
Dec 16 13:39:09.673424 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 13:39:09.674185 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 16 13:39:09.674264 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 13:39:09.677991 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 16 13:39:09.699281 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 16 13:39:09.716322 ignition[1250]: INFO : Ignition 2.22.0 Dec 16 13:39:09.716322 ignition[1250]: INFO : Stage: umount Dec 16 13:39:09.717862 ignition[1250]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 13:39:09.717862 ignition[1250]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 13:39:09.717862 ignition[1250]: INFO : umount: umount passed Dec 16 13:39:09.717862 ignition[1250]: INFO : Ignition finished successfully Dec 16 13:39:09.716504 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 16 13:39:09.718463 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 16 13:39:09.718561 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 16 13:39:09.719596 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 16 13:39:09.719685 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 16 13:39:09.720304 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 16 13:39:09.720341 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 16 13:39:09.721103 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 16 13:39:09.721150 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 16 13:39:09.721844 systemd[1]: Stopped target network.target - Network. Dec 16 13:39:09.722527 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 16 13:39:09.722573 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 13:39:09.723304 systemd[1]: Stopped target paths.target - Path Units. Dec 16 13:39:09.724030 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 16 13:39:09.724119 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 13:39:09.724804 systemd[1]: Stopped target slices.target - Slice Units. Dec 16 13:39:09.725623 systemd[1]: Stopped target sockets.target - Socket Units. Dec 16 13:39:09.726408 systemd[1]: iscsid.socket: Deactivated successfully. Dec 16 13:39:09.726450 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 13:39:09.727168 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 16 13:39:09.727203 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 13:39:09.727913 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 16 13:39:09.727965 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 16 13:39:09.728609 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 16 13:39:09.728646 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 16 13:39:09.729458 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Dec 16 13:39:09.730426 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 16 13:39:09.736754 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 16 13:39:09.736868 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. 
Dec 16 13:39:09.739303 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Dec 16 13:39:09.739585 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 16 13:39:09.739621 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 13:39:09.741658 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Dec 16 13:39:09.752727 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 16 13:39:09.752852 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 16 13:39:09.754973 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Dec 16 13:39:09.755183 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 16 13:39:09.755841 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 16 13:39:09.755893 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 16 13:39:09.757554 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 16 13:39:09.757980 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 16 13:39:09.758032 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 13:39:09.758752 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 16 13:39:09.758787 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 16 13:39:09.759594 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 16 13:39:09.759630 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 16 13:39:09.760248 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 13:39:09.761795 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 16 13:39:09.782332 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 16 13:39:09.782469 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 13:39:09.783947 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 16 13:39:09.784038 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 16 13:39:09.786261 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 16 13:39:09.786318 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 16 13:39:09.787312 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 16 13:39:09.787348 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 13:39:09.788277 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 16 13:39:09.788319 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 16 13:39:09.789977 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 16 13:39:09.790021 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 16 13:39:09.791617 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 16 13:39:09.791661 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 13:39:09.792875 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 16 13:39:09.792924 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 16 13:39:09.794849 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... 
Dec 16 13:39:09.795780 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 16 13:39:09.795825 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 13:39:09.797193 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 16 13:39:09.797237 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 13:39:09.798114 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Dec 16 13:39:09.798149 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 13:39:09.799168 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Dec 16 13:39:09.799203 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 13:39:09.800113 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 13:39:09.800151 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:39:09.801987 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 16 13:39:09.802080 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 16 13:39:09.804390 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 16 13:39:09.804497 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 16 13:39:09.805457 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 16 13:39:09.807136 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 16 13:39:09.817269 systemd[1]: Switching root. Dec 16 13:39:09.869453 systemd-journald[276]: Journal stopped Dec 16 13:39:10.884240 systemd-journald[276]: Received SIGTERM from PID 1 (systemd). Dec 16 13:39:10.884324 kernel: SELinux: policy capability network_peer_controls=1 Dec 16 13:39:10.884348 kernel: SELinux: policy capability open_perms=1 Dec 16 13:39:10.884361 kernel: SELinux: policy capability extended_socket_class=1 Dec 16 13:39:10.884371 kernel: SELinux: policy capability always_check_network=0 Dec 16 13:39:10.884385 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 16 13:39:10.884395 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 16 13:39:10.884405 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 16 13:39:10.884414 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 16 13:39:10.884424 kernel: SELinux: policy capability userspace_initial_context=0 Dec 16 13:39:10.884438 kernel: audit: type=1403 audit(1765892350.018:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Dec 16 13:39:10.884452 systemd[1]: Successfully loaded SELinux policy in 77.749ms. Dec 16 13:39:10.884474 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.748ms. Dec 16 13:39:10.884487 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 13:39:10.884501 systemd[1]: Detected virtualization kvm. Dec 16 13:39:10.884511 systemd[1]: Detected architecture x86-64. Dec 16 13:39:10.884521 systemd[1]: Detected first boot. Dec 16 13:39:10.884532 systemd[1]: Hostname set to <ci-4459-2-2-a-7f096d1947>. Dec 16 13:39:10.884546 systemd[1]: Initializing machine ID from VM UUID.
Dec 16 13:39:10.884557 zram_generator::config[1298]: No configuration found. Dec 16 13:39:10.884569 kernel: Guest personality initialized and is inactive Dec 16 13:39:10.884582 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Dec 16 13:39:10.884592 kernel: Initialized host personality Dec 16 13:39:10.884601 kernel: NET: Registered PF_VSOCK protocol family Dec 16 13:39:10.884611 systemd[1]: Populated /etc with preset unit settings. Dec 16 13:39:10.884622 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Dec 16 13:39:10.884633 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 16 13:39:10.884643 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 16 13:39:10.884653 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 16 13:39:10.884664 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 16 13:39:10.884680 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 16 13:39:10.884690 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 16 13:39:10.884700 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 16 13:39:10.884711 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 16 13:39:10.884721 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Dec 16 13:39:10.884732 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 16 13:39:10.884742 systemd[1]: Created slice user.slice - User and Session Slice. Dec 16 13:39:10.884753 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 13:39:10.884765 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 13:39:10.884776 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 16 13:39:10.884787 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 16 13:39:10.884797 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 16 13:39:10.884809 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 13:39:10.884819 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Dec 16 13:39:10.884831 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 13:39:10.884841 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 13:39:10.884851 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 16 13:39:10.884862 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 16 13:39:10.884871 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 16 13:39:10.884890 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 16 13:39:10.884900 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 13:39:10.884910 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 13:39:10.884920 systemd[1]: Reached target slices.target - Slice Units. Dec 16 13:39:10.884930 systemd[1]: Reached target swap.target - Swaps. Dec 16 13:39:10.884943 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. 
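
"Populated /etc with preset unit settings" above is systemd applying preset files on first boot, which is how the "setting preset to enabled" operation Ignition performed earlier takes effect. A preset file is just a list of enable/disable directives; a hypothetical example matching that operation:

    # /etc/systemd/system-preset/20-ignition.preset (hypothetical path)
    enable prepare-helm.service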
Dec 16 13:39:10.884953 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 16 13:39:10.884963 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 16 13:39:10.884973 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 13:39:10.884983 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 13:39:10.884993 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 13:39:10.885003 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 16 13:39:10.885014 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 16 13:39:10.885024 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 16 13:39:10.885038 systemd[1]: Mounting media.mount - External Media Directory... Dec 16 13:39:10.885048 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:39:10.885059 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 16 13:39:10.885069 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 16 13:39:10.885080 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 16 13:39:10.885091 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 16 13:39:10.885102 systemd[1]: Reached target machines.target - Containers. Dec 16 13:39:10.885112 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 16 13:39:10.885125 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 13:39:10.885136 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 13:39:10.885146 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 16 13:39:10.885156 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 13:39:10.885167 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 13:39:10.885177 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 13:39:10.885187 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 16 13:39:10.885197 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 13:39:10.885210 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 16 13:39:10.885223 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 16 13:39:10.885234 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 16 13:39:10.885247 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 16 13:39:10.885257 systemd[1]: Stopped systemd-fsck-usr.service. Dec 16 13:39:10.885272 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 13:39:10.885283 systemd[1]: Starting systemd-journald.service - Journal Service... 
Dec 16 13:39:10.885295 kernel: loop: module loaded Dec 16 13:39:10.885304 kernel: fuse: init (API version 7.41) Dec 16 13:39:10.885314 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 13:39:10.885324 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 13:39:10.885337 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Dec 16 13:39:10.885347 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 16 13:39:10.885358 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 13:39:10.885368 systemd[1]: verity-setup.service: Deactivated successfully. Dec 16 13:39:10.885382 systemd[1]: Stopped verity-setup.service. Dec 16 13:39:10.885392 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:39:10.885402 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 16 13:39:10.885431 systemd-journald[1382]: Collecting audit messages is disabled. Dec 16 13:39:10.885457 kernel: ACPI: bus type drm_connector registered Dec 16 13:39:10.885467 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 16 13:39:10.885479 systemd-journald[1382]: Journal started Dec 16 13:39:10.885506 systemd-journald[1382]: Runtime Journal (/run/log/journal/3c8b0c8f08bd4e89814d798fa4527cae) is 8M, max 319.5M, 311.5M free. Dec 16 13:39:10.693038 systemd[1]: Queued start job for default target multi-user.target. Dec 16 13:39:10.714249 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Dec 16 13:39:10.714644 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 16 13:39:10.887896 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 13:39:10.888313 systemd[1]: Mounted media.mount - External Media Directory. Dec 16 13:39:10.888835 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 16 13:39:10.889348 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 16 13:39:10.889854 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 16 13:39:10.890537 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 16 13:39:10.891230 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 13:39:10.891859 systemd[1]: modprobe@configfs.service: Deactivated successfully. Dec 16 13:39:10.892069 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 16 13:39:10.892676 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 13:39:10.892825 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 13:39:10.893434 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 13:39:10.893575 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 13:39:10.894182 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 13:39:10.894323 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 13:39:10.894906 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 16 13:39:10.895039 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 16 13:39:10.895619 systemd[1]: modprobe@loop.service: Deactivated successfully. 
Dec 16 13:39:10.895759 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 13:39:10.896409 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 13:39:10.897026 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 13:39:10.897617 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 16 13:39:10.898268 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 16 13:39:10.907392 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 13:39:10.909405 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 16 13:39:10.910886 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Dec 16 13:39:10.911373 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 16 13:39:10.911406 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 13:39:10.912604 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 16 13:39:10.926007 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 16 13:39:10.926755 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 13:39:10.927946 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 16 13:39:10.929484 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 16 13:39:10.930072 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 13:39:10.930975 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 16 13:39:10.931499 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 13:39:10.932309 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 13:39:10.934578 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 16 13:39:10.935964 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 13:39:10.938124 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 16 13:39:10.938745 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 16 13:39:10.942016 systemd-journald[1382]: Time spent on flushing to /var/log/journal/3c8b0c8f08bd4e89814d798fa4527cae is 27.339ms for 1711 entries. Dec 16 13:39:10.942016 systemd-journald[1382]: System Journal (/var/log/journal/3c8b0c8f08bd4e89814d798fa4527cae) is 8M, max 584.8M, 576.8M free. Dec 16 13:39:10.983328 systemd-journald[1382]: Received client request to flush runtime journal. Dec 16 13:39:10.983390 kernel: loop0: detected capacity change from 0 to 110984 Dec 16 13:39:10.943653 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 16 13:39:10.944367 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 16 13:39:10.945913 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Dec 16 13:39:10.962103 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
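
The journal size figures above (an 8M runtime journal with a ~319.5M cap, and a persistent system journal capped near 584.8M) are derived from filesystem size by default; they can be pinned explicitly in /etc/systemd/journald.conf. An illustrative sketch, not Flatcar's shipped configuration:

    [Journal]
    Storage=persistent
    # Cap the volatile journal under /run/log/journal ...
    RuntimeMaxUse=320M
    # ... and the persistent journal under /var/log/journal.
    SystemMaxUse=585M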
Dec 16 13:39:10.971757 systemd-tmpfiles[1427]: ACLs are not supported, ignoring. Dec 16 13:39:10.971767 systemd-tmpfiles[1427]: ACLs are not supported, ignoring. Dec 16 13:39:10.976111 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 13:39:10.977091 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 13:39:10.979348 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 16 13:39:10.992213 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 16 13:39:10.993358 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Dec 16 13:39:10.993923 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 16 13:39:11.035750 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 16 13:39:11.035918 kernel: loop1: detected capacity change from 0 to 1640 Dec 16 13:39:11.037769 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 13:39:11.064697 systemd-tmpfiles[1451]: ACLs are not supported, ignoring. Dec 16 13:39:11.064717 systemd-tmpfiles[1451]: ACLs are not supported, ignoring. Dec 16 13:39:11.068056 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 13:39:11.093612 kernel: loop2: detected capacity change from 0 to 229808 Dec 16 13:39:11.146041 kernel: loop3: detected capacity change from 0 to 128560 Dec 16 13:39:11.214944 kernel: loop4: detected capacity change from 0 to 110984 Dec 16 13:39:11.237929 kernel: loop5: detected capacity change from 0 to 1640 Dec 16 13:39:11.246924 kernel: loop6: detected capacity change from 0 to 229808 Dec 16 13:39:11.277935 kernel: loop7: detected capacity change from 0 to 128560 Dec 16 13:39:11.298805 (sd-merge)[1457]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'. Dec 16 13:39:11.299244 (sd-merge)[1457]: Merged extensions into '/usr'. Dec 16 13:39:11.303406 systemd[1]: Reload requested from client PID 1426 ('systemd-sysext') (unit systemd-sysext.service)... Dec 16 13:39:11.303426 systemd[1]: Reloading... Dec 16 13:39:11.338918 zram_generator::config[1482]: No configuration found. Dec 16 13:39:11.498084 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 16 13:39:11.498209 systemd[1]: Reloading finished in 194 ms. Dec 16 13:39:11.535929 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 16 13:39:11.536696 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 16 13:39:11.554194 systemd[1]: Starting ensure-sysext.service... Dec 16 13:39:11.555705 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 13:39:11.557214 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 13:39:11.566952 systemd[1]: Reload requested from client PID 1527 ('systemctl') (unit ensure-sysext.service)... Dec 16 13:39:11.566973 systemd[1]: Reloading... Dec 16 13:39:11.572652 systemd-tmpfiles[1528]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Dec 16 13:39:11.573478 systemd-tmpfiles[1528]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 16 13:39:11.573748 systemd-tmpfiles[1528]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. 
Dec 16 13:39:11.574044 systemd-tmpfiles[1528]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Dec 16 13:39:11.574657 systemd-tmpfiles[1528]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Dec 16 13:39:11.574860 systemd-tmpfiles[1528]: ACLs are not supported, ignoring. Dec 16 13:39:11.574917 systemd-tmpfiles[1528]: ACLs are not supported, ignoring. Dec 16 13:39:11.579665 systemd-tmpfiles[1528]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 13:39:11.579676 systemd-tmpfiles[1528]: Skipping /boot Dec 16 13:39:11.584009 systemd-udevd[1529]: Using default interface naming scheme 'v255'. Dec 16 13:39:11.586109 systemd-tmpfiles[1528]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 13:39:11.586119 systemd-tmpfiles[1528]: Skipping /boot Dec 16 13:39:11.603961 zram_generator::config[1558]: No configuration found. Dec 16 13:39:11.610655 ldconfig[1421]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 16 13:39:11.694905 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Dec 16 13:39:11.707902 kernel: mousedev: PS/2 mouse device common for all mice Dec 16 13:39:11.710898 kernel: ACPI: button: Power Button [PWRF] Dec 16 13:39:11.761983 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Dec 16 13:39:11.770087 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Dec 16 13:39:11.770207 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Dec 16 13:39:11.775325 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Dec 16 13:39:11.775373 kernel: Console: switching to colour dummy device 80x25 Dec 16 13:39:11.775462 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Dec 16 13:39:11.778408 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 16 13:39:11.778450 kernel: [drm] features: -context_init Dec 16 13:39:11.785900 kernel: [drm] number of scanouts: 1 Dec 16 13:39:11.786016 kernel: [drm] number of cap sets: 0 Dec 16 13:39:11.786029 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Dec 16 13:39:11.792815 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Dec 16 13:39:11.793353 kernel: Console: switching to colour frame buffer device 160x50 Dec 16 13:39:11.797894 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 16 13:39:11.815858 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 16 13:39:11.816297 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Dec 16 13:39:11.816405 systemd[1]: Reloading finished in 249 ms. Dec 16 13:39:11.865867 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 13:39:11.866348 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 16 13:39:11.872623 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 13:39:11.916513 systemd[1]: Finished ensure-sysext.service. Dec 16 13:39:11.921132 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:39:11.922286 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 13:39:11.925219 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... 
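
The (sd-merge) lines above are systemd-sysext overlaying the discovered extension images onto /usr, including the kubernetes image that Ignition linked into /etc/extensions during the files stage. On a running system the same machinery can be inspected and re-run with the standard CLI:

    # List the extension images currently merged into the hierarchy.
    systemd-sysext list
    # Re-scan the extension directories and re-apply the overlay.
    systemd-sysext refresh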
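The "Duplicate line for path ..., ignoring" messages are systemd-tmpfiles noticing that two tmpfiles.d entries claim the same path; the entry read first wins and later ones are dropped with exactly this warning. A minimal reproduction in a hypothetical drop-in (not one of the files named above):

    # /etc/tmpfiles.d/dup-demo.conf
    d /var/lib/example 0755 root root -
    # The next line duplicates the path and is ignored with the warning above.
    d /var/lib/example 0700 root root -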
Dec 16 13:39:11.926393 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 13:39:11.936115 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 13:39:11.939648 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 13:39:11.940912 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 13:39:11.942241 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 13:39:11.943908 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Dec 16 13:39:11.945936 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 13:39:11.946848 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 16 13:39:11.946967 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 13:39:11.947850 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 16 13:39:11.950790 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 13:39:11.952698 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 13:39:11.953156 systemd[1]: Reached target time-set.target - System Time Set. Dec 16 13:39:11.954764 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 16 13:39:11.955945 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:39:11.958207 kernel: pps_core: LinuxPPS API ver. 1 registered Dec 16 13:39:11.958247 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Dec 16 13:39:11.960693 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 16 13:39:11.961613 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 13:39:11.961775 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 13:39:11.962488 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 13:39:11.962632 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 13:39:11.963227 kernel: PTP clock support registered Dec 16 13:39:11.963197 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 13:39:11.963340 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 13:39:11.965350 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 13:39:11.965530 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 13:39:11.966804 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 16 13:39:11.972955 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Dec 16 13:39:11.973136 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Dec 16 13:39:11.979303 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Dec 16 13:39:11.979408 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 13:39:11.980872 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 16 13:39:11.981720 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Dec 16 13:39:11.988098 augenrules[1712]: No rules Dec 16 13:39:11.989225 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 13:39:11.995161 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 13:39:11.996205 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 16 13:39:11.999441 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 16 13:39:12.019363 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 16 13:39:12.025047 systemd[1]: Started systemd-userdbd.service - User Database Manager. Dec 16 13:39:12.056156 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:39:12.086559 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 16 13:39:12.087249 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Dec 16 13:39:12.092859 systemd-networkd[1684]: lo: Link UP Dec 16 13:39:12.092867 systemd-networkd[1684]: lo: Gained carrier Dec 16 13:39:12.093964 systemd-networkd[1684]: Enumeration completed Dec 16 13:39:12.094076 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 13:39:12.094244 systemd-networkd[1684]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:39:12.094251 systemd-networkd[1684]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 13:39:12.094665 systemd-networkd[1684]: eth0: Link UP Dec 16 13:39:12.094765 systemd-networkd[1684]: eth0: Gained carrier Dec 16 13:39:12.094783 systemd-networkd[1684]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:39:12.095446 systemd-resolved[1685]: Positive Trust Anchors: Dec 16 13:39:12.095465 systemd-resolved[1685]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 13:39:12.095496 systemd-resolved[1685]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 13:39:12.095726 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 16 13:39:12.098473 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 16 13:39:12.101183 systemd-resolved[1685]: Using system hostname 'ci-4459-2-2-a-7f096d1947'. Dec 16 13:39:12.102418 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
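
The "Positive Trust Anchors" dump above is systemd-resolved loading its built-in DNSSEC root trust anchor (the ". IN DS 20326 8 2 ..." record) plus negative anchors for private and special-use zones. Whether resolved actually validates with it is governed by resolved.conf; an illustrative sketch:

    [Resolve]
    # allow-downgrade validates DNSSEC when the upstream supports it and
    # silently falls back to plain DNS otherwise; "yes" would hard-fail.
    DNSSEC=allow-downgrade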
Dec 16 13:39:12.103554 systemd[1]: Reached target network.target - Network. Dec 16 13:39:12.103924 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 13:39:12.104276 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 13:39:12.104686 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 16 13:39:12.105059 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Dec 16 13:39:12.105418 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Dec 16 13:39:12.105952 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 16 13:39:12.108758 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Dec 16 13:39:12.109187 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Dec 16 13:39:12.109557 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 16 13:39:12.109586 systemd[1]: Reached target paths.target - Path Units. Dec 16 13:39:12.109967 systemd[1]: Reached target timers.target - Timer Units. Dec 16 13:39:12.112391 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 16 13:39:12.116510 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 16 13:39:12.118624 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 16 13:39:12.119970 systemd-networkd[1684]: eth0: DHCPv4 address 10.0.21.93/25, gateway 10.0.21.1 acquired from 10.0.21.1 Dec 16 13:39:12.121437 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 16 13:39:12.121847 systemd[1]: Reached target ssh-access.target - SSH Access Available. Dec 16 13:39:12.124085 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 16 13:39:12.126528 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 16 13:39:12.128463 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 16 13:39:12.129174 systemd[1]: Listening on docker.socket - Docker Socket for the API. Dec 16 13:39:12.130827 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 13:39:12.131244 systemd[1]: Reached target basic.target - Basic System. Dec 16 13:39:12.131664 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Dec 16 13:39:12.131697 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 16 13:39:12.134807 systemd[1]: Starting chronyd.service - NTP client/server... Dec 16 13:39:12.137180 systemd[1]: Starting containerd.service - containerd container runtime... Dec 16 13:39:12.138666 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Dec 16 13:39:12.155853 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Dec 16 13:39:12.157402 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 16 13:39:12.157913 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:39:12.162122 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 16 13:39:12.164320 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... 
Dec 16 13:39:12.165289 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 16 13:39:12.166473 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Dec 16 13:39:12.168677 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 16 13:39:12.170247 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Dec 16 13:39:12.171067 jq[1746]: false Dec 16 13:39:12.172490 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Dec 16 13:39:12.175435 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 16 13:39:12.176036 extend-filesystems[1747]: Found /dev/vda6 Dec 16 13:39:12.177958 google_oslogin_nss_cache[1748]: oslogin_cache_refresh[1748]: Refreshing passwd entry cache Dec 16 13:39:12.176069 oslogin_cache_refresh[1748]: Refreshing passwd entry cache Dec 16 13:39:12.178976 systemd[1]: Starting systemd-logind.service - User Login Management... Dec 16 13:39:12.180378 extend-filesystems[1747]: Found /dev/vda9 Dec 16 13:39:12.184959 extend-filesystems[1747]: Checking size of /dev/vda9 Dec 16 13:39:12.184621 oslogin_cache_refresh[1748]: Failure getting users, quitting Dec 16 13:39:12.186087 google_oslogin_nss_cache[1748]: oslogin_cache_refresh[1748]: Failure getting users, quitting Dec 16 13:39:12.186087 google_oslogin_nss_cache[1748]: oslogin_cache_refresh[1748]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 16 13:39:12.186087 google_oslogin_nss_cache[1748]: oslogin_cache_refresh[1748]: Refreshing group entry cache Dec 16 13:39:12.182189 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 16 13:39:12.184639 oslogin_cache_refresh[1748]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 16 13:39:12.182658 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 16 13:39:12.184680 oslogin_cache_refresh[1748]: Refreshing group entry cache Dec 16 13:39:12.183144 systemd[1]: Starting update-engine.service - Update Engine... Dec 16 13:39:12.187643 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 16 13:39:12.190170 extend-filesystems[1747]: Resized partition /dev/vda9 Dec 16 13:39:12.191271 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 16 13:39:12.192039 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 16 13:39:12.192196 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 16 13:39:12.192410 systemd[1]: motdgen.service: Deactivated successfully. Dec 16 13:39:12.192557 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Dec 16 13:39:12.193729 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 16 13:39:12.193888 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Dec 16 13:39:12.194004 jq[1770]: true Dec 16 13:39:12.195251 google_oslogin_nss_cache[1748]: oslogin_cache_refresh[1748]: Failure getting groups, quitting Dec 16 13:39:12.195247 oslogin_cache_refresh[1748]: Failure getting groups, quitting Dec 16 13:39:12.195335 google_oslogin_nss_cache[1748]: oslogin_cache_refresh[1748]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 16 13:39:12.195261 oslogin_cache_refresh[1748]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 16 13:39:12.198784 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Dec 16 13:39:12.199236 extend-filesystems[1776]: resize2fs 1.47.3 (8-Jul-2025) Dec 16 13:39:12.221078 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks Dec 16 13:39:12.199480 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Dec 16 13:39:12.221340 update_engine[1764]: I20251216 13:39:12.216025 1764 main.cc:92] Flatcar Update Engine starting Dec 16 13:39:12.200914 (ntainerd)[1779]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Dec 16 13:39:12.221668 jq[1775]: true Dec 16 13:39:12.224744 chronyd[1739]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Dec 16 13:39:12.225525 chronyd[1739]: Loaded seccomp filter (level 2) Dec 16 13:39:12.226364 systemd[1]: Started chronyd.service - NTP client/server. Dec 16 13:39:12.236269 tar[1774]: linux-amd64/LICENSE Dec 16 13:39:12.236269 tar[1774]: linux-amd64/helm Dec 16 13:39:12.243619 systemd-logind[1758]: New seat seat0. Dec 16 13:39:12.245267 systemd-logind[1758]: Watching system buttons on /dev/input/event3 (Power Button) Dec 16 13:39:12.245288 systemd-logind[1758]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Dec 16 13:39:12.245484 systemd[1]: Started systemd-logind.service - User Login Management. Dec 16 13:39:12.259742 dbus-daemon[1742]: [system] SELinux support is enabled Dec 16 13:39:12.259981 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 16 13:39:12.372500 update_engine[1764]: I20251216 13:39:12.262208 1764 update_check_scheduler.cc:74] Next update check in 7m32s Dec 16 13:39:12.264039 dbus-daemon[1742]: [system] Successfully activated service 'org.freedesktop.systemd1' Dec 16 13:39:12.263272 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 16 13:39:12.263302 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 16 13:39:12.266861 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 16 13:39:12.266894 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 16 13:39:12.267436 systemd[1]: Started update-engine.service - Update Engine. Dec 16 13:39:12.269794 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Dec 16 13:39:12.317258 locksmithd[1808]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 16 13:39:12.385552 containerd[1779]: time="2025-12-16T13:39:12Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 16 13:39:12.388432 containerd[1779]: time="2025-12-16T13:39:12.388122379Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Dec 16 13:39:12.393443 bash[1807]: Updated "/home/core/.ssh/authorized_keys" Dec 16 13:39:12.397307 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 16 13:39:12.402895 containerd[1779]: time="2025-12-16T13:39:12.399565620Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="12.988µs" Dec 16 13:39:12.402895 containerd[1779]: time="2025-12-16T13:39:12.399608757Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 16 13:39:12.402895 containerd[1779]: time="2025-12-16T13:39:12.399634525Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 16 13:39:12.402895 containerd[1779]: time="2025-12-16T13:39:12.399795233Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 16 13:39:12.402895 containerd[1779]: time="2025-12-16T13:39:12.399812678Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 16 13:39:12.402895 containerd[1779]: time="2025-12-16T13:39:12.399841636Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 13:39:12.402895 containerd[1779]: time="2025-12-16T13:39:12.400623766Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 13:39:12.402895 containerd[1779]: time="2025-12-16T13:39:12.400644078Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 13:39:12.402895 containerd[1779]: time="2025-12-16T13:39:12.400916674Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 13:39:12.402895 containerd[1779]: time="2025-12-16T13:39:12.400930473Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 13:39:12.402895 containerd[1779]: time="2025-12-16T13:39:12.400940784Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 13:39:12.402895 containerd[1779]: time="2025-12-16T13:39:12.400947914Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 16 13:39:12.399808 systemd[1]: Starting sshkeys.service... 
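The two containerd warnings above (the unknown `subreaper` key in /usr/share/containerd/config.toml and the version-2 configuration being migrated at startup) both point at a stale config file. The remedy is the one the log itself suggests; sketched here on the assumption that the writable override belongs under /etc, since /usr is read-only on Flatcar:

    # Emit the current configuration rewritten in the new schema,
    # then restart the daemon so it reads the clean file:
    containerd config migrate > /etc/containerd/config.toml
    systemctl restart containerd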
Dec 16 13:39:12.403218 containerd[1779]: time="2025-12-16T13:39:12.401005928Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 16 13:39:12.403218 containerd[1779]: time="2025-12-16T13:39:12.401372724Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 13:39:12.403218 containerd[1779]: time="2025-12-16T13:39:12.401399078Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 13:39:12.403218 containerd[1779]: time="2025-12-16T13:39:12.401408293Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 16 13:39:12.403218 containerd[1779]: time="2025-12-16T13:39:12.401437668Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 16 13:39:12.403218 containerd[1779]: time="2025-12-16T13:39:12.401710244Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 16 13:39:12.403218 containerd[1779]: time="2025-12-16T13:39:12.401776998Z" level=info msg="metadata content store policy set" policy=shared Dec 16 13:39:12.421145 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Dec 16 13:39:12.422827 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Dec 16 13:39:12.447822 containerd[1779]: time="2025-12-16T13:39:12.447755332Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 16 13:39:12.447937 containerd[1779]: time="2025-12-16T13:39:12.447832063Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 16 13:39:12.447937 containerd[1779]: time="2025-12-16T13:39:12.447847619Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 16 13:39:12.447937 containerd[1779]: time="2025-12-16T13:39:12.447861926Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 16 13:39:12.447937 containerd[1779]: time="2025-12-16T13:39:12.447874975Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 16 13:39:12.447937 containerd[1779]: time="2025-12-16T13:39:12.447916504Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 16 13:39:12.447937 containerd[1779]: time="2025-12-16T13:39:12.447928187Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 16 13:39:12.448056 containerd[1779]: time="2025-12-16T13:39:12.447939891Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 16 13:39:12.448056 containerd[1779]: time="2025-12-16T13:39:12.447951010Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 16 13:39:12.448056 containerd[1779]: time="2025-12-16T13:39:12.448013184Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 16 13:39:12.448056 containerd[1779]: time="2025-12-16T13:39:12.448023941Z" level=info msg="loading plugin" 
id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 16 13:39:12.448056 containerd[1779]: time="2025-12-16T13:39:12.448035784Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 16 13:39:12.448186 containerd[1779]: time="2025-12-16T13:39:12.448166087Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 16 13:39:12.448206 containerd[1779]: time="2025-12-16T13:39:12.448187875Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 16 13:39:12.448231 containerd[1779]: time="2025-12-16T13:39:12.448210176Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 16 13:39:12.448231 containerd[1779]: time="2025-12-16T13:39:12.448221217Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 16 13:39:12.448268 containerd[1779]: time="2025-12-16T13:39:12.448231477Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 16 13:39:12.448268 containerd[1779]: time="2025-12-16T13:39:12.448240832Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 16 13:39:12.448268 containerd[1779]: time="2025-12-16T13:39:12.448252848Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 16 13:39:12.448268 containerd[1779]: time="2025-12-16T13:39:12.448262875Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 16 13:39:12.448337 containerd[1779]: time="2025-12-16T13:39:12.448274456Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 16 13:39:12.448337 containerd[1779]: time="2025-12-16T13:39:12.448287099Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 16 13:39:12.448337 containerd[1779]: time="2025-12-16T13:39:12.448307998Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 16 13:39:12.448384 containerd[1779]: time="2025-12-16T13:39:12.448356873Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 16 13:39:12.448384 containerd[1779]: time="2025-12-16T13:39:12.448370584Z" level=info msg="Start snapshots syncer" Dec 16 13:39:12.448425 containerd[1779]: time="2025-12-16T13:39:12.448407410Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 16 13:39:12.448742 containerd[1779]: time="2025-12-16T13:39:12.448696478Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 16 13:39:12.448851 containerd[1779]: time="2025-12-16T13:39:12.448753531Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 16 13:39:12.448851 containerd[1779]: time="2025-12-16T13:39:12.448804080Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 16 13:39:12.448946 containerd[1779]: time="2025-12-16T13:39:12.448927487Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 16 13:39:12.448974 containerd[1779]: time="2025-12-16T13:39:12.448950873Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 16 13:39:12.448992 containerd[1779]: time="2025-12-16T13:39:12.448976857Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 16 13:39:12.448992 containerd[1779]: time="2025-12-16T13:39:12.448988408Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 16 13:39:12.449034 containerd[1779]: time="2025-12-16T13:39:12.449000482Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 16 13:39:12.449034 containerd[1779]: time="2025-12-16T13:39:12.449010401Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 16 13:39:12.449034 containerd[1779]: time="2025-12-16T13:39:12.449020237Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 16 13:39:12.449082 containerd[1779]: time="2025-12-16T13:39:12.449041034Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 16 13:39:12.449082 containerd[1779]: 
time="2025-12-16T13:39:12.449052809Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 16 13:39:12.449082 containerd[1779]: time="2025-12-16T13:39:12.449062630Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 16 13:39:12.449132 containerd[1779]: time="2025-12-16T13:39:12.449103091Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 13:39:12.449132 containerd[1779]: time="2025-12-16T13:39:12.449118560Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 13:39:12.449132 containerd[1779]: time="2025-12-16T13:39:12.449127753Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 13:39:12.449181 containerd[1779]: time="2025-12-16T13:39:12.449136341Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 13:39:12.449210 containerd[1779]: time="2025-12-16T13:39:12.449196729Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 16 13:39:12.449232 containerd[1779]: time="2025-12-16T13:39:12.449211670Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 16 13:39:12.449232 containerd[1779]: time="2025-12-16T13:39:12.449226663Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 16 13:39:12.449267 containerd[1779]: time="2025-12-16T13:39:12.449242072Z" level=info msg="runtime interface created" Dec 16 13:39:12.449267 containerd[1779]: time="2025-12-16T13:39:12.449247138Z" level=info msg="created NRI interface" Dec 16 13:39:12.449267 containerd[1779]: time="2025-12-16T13:39:12.449260969Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 16 13:39:12.449313 containerd[1779]: time="2025-12-16T13:39:12.449272240Z" level=info msg="Connect containerd service" Dec 16 13:39:12.449313 containerd[1779]: time="2025-12-16T13:39:12.449290494Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 16 13:39:12.450322 containerd[1779]: time="2025-12-16T13:39:12.450009835Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 13:39:12.460276 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:39:12.537685 containerd[1779]: time="2025-12-16T13:39:12.537586042Z" level=info msg="Start subscribing containerd event" Dec 16 13:39:12.537685 containerd[1779]: time="2025-12-16T13:39:12.537680704Z" level=info msg="Start recovering state" Dec 16 13:39:12.537862 containerd[1779]: time="2025-12-16T13:39:12.537775898Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 16 13:39:12.537862 containerd[1779]: time="2025-12-16T13:39:12.537818668Z" level=info msg="Start event monitor" Dec 16 13:39:12.537862 containerd[1779]: time="2025-12-16T13:39:12.537826437Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Dec 16 13:39:12.537862 containerd[1779]: time="2025-12-16T13:39:12.537832933Z" level=info msg="Start cni network conf syncer for default" Dec 16 13:39:12.537862 containerd[1779]: time="2025-12-16T13:39:12.537840546Z" level=info msg="Start streaming server" Dec 16 13:39:12.537971 containerd[1779]: time="2025-12-16T13:39:12.537866664Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 16 13:39:12.537971 containerd[1779]: time="2025-12-16T13:39:12.537876328Z" level=info msg="runtime interface starting up..." Dec 16 13:39:12.537971 containerd[1779]: time="2025-12-16T13:39:12.537895744Z" level=info msg="starting plugins..." Dec 16 13:39:12.537971 containerd[1779]: time="2025-12-16T13:39:12.537908541Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 16 13:39:12.538203 systemd[1]: Started containerd.service - containerd container runtime. Dec 16 13:39:12.538722 containerd[1779]: time="2025-12-16T13:39:12.538695620Z" level=info msg="containerd successfully booted in 0.153523s" Dec 16 13:39:12.659713 sshd_keygen[1777]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 16 13:39:12.670936 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Dec 16 13:39:12.680430 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 16 13:39:12.684358 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 16 13:39:12.698168 extend-filesystems[1776]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Dec 16 13:39:12.698168 extend-filesystems[1776]: old_desc_blocks = 1, new_desc_blocks = 6 Dec 16 13:39:12.698168 extend-filesystems[1776]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Dec 16 13:39:12.699490 extend-filesystems[1747]: Resized filesystem in /dev/vda9 Dec 16 13:39:12.699835 tar[1774]: linux-amd64/README.md Dec 16 13:39:12.699693 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 16 13:39:12.706373 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 16 13:39:12.712364 systemd[1]: issuegen.service: Deactivated successfully. Dec 16 13:39:12.712549 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 16 13:39:12.714858 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 16 13:39:12.717113 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 16 13:39:12.726457 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 16 13:39:12.728613 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 16 13:39:12.733565 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Dec 16 13:39:12.739672 systemd[1]: Reached target getty.target - Login Prompts. Dec 16 13:39:13.169925 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:39:13.468019 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:39:14.030095 systemd-networkd[1684]: eth0: Gained IPv6LL Dec 16 13:39:14.032371 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 16 13:39:14.033399 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 13:39:14.035135 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:39:14.069236 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 13:39:14.095991 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
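The resize pass logged above (extend-filesystems together with the EXT4 kernel messages) grows the root filesystem online, while mounted. The equivalent manual invocation, using the device name from the log:

    # Grow the mounted ext4 filesystem to fill its partition; ext4
    # supports growing online, hence "on-line resizing required".
    resize2fs /dev/vda9
    # New size: 12499963 blocks x 4 KiB/block ≈ 47.7 GiB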
Dec 16 13:39:15.180920 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:39:15.247641 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:39:15.251571 (kubelet)[1883]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:39:15.480958 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:39:15.989182 kubelet[1883]: E1216 13:39:15.989112 1883 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:39:15.991569 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:39:15.991694 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:39:15.992018 systemd[1]: kubelet.service: Consumed 1.021s CPU time, 268.6M memory peak. Dec 16 13:39:17.718836 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 16 13:39:17.720710 systemd[1]: Started sshd@0-10.0.21.93:22-147.75.109.163:39500.service - OpenSSH per-connection server daemon (147.75.109.163:39500). Dec 16 13:39:18.735030 sshd[1899]: Accepted publickey for core from 147.75.109.163 port 39500 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:39:18.737951 sshd-session[1899]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:39:18.749447 systemd-logind[1758]: New session 1 of user core. Dec 16 13:39:18.750636 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 16 13:39:18.751607 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 13:39:18.789366 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 13:39:18.791289 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 13:39:18.822486 (systemd)[1908]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 16 13:39:18.824708 systemd-logind[1758]: New session c1 of user core. Dec 16 13:39:18.952627 systemd[1908]: Queued start job for default target default.target. Dec 16 13:39:18.973997 systemd[1908]: Created slice app.slice - User Application Slice. Dec 16 13:39:18.974029 systemd[1908]: Reached target paths.target - Paths. Dec 16 13:39:18.974068 systemd[1908]: Reached target timers.target - Timers. Dec 16 13:39:18.975235 systemd[1908]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 16 13:39:18.986523 systemd[1908]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 16 13:39:18.986635 systemd[1908]: Reached target sockets.target - Sockets. Dec 16 13:39:18.986690 systemd[1908]: Reached target basic.target - Basic System. Dec 16 13:39:18.986727 systemd[1908]: Reached target default.target - Main User Target. Dec 16 13:39:18.986753 systemd[1908]: Startup finished in 156ms. Dec 16 13:39:18.986837 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 13:39:18.988172 systemd[1]: Started session-1.scope - Session 1 of User core. 
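The kubelet exit above (and its later retries) is expected at this stage: the unit starts before anything has written /var/lib/kubelet/config.yaml, which kubeadm normally generates during init/join. A minimal hypothetical sketch of the missing file, assuming the v1beta1 schema and this host's containerd socket:

    # /var/lib/kubelet/config.yaml (hypothetical minimal content)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock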
Dec 16 13:39:19.194982 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:39:19.206268 coreos-metadata[1741]: Dec 16 13:39:19.206 WARN failed to locate config-drive, using the metadata service API instead Dec 16 13:39:19.220787 coreos-metadata[1741]: Dec 16 13:39:19.220 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Dec 16 13:39:19.499932 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 13:39:19.505609 coreos-metadata[1823]: Dec 16 13:39:19.505 WARN failed to locate config-drive, using the metadata service API instead Dec 16 13:39:19.517321 coreos-metadata[1823]: Dec 16 13:39:19.517 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Dec 16 13:39:19.668322 systemd[1]: Started sshd@1-10.0.21.93:22-147.75.109.163:39508.service - OpenSSH per-connection server daemon (147.75.109.163:39508). Dec 16 13:39:19.879221 coreos-metadata[1741]: Dec 16 13:39:19.879 INFO Fetch successful Dec 16 13:39:19.879221 coreos-metadata[1741]: Dec 16 13:39:19.879 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 16 13:39:20.617011 sshd[1923]: Accepted publickey for core from 147.75.109.163 port 39508 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:39:20.618458 sshd-session[1923]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:39:20.622791 systemd-logind[1758]: New session 2 of user core. Dec 16 13:39:20.631108 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 16 13:39:21.073037 coreos-metadata[1823]: Dec 16 13:39:21.072 INFO Fetch successful Dec 16 13:39:21.073037 coreos-metadata[1823]: Dec 16 13:39:21.072 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Dec 16 13:39:21.114755 coreos-metadata[1741]: Dec 16 13:39:21.114 INFO Fetch successful Dec 16 13:39:21.114755 coreos-metadata[1741]: Dec 16 13:39:21.114 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Dec 16 13:39:21.276507 sshd[1926]: Connection closed by 147.75.109.163 port 39508 Dec 16 13:39:21.276853 sshd-session[1923]: pam_unix(sshd:session): session closed for user core Dec 16 13:39:21.280044 systemd[1]: sshd@1-10.0.21.93:22-147.75.109.163:39508.service: Deactivated successfully. Dec 16 13:39:21.282047 systemd[1]: session-2.scope: Deactivated successfully. Dec 16 13:39:21.282860 systemd-logind[1758]: Session 2 logged out. Waiting for processes to exit. Dec 16 13:39:21.283612 systemd-logind[1758]: Removed session 2. Dec 16 13:39:21.334718 coreos-metadata[1741]: Dec 16 13:39:21.334 INFO Fetch successful Dec 16 13:39:21.334718 coreos-metadata[1741]: Dec 16 13:39:21.334 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Dec 16 13:39:21.336994 coreos-metadata[1823]: Dec 16 13:39:21.336 INFO Fetch successful Dec 16 13:39:21.345155 unknown[1823]: wrote ssh authorized keys file for user: core Dec 16 13:39:21.380031 update-ssh-keys[1931]: Updated "/home/core/.ssh/authorized_keys" Dec 16 13:39:21.381277 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 16 13:39:21.383135 systemd[1]: Finished sshkeys.service. 
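Both coreos-metadata instances fall back from the absent config-drive to the metadata API, which is why each fetch is preceded by a "Can't lookup blockdev" kernel message for /dev/disk/by-label/config-2. The same endpoints the agent walks here can be queried by hand from the instance:

    curl -s http://169.254.169.254/openstack/2012-08-10/meta_data.json
    curl -s http://169.254.169.254/latest/meta-data/hostname
    curl -s http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key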
Dec 16 13:39:21.446203 coreos-metadata[1741]: Dec 16 13:39:21.446 INFO Fetch successful Dec 16 13:39:21.446203 coreos-metadata[1741]: Dec 16 13:39:21.446 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Dec 16 13:39:21.453219 systemd[1]: Started sshd@2-10.0.21.93:22-147.75.109.163:51566.service - OpenSSH per-connection server daemon (147.75.109.163:51566). Dec 16 13:39:21.555493 coreos-metadata[1741]: Dec 16 13:39:21.555 INFO Fetch successful Dec 16 13:39:21.555493 coreos-metadata[1741]: Dec 16 13:39:21.555 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Dec 16 13:39:21.663875 coreos-metadata[1741]: Dec 16 13:39:21.663 INFO Fetch successful Dec 16 13:39:21.692401 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 16 13:39:21.692824 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 16 13:39:21.692957 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 16 13:39:21.693065 systemd[1]: Startup finished in 3.991s (kernel) + 12.370s (initrd) + 11.750s (userspace) = 28.113s. Dec 16 13:39:22.407543 sshd[1936]: Accepted publickey for core from 147.75.109.163 port 51566 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:39:22.408774 sshd-session[1936]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:39:22.413611 systemd-logind[1758]: New session 3 of user core. Dec 16 13:39:22.431117 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 16 13:39:23.069542 sshd[1944]: Connection closed by 147.75.109.163 port 51566 Dec 16 13:39:23.069964 sshd-session[1936]: pam_unix(sshd:session): session closed for user core Dec 16 13:39:23.073960 systemd[1]: sshd@2-10.0.21.93:22-147.75.109.163:51566.service: Deactivated successfully. Dec 16 13:39:23.075423 systemd[1]: session-3.scope: Deactivated successfully. Dec 16 13:39:23.076101 systemd-logind[1758]: Session 3 logged out. Waiting for processes to exit. Dec 16 13:39:23.076949 systemd-logind[1758]: Removed session 3. Dec 16 13:39:26.113277 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 16 13:39:26.114893 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:39:26.271464 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:39:26.275121 (kubelet)[1957]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:39:26.316765 kubelet[1957]: E1216 13:39:26.316701 1957 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:39:26.320539 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:39:26.320671 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:39:26.321042 systemd[1]: kubelet.service: Consumed 160ms CPU time, 111.6M memory peak. Dec 16 13:39:33.244666 systemd[1]: Started sshd@3-10.0.21.93:22-147.75.109.163:39818.service - OpenSSH per-connection server daemon (147.75.109.163:39818). 
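As a sanity check on the startup summary above: the three rounded figures add to

    3.991 s + 12.370 s + 11.750 s = 28.111 s

which is 2 ms short of the reported 28.113 s. systemd sums the exact microsecond spans before rounding each for display, so small discrepancies like this are normal.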
Dec 16 13:39:34.217023 sshd[1970]: Accepted publickey for core from 147.75.109.163 port 39818 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:39:34.218246 sshd-session[1970]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:39:34.222069 systemd-logind[1758]: New session 4 of user core. Dec 16 13:39:34.242107 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 16 13:39:34.876899 sshd[1973]: Connection closed by 147.75.109.163 port 39818 Dec 16 13:39:34.877318 sshd-session[1970]: pam_unix(sshd:session): session closed for user core Dec 16 13:39:34.880873 systemd-logind[1758]: Session 4 logged out. Waiting for processes to exit. Dec 16 13:39:34.881284 systemd[1]: sshd@3-10.0.21.93:22-147.75.109.163:39818.service: Deactivated successfully. Dec 16 13:39:34.882964 systemd[1]: session-4.scope: Deactivated successfully. Dec 16 13:39:34.884311 systemd-logind[1758]: Removed session 4. Dec 16 13:39:35.047640 systemd[1]: Started sshd@4-10.0.21.93:22-147.75.109.163:39834.service - OpenSSH per-connection server daemon (147.75.109.163:39834). Dec 16 13:39:36.008701 chronyd[1739]: Selected source PHC0 Dec 16 13:39:36.008727 chronyd[1739]: System clock wrong by 2.004671 seconds Dec 16 13:39:38.013496 systemd-resolved[1685]: Clock change detected. Flushing caches. Dec 16 13:39:38.013427 chronyd[1739]: System clock was stepped by 2.004671 seconds Dec 16 13:39:38.014691 sshd[1979]: Accepted publickey for core from 147.75.109.163 port 39834 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:39:38.016069 sshd-session[1979]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:39:38.020615 systemd-logind[1758]: New session 5 of user core. Dec 16 13:39:38.043863 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 16 13:39:38.367067 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 16 13:39:38.368466 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:39:38.511471 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:39:38.515294 (kubelet)[1991]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:39:38.548771 kubelet[1991]: E1216 13:39:38.548701 1991 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:39:38.551514 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:39:38.551657 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:39:38.551987 systemd[1]: kubelet.service: Consumed 145ms CPU time, 110.9M memory peak. Dec 16 13:39:38.678453 sshd[1982]: Connection closed by 147.75.109.163 port 39834 Dec 16 13:39:38.679518 sshd-session[1979]: pam_unix(sshd:session): session closed for user core Dec 16 13:39:38.683339 systemd[1]: sshd@4-10.0.21.93:22-147.75.109.163:39834.service: Deactivated successfully. Dec 16 13:39:38.684782 systemd[1]: session-5.scope: Deactivated successfully. Dec 16 13:39:38.685332 systemd-logind[1758]: Session 5 logged out. Waiting for processes to exit. Dec 16 13:39:38.686249 systemd-logind[1758]: Removed session 5. 
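The chronyd lines above show a PTP hardware clock (PHC0) being selected as the time source and a one-off 2.004671 s step, after which systemd-resolved flushes its caches. A sketch of the chrony.conf directives that produce this behaviour (assumed; the configuration actually shipped on the image may differ):

    # Use the hypervisor-provided PTP hardware clock as a reference clock;
    # chrony names it "PHC0" as in the log above:
    refclock PHC /dev/ptp0 poll 2
    # Step (rather than slew) the clock when the offset exceeds 1 s,
    # but only during the first 3 updates after startup:
    makestep 1.0 3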
Dec 16 13:39:38.851914 systemd[1]: Started sshd@5-10.0.21.93:22-147.75.109.163:39842.service - OpenSSH per-connection server daemon (147.75.109.163:39842). Dec 16 13:39:39.838633 sshd[2007]: Accepted publickey for core from 147.75.109.163 port 39842 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:39:39.839853 sshd-session[2007]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:39:39.844651 systemd-logind[1758]: New session 6 of user core. Dec 16 13:39:39.853891 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 16 13:39:40.512273 sshd[2010]: Connection closed by 147.75.109.163 port 39842 Dec 16 13:39:40.512793 sshd-session[2007]: pam_unix(sshd:session): session closed for user core Dec 16 13:39:40.516025 systemd[1]: sshd@5-10.0.21.93:22-147.75.109.163:39842.service: Deactivated successfully. Dec 16 13:39:40.518504 systemd[1]: session-6.scope: Deactivated successfully. Dec 16 13:39:40.520239 systemd-logind[1758]: Session 6 logged out. Waiting for processes to exit. Dec 16 13:39:40.520940 systemd-logind[1758]: Removed session 6. Dec 16 13:39:40.694039 systemd[1]: Started sshd@6-10.0.21.93:22-147.75.109.163:39852.service - OpenSSH per-connection server daemon (147.75.109.163:39852). Dec 16 13:39:41.741065 sshd[2016]: Accepted publickey for core from 147.75.109.163 port 39852 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:39:41.742242 sshd-session[2016]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:39:41.746428 systemd-logind[1758]: New session 7 of user core. Dec 16 13:39:41.756734 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 16 13:39:42.305478 sudo[2020]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 16 13:39:42.305740 sudo[2020]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:39:42.322632 sudo[2020]: pam_unix(sudo:session): session closed for user root Dec 16 13:39:42.490457 sshd[2019]: Connection closed by 147.75.109.163 port 39852 Dec 16 13:39:42.490894 sshd-session[2016]: pam_unix(sshd:session): session closed for user core Dec 16 13:39:42.494327 systemd[1]: sshd@6-10.0.21.93:22-147.75.109.163:39852.service: Deactivated successfully. Dec 16 13:39:42.495711 systemd[1]: session-7.scope: Deactivated successfully. Dec 16 13:39:42.496329 systemd-logind[1758]: Session 7 logged out. Waiting for processes to exit. Dec 16 13:39:42.497293 systemd-logind[1758]: Removed session 7. Dec 16 13:39:42.663883 systemd[1]: Started sshd@7-10.0.21.93:22-147.75.109.163:54780.service - OpenSSH per-connection server daemon (147.75.109.163:54780). Dec 16 13:39:43.636591 sshd[2026]: Accepted publickey for core from 147.75.109.163 port 54780 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:39:43.637727 sshd-session[2026]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:39:43.641620 systemd-logind[1758]: New session 8 of user core. Dec 16 13:39:43.656759 systemd[1]: Started session-8.scope - Session 8 of User core. 
Dec 16 13:39:44.146265 sudo[2031]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 16 13:39:44.146488 sudo[2031]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:39:44.151710 sudo[2031]: pam_unix(sudo:session): session closed for user root Dec 16 13:39:44.156451 sudo[2030]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 16 13:39:44.156687 sudo[2030]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:39:44.165392 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 13:39:44.207622 augenrules[2053]: No rules Dec 16 13:39:44.208200 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 13:39:44.208400 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 13:39:44.209595 sudo[2030]: pam_unix(sudo:session): session closed for user root Dec 16 13:39:44.364858 sshd[2029]: Connection closed by 147.75.109.163 port 54780 Dec 16 13:39:44.365204 sshd-session[2026]: pam_unix(sshd:session): session closed for user core Dec 16 13:39:44.367941 systemd[1]: sshd@7-10.0.21.93:22-147.75.109.163:54780.service: Deactivated successfully. Dec 16 13:39:44.369371 systemd[1]: session-8.scope: Deactivated successfully. Dec 16 13:39:44.370406 systemd-logind[1758]: Session 8 logged out. Waiting for processes to exit. Dec 16 13:39:44.371261 systemd-logind[1758]: Removed session 8. Dec 16 13:39:44.529733 systemd[1]: Started sshd@8-10.0.21.93:22-147.75.109.163:54794.service - OpenSSH per-connection server daemon (147.75.109.163:54794). Dec 16 13:39:45.482412 sshd[2062]: Accepted publickey for core from 147.75.109.163 port 54794 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:39:45.483430 sshd-session[2062]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:39:45.487342 systemd-logind[1758]: New session 9 of user core. Dec 16 13:39:45.509814 systemd[1]: Started session-9.scope - Session 9 of User core. Dec 16 13:39:45.990595 sudo[2066]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 16 13:39:45.990831 sudo[2066]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:39:46.314809 systemd[1]: Starting docker.service - Docker Application Container Engine... Dec 16 13:39:46.335178 (dockerd)[2092]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 16 13:39:46.586348 dockerd[2092]: time="2025-12-16T13:39:46.586206949Z" level=info msg="Starting up" Dec 16 13:39:46.586864 dockerd[2092]: time="2025-12-16T13:39:46.586841480Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 16 13:39:46.598522 dockerd[2092]: time="2025-12-16T13:39:46.598474358Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 16 13:39:46.642541 dockerd[2092]: time="2025-12-16T13:39:46.642473377Z" level=info msg="Loading containers: start." Dec 16 13:39:46.655577 kernel: Initializing XFRM netlink socket Dec 16 13:39:46.926951 systemd-networkd[1684]: docker0: Link UP Dec 16 13:39:46.933670 dockerd[2092]: time="2025-12-16T13:39:46.933621203Z" level=info msg="Loading containers: done." 
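The "No rules" report from augenrules near the top of this block follows directly from the sudo commands just above it, which removed the last rule fragments under /etc/audit/rules.d before restarting audit-rules. For reference, a fragment in that directory is a plain auditctl rule list; a hypothetical example:

    # /etc/audit/rules.d/10-example.rules
    # Watch sshd_config for writes and attribute changes, keyed for search:
    -w /etc/ssh/sshd_config -p wa -k sshd_config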
Dec 16 13:39:46.945573 dockerd[2092]: time="2025-12-16T13:39:46.945499666Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 16 13:39:46.945765 dockerd[2092]: time="2025-12-16T13:39:46.945587073Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 16 13:39:46.945765 dockerd[2092]: time="2025-12-16T13:39:46.945647064Z" level=info msg="Initializing buildkit" Dec 16 13:39:46.945729 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1445653064-merged.mount: Deactivated successfully. Dec 16 13:39:46.975592 dockerd[2092]: time="2025-12-16T13:39:46.975536218Z" level=info msg="Completed buildkit initialization" Dec 16 13:39:46.979806 dockerd[2092]: time="2025-12-16T13:39:46.979750224Z" level=info msg="Daemon has completed initialization" Dec 16 13:39:46.980101 dockerd[2092]: time="2025-12-16T13:39:46.980024346Z" level=info msg="API listen on /run/docker.sock" Dec 16 13:39:46.980157 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 16 13:39:48.271078 containerd[1779]: time="2025-12-16T13:39:48.271004580Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Dec 16 13:39:48.616950 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 16 13:39:48.618156 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:39:48.768728 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:39:48.772934 (kubelet)[2324]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:39:48.812442 kubelet[2324]: E1216 13:39:48.812392 2324 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:39:48.814759 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:39:48.814886 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:39:48.815201 systemd[1]: kubelet.service: Consumed 142ms CPU time, 113M memory peak. Dec 16 13:39:48.945718 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4278360849.mount: Deactivated successfully. 
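Per the "API listen on /run/docker.sock" line above, the daemon is reachable only through its Unix socket; no TCP listener is configured. Two quick ways to confirm from a shell:

    # Raw API over the announced socket:
    curl -s --unix-socket /run/docker.sock http://localhost/version
    # The CLI defaults to the same socket:
    docker version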
Dec 16 13:39:49.886599 containerd[1779]: time="2025-12-16T13:39:49.886533830Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:49.887847 containerd[1779]: time="2025-12-16T13:39:49.887822371Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=30114810" Dec 16 13:39:49.889271 containerd[1779]: time="2025-12-16T13:39:49.889233967Z" level=info msg="ImageCreate event name:\"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:49.892978 containerd[1779]: time="2025-12-16T13:39:49.892746036Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:49.893437 containerd[1779]: time="2025-12-16T13:39:49.893417415Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"30111311\" in 1.622362135s" Dec 16 13:39:49.893473 containerd[1779]: time="2025-12-16T13:39:49.893445550Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\"" Dec 16 13:39:49.893930 containerd[1779]: time="2025-12-16T13:39:49.893906586Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\"" Dec 16 13:39:51.052644 containerd[1779]: time="2025-12-16T13:39:51.052573549Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:51.053880 containerd[1779]: time="2025-12-16T13:39:51.053847417Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=26016801" Dec 16 13:39:51.055853 containerd[1779]: time="2025-12-16T13:39:51.055803067Z" level=info msg="ImageCreate event name:\"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:51.059706 containerd[1779]: time="2025-12-16T13:39:51.059662025Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:51.060860 containerd[1779]: time="2025-12-16T13:39:51.060811780Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"27673815\" in 1.166876336s" Dec 16 13:39:51.060860 containerd[1779]: time="2025-12-16T13:39:51.060855676Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference \"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\"" Dec 16 13:39:51.061272 
containerd[1779]: time="2025-12-16T13:39:51.061250681Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\"" Dec 16 13:39:52.089335 containerd[1779]: time="2025-12-16T13:39:52.089278822Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:52.090581 containerd[1779]: time="2025-12-16T13:39:52.090528996Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=20158122" Dec 16 13:39:52.092296 containerd[1779]: time="2025-12-16T13:39:52.092267713Z" level=info msg="ImageCreate event name:\"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:52.095811 containerd[1779]: time="2025-12-16T13:39:52.095737567Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:52.096652 containerd[1779]: time="2025-12-16T13:39:52.096622608Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"21815154\" in 1.035340203s" Dec 16 13:39:52.096695 containerd[1779]: time="2025-12-16T13:39:52.096656272Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\"" Dec 16 13:39:52.097053 containerd[1779]: time="2025-12-16T13:39:52.097036658Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\"" Dec 16 13:39:53.020104 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3279838222.mount: Deactivated successfully. 
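Dividing the logged image sizes by the pull durations gives the effective download rates for the three control-plane images pulled so far (simple arithmetic on the figures reported above; sizes are the repo-digest sizes containerd prints):

    # kube-apiserver:          30111311 B / 1.622 s ≈ 18.6 MB/s
    # kube-controller-manager: 27673815 B / 1.167 s ≈ 23.7 MB/s
    # kube-scheduler:          21815154 B / 1.035 s ≈ 21.1 MB/s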
Dec 16 13:39:53.339184 containerd[1779]: time="2025-12-16T13:39:53.338610438Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:53.339882 containerd[1779]: time="2025-12-16T13:39:53.339857907Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=31930122" Dec 16 13:39:53.341577 containerd[1779]: time="2025-12-16T13:39:53.341543934Z" level=info msg="ImageCreate event name:\"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:53.344074 containerd[1779]: time="2025-12-16T13:39:53.344044295Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:53.344447 containerd[1779]: time="2025-12-16T13:39:53.344424460Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"31929115\" in 1.247362025s" Dec 16 13:39:53.344491 containerd[1779]: time="2025-12-16T13:39:53.344453032Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\"" Dec 16 13:39:53.344854 containerd[1779]: time="2025-12-16T13:39:53.344829229Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Dec 16 13:39:53.939500 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4038805724.mount: Deactivated successfully. 
Dec 16 13:39:54.600861 containerd[1779]: time="2025-12-16T13:39:54.600782789Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:54.602888 containerd[1779]: time="2025-12-16T13:39:54.602604045Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942330" Dec 16 13:39:54.604184 containerd[1779]: time="2025-12-16T13:39:54.604131678Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:54.607323 containerd[1779]: time="2025-12-16T13:39:54.607155335Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:54.607917 containerd[1779]: time="2025-12-16T13:39:54.607893802Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.263033319s" Dec 16 13:39:54.607950 containerd[1779]: time="2025-12-16T13:39:54.607918102Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Dec 16 13:39:54.608324 containerd[1779]: time="2025-12-16T13:39:54.608285131Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Dec 16 13:39:55.187155 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2425707894.mount: Deactivated successfully. 
Dec 16 13:39:55.197458 containerd[1779]: time="2025-12-16T13:39:55.197389493Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 13:39:55.198819 containerd[1779]: time="2025-12-16T13:39:55.198774085Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321158" Dec 16 13:39:55.200578 containerd[1779]: time="2025-12-16T13:39:55.200512091Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 13:39:55.203135 containerd[1779]: time="2025-12-16T13:39:55.203085462Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 13:39:55.203608 containerd[1779]: time="2025-12-16T13:39:55.203562518Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 595.223367ms" Dec 16 13:39:55.203608 containerd[1779]: time="2025-12-16T13:39:55.203595653Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Dec 16 13:39:55.204083 containerd[1779]: time="2025-12-16T13:39:55.204039684Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Dec 16 13:39:55.835762 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1985766929.mount: Deactivated successfully. 
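Note that the pause image's ImageCreate events carry an extra label the other images lack, io.cri-containerd.pinned=pinned: the sandbox image must survive image garbage collection because every pod sandbox on the node runs it. A short sketch (same client and import caveat as the pull example above) that filters the image store for pinned entries:

```go
package main

import (
	"context"
	"fmt"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Image-store records carry the labels shown in the ImageCreate events;
	// "io.cri-containerd.pinned=pinned" is what exempts pause from image GC.
	imgs, err := client.ImageService().List(ctx)
	if err != nil {
		log.Fatal(err)
	}
	for _, img := range imgs {
		if img.Labels["io.cri-containerd.pinned"] == "pinned" {
			fmt.Println("pinned:", img.Name)
		}
	}
}
```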
Dec 16 13:39:57.520672 containerd[1779]: time="2025-12-16T13:39:57.520614337Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:57.522348 containerd[1779]: time="2025-12-16T13:39:57.522317610Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58926289" Dec 16 13:39:57.524710 containerd[1779]: time="2025-12-16T13:39:57.524631547Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:57.527687 containerd[1779]: time="2025-12-16T13:39:57.527627883Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:39:57.528482 containerd[1779]: time="2025-12-16T13:39:57.528450818Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 2.324379852s" Dec 16 13:39:57.528690 containerd[1779]: time="2025-12-16T13:39:57.528589155Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Dec 16 13:39:58.866996 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Dec 16 13:39:58.868365 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:39:58.996235 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:39:58.999853 (kubelet)[2558]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:39:59.033428 kubelet[2558]: E1216 13:39:59.033386 2558 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:39:59.035885 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:39:59.036018 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:39:59.036334 systemd[1]: kubelet.service: Consumed 138ms CPU time, 111M memory peak. Dec 16 13:39:59.597665 update_engine[1764]: I20251216 13:39:59.597540 1764 update_attempter.cc:509] Updating boot flags... Dec 16 13:40:00.985530 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:40:00.985690 systemd[1]: kubelet.service: Consumed 138ms CPU time, 111M memory peak. Dec 16 13:40:00.987894 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:40:01.010153 systemd[1]: Reload requested from client PID 2592 ('systemctl') (unit session-9.scope)... Dec 16 13:40:01.010168 systemd[1]: Reloading... Dec 16 13:40:01.070605 zram_generator::config[2635]: No configuration found. Dec 16 13:40:01.257532 systemd[1]: Reloading finished in 247 ms. Dec 16 13:40:01.321978 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
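The kubelet exit at 13:39:59 is expected at this stage: /var/lib/kubelet/config.yaml does not exist yet. On a kubeadm-provisioned node that file is written during kubeadm init/join, so until then the unit simply crash-loops under its systemd restart policy (the counter above is already at 4). For illustration, a sketch that writes a minimal file of that kind; the two settings shown are the ones this log later confirms (cgroupDriver=systemd, static pod path /etc/kubernetes/manifests), and a real kubeadm-generated file carries many more fields:

```go
package main

import (
	"log"
	"os"
)

// Illustrative minimal KubeletConfiguration; the real file written by kubeadm
// also carries authentication, eviction, and rotation settings, among others.
const minimalConfig = `apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd
staticPodPath: /etc/kubernetes/manifests
`

func main() {
	if err := os.WriteFile("/var/lib/kubelet/config.yaml", []byte(minimalConfig), 0o600); err != nil {
		log.Fatal(err)
	}
}
```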
Dec 16 13:40:01.324665 systemd[1]: kubelet.service: Deactivated successfully. Dec 16 13:40:01.324876 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:40:01.324926 systemd[1]: kubelet.service: Consumed 98ms CPU time, 98.7M memory peak. Dec 16 13:40:01.326171 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:40:01.443436 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:40:01.447288 (kubelet)[2691]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 13:40:01.484972 kubelet[2691]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 13:40:01.484972 kubelet[2691]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 13:40:01.484972 kubelet[2691]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 13:40:01.485355 kubelet[2691]: I1216 13:40:01.485005 2691 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 13:40:01.823574 kubelet[2691]: I1216 13:40:01.823502 2691 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Dec 16 13:40:01.823574 kubelet[2691]: I1216 13:40:01.823532 2691 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 13:40:01.823790 kubelet[2691]: I1216 13:40:01.823772 2691 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 13:40:01.859989 kubelet[2691]: I1216 13:40:01.859822 2691 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 13:40:01.860950 kubelet[2691]: E1216 13:40:01.860832 2691 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.21.93:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.21.93:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Dec 16 13:40:01.868276 kubelet[2691]: I1216 13:40:01.868230 2691 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 13:40:01.877511 kubelet[2691]: I1216 13:40:01.876927 2691 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Dec 16 13:40:01.877511 kubelet[2691]: I1216 13:40:01.877151 2691 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 13:40:01.877511 kubelet[2691]: I1216 13:40:01.877174 2691 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-a-7f096d1947","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 13:40:01.877511 kubelet[2691]: I1216 13:40:01.877337 2691 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 13:40:01.877737 kubelet[2691]: I1216 13:40:01.877345 2691 container_manager_linux.go:303] "Creating device plugin manager" Dec 16 13:40:01.878623 kubelet[2691]: I1216 13:40:01.878610 2691 state_mem.go:36] "Initialized new in-memory state store" Dec 16 13:40:01.882568 kubelet[2691]: I1216 13:40:01.882527 2691 kubelet.go:480] "Attempting to sync node with API server" Dec 16 13:40:01.882696 kubelet[2691]: I1216 13:40:01.882687 2691 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 13:40:01.882769 kubelet[2691]: I1216 13:40:01.882764 2691 kubelet.go:386] "Adding apiserver pod source" Dec 16 13:40:01.882856 kubelet[2691]: I1216 13:40:01.882850 2691 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 13:40:01.889703 kubelet[2691]: I1216 13:40:01.889662 2691 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 16 13:40:01.890177 kubelet[2691]: I1216 13:40:01.890155 2691 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 13:40:01.891192 kubelet[2691]: E1216 13:40:01.891157 2691 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.21.93:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-a-7f096d1947&limit=500&resourceVersion=0\": dial tcp 10.0.21.93:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.Node" Dec 16 13:40:01.891192 kubelet[2691]: E1216 13:40:01.891157 2691 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.21.93:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.21.93:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 16 13:40:01.891770 kubelet[2691]: W1216 13:40:01.891746 2691 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Dec 16 13:40:01.894044 kubelet[2691]: I1216 13:40:01.894015 2691 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 16 13:40:01.894104 kubelet[2691]: I1216 13:40:01.894070 2691 server.go:1289] "Started kubelet" Dec 16 13:40:01.895716 kubelet[2691]: I1216 13:40:01.895673 2691 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 13:40:01.895941 kubelet[2691]: I1216 13:40:01.895919 2691 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 13:40:01.896027 kubelet[2691]: I1216 13:40:01.896009 2691 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 16 13:40:01.896101 kubelet[2691]: I1216 13:40:01.896087 2691 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 13:40:01.896235 kubelet[2691]: E1216 13:40:01.896221 2691 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-2-a-7f096d1947\" not found" Dec 16 13:40:01.896390 kubelet[2691]: I1216 13:40:01.896383 2691 reconciler.go:26] "Reconciler: start to sync state" Dec 16 13:40:01.896438 kubelet[2691]: I1216 13:40:01.896434 2691 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 16 13:40:01.896531 kubelet[2691]: I1216 13:40:01.896472 2691 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 13:40:01.896808 kubelet[2691]: I1216 13:40:01.896788 2691 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 13:40:01.896912 kubelet[2691]: E1216 13:40:01.896898 2691 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.21.93:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.21.93:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 13:40:01.897014 kubelet[2691]: E1216 13:40:01.896974 2691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.93:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-a-7f096d1947?timeout=10s\": dial tcp 10.0.21.93:6443: connect: connection refused" interval="200ms" Dec 16 13:40:01.897014 kubelet[2691]: I1216 13:40:01.896986 2691 server.go:317] "Adding debug handlers to kubelet server" Dec 16 13:40:01.897655 kubelet[2691]: I1216 13:40:01.897261 2691 factory.go:223] Registration of the systemd container factory successfully Dec 16 13:40:01.897655 kubelet[2691]: I1216 13:40:01.897337 2691 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 13:40:01.898021 kubelet[2691]: E1216 13:40:01.897984 2691 kubelet.go:1600] "Image 
garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 13:40:01.898227 kubelet[2691]: I1216 13:40:01.898211 2691 factory.go:223] Registration of the containerd container factory successfully Dec 16 13:40:01.907699 kubelet[2691]: E1216 13:40:01.905439 2691 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.21.93:6443/api/v1/namespaces/default/events\": dial tcp 10.0.21.93:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-a-7f096d1947.1881b5c88e748b3f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-a-7f096d1947,UID:ci-4459-2-2-a-7f096d1947,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-a-7f096d1947,},FirstTimestamp:2025-12-16 13:40:01.894034239 +0000 UTC m=+0.443493440,LastTimestamp:2025-12-16 13:40:01.894034239 +0000 UTC m=+0.443493440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-a-7f096d1947,}" Dec 16 13:40:01.914396 kubelet[2691]: I1216 13:40:01.914344 2691 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 13:40:01.914396 kubelet[2691]: I1216 13:40:01.914365 2691 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 13:40:01.914396 kubelet[2691]: I1216 13:40:01.914380 2691 state_mem.go:36] "Initialized new in-memory state store" Dec 16 13:40:01.918306 kubelet[2691]: I1216 13:40:01.918276 2691 policy_none.go:49] "None policy: Start" Dec 16 13:40:01.918306 kubelet[2691]: I1216 13:40:01.918300 2691 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 16 13:40:01.918306 kubelet[2691]: I1216 13:40:01.918311 2691 state_mem.go:35] "Initializing new in-memory state store" Dec 16 13:40:01.920536 kubelet[2691]: I1216 13:40:01.920493 2691 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Dec 16 13:40:01.921646 kubelet[2691]: I1216 13:40:01.921623 2691 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Dec 16 13:40:01.921700 kubelet[2691]: I1216 13:40:01.921657 2691 status_manager.go:230] "Starting to sync pod status with apiserver" Dec 16 13:40:01.921700 kubelet[2691]: I1216 13:40:01.921676 2691 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 13:40:01.921700 kubelet[2691]: I1216 13:40:01.921685 2691 kubelet.go:2436] "Starting kubelet main sync loop" Dec 16 13:40:01.921765 kubelet[2691]: E1216 13:40:01.921723 2691 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 13:40:01.922843 kubelet[2691]: E1216 13:40:01.922161 2691 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.21.93:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.21.93:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 16 13:40:01.925410 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 16 13:40:01.944633 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
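After "Started kubelet", the server listens on 0.0.0.0:10250 (TLS, authenticated) and, by default, also exposes a plaintext liveness endpoint on localhost. A sketch probing the latter, assuming the default healthz port 10248; the port is configurable, so treating it as 10248 on this host is an assumption:

```go
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	// Default kubelet healthz endpoint (127.0.0.1:10248); the main serving
	// port 10250 from the log requires TLS and client authentication.
	resp, err := http.Get("http://127.0.0.1:10248/healthz")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.StatusCode, string(body)) // expect: 200 ok
}
```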
Dec 16 13:40:01.947483 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 16 13:40:01.968491 kubelet[2691]: E1216 13:40:01.968462 2691 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 13:40:01.968681 kubelet[2691]: I1216 13:40:01.968663 2691 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 13:40:01.968710 kubelet[2691]: I1216 13:40:01.968675 2691 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 13:40:01.968896 kubelet[2691]: I1216 13:40:01.968853 2691 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 13:40:01.970073 kubelet[2691]: E1216 13:40:01.969992 2691 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 16 13:40:01.970073 kubelet[2691]: E1216 13:40:01.970031 2691 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-a-7f096d1947\" not found" Dec 16 13:40:02.033386 systemd[1]: Created slice kubepods-burstable-podb8327394c4bbf31f869ef4b2b5575c90.slice - libcontainer container kubepods-burstable-podb8327394c4bbf31f869ef4b2b5575c90.slice. Dec 16 13:40:02.057601 kubelet[2691]: E1216 13:40:02.057528 2691 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-a-7f096d1947\" not found" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.060946 systemd[1]: Created slice kubepods-burstable-poda5de365130e76bebbd64bed8193f592b.slice - libcontainer container kubepods-burstable-poda5de365130e76bebbd64bed8193f592b.slice. Dec 16 13:40:02.070192 kubelet[2691]: I1216 13:40:02.070160 2691 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.070515 kubelet[2691]: E1216 13:40:02.070491 2691 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.21.93:6443/api/v1/nodes\": dial tcp 10.0.21.93:6443: connect: connection refused" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.082831 kubelet[2691]: E1216 13:40:02.082731 2691 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-a-7f096d1947\" not found" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.085893 systemd[1]: Created slice kubepods-burstable-pod28636bb0ff25a91b3ddd483b31735e1b.slice - libcontainer container kubepods-burstable-pod28636bb0ff25a91b3ddd483b31735e1b.slice. 
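Every "connection refused" in this stretch — the reflector list failures, the lease controller retries, the event post, and the node registration attempt — is the same symptom: nothing is serving on 10.0.21.93:6443 until the kube-apiserver static pod (created just below) comes up. The kubelet's reflector is doing the equivalent of this client-go call; a sketch, assuming the standard kubeadm kubelet kubeconfig path /etc/kubernetes/kubelet.conf:

```go
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed path: the kubeconfig kubeadm hands the kubelet.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubelet.conf")
	if err != nil {
		log.Fatal(err)
	}
	clientset, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// The same LIST the reflector issues; it fails with "connection refused"
	// until the kube-apiserver static pod is serving on :6443.
	nodes, err := clientset.CoreV1().Nodes().List(context.Background(), metav1.ListOptions{})
	if err != nil {
		log.Fatal(err) // e.g. dial tcp 10.0.21.93:6443: connect: connection refused
	}
	for _, n := range nodes.Items {
		fmt.Println(n.Name)
	}
}
```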
Dec 16 13:40:02.087562 kubelet[2691]: E1216 13:40:02.087517 2691 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-a-7f096d1947\" not found" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.098122 kubelet[2691]: I1216 13:40:02.098019 2691 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b8327394c4bbf31f869ef4b2b5575c90-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-a-7f096d1947\" (UID: \"b8327394c4bbf31f869ef4b2b5575c90\") " pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.098122 kubelet[2691]: I1216 13:40:02.098057 2691 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b8327394c4bbf31f869ef4b2b5575c90-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-a-7f096d1947\" (UID: \"b8327394c4bbf31f869ef4b2b5575c90\") " pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.098122 kubelet[2691]: I1216 13:40:02.098083 2691 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a5de365130e76bebbd64bed8193f592b-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-a-7f096d1947\" (UID: \"a5de365130e76bebbd64bed8193f592b\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.098122 kubelet[2691]: I1216 13:40:02.098099 2691 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a5de365130e76bebbd64bed8193f592b-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-a-7f096d1947\" (UID: \"a5de365130e76bebbd64bed8193f592b\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.098122 kubelet[2691]: I1216 13:40:02.098118 2691 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a5de365130e76bebbd64bed8193f592b-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-a-7f096d1947\" (UID: \"a5de365130e76bebbd64bed8193f592b\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.098309 kubelet[2691]: I1216 13:40:02.098135 2691 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a5de365130e76bebbd64bed8193f592b-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-a-7f096d1947\" (UID: \"a5de365130e76bebbd64bed8193f592b\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.098309 kubelet[2691]: I1216 13:40:02.098150 2691 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b8327394c4bbf31f869ef4b2b5575c90-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-a-7f096d1947\" (UID: \"b8327394c4bbf31f869ef4b2b5575c90\") " pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.098309 kubelet[2691]: I1216 13:40:02.098165 2691 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a5de365130e76bebbd64bed8193f592b-flexvolume-dir\") pod 
\"kube-controller-manager-ci-4459-2-2-a-7f096d1947\" (UID: \"a5de365130e76bebbd64bed8193f592b\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.098309 kubelet[2691]: I1216 13:40:02.098179 2691 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/28636bb0ff25a91b3ddd483b31735e1b-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-a-7f096d1947\" (UID: \"28636bb0ff25a91b3ddd483b31735e1b\") " pod="kube-system/kube-scheduler-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.098730 kubelet[2691]: E1216 13:40:02.098709 2691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.93:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-a-7f096d1947?timeout=10s\": dial tcp 10.0.21.93:6443: connect: connection refused" interval="400ms" Dec 16 13:40:02.272625 kubelet[2691]: I1216 13:40:02.272536 2691 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.272983 kubelet[2691]: E1216 13:40:02.272942 2691 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.21.93:6443/api/v1/nodes\": dial tcp 10.0.21.93:6443: connect: connection refused" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.359481 containerd[1779]: time="2025-12-16T13:40:02.359343009Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-a-7f096d1947,Uid:b8327394c4bbf31f869ef4b2b5575c90,Namespace:kube-system,Attempt:0,}" Dec 16 13:40:02.384751 containerd[1779]: time="2025-12-16T13:40:02.384442127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-a-7f096d1947,Uid:a5de365130e76bebbd64bed8193f592b,Namespace:kube-system,Attempt:0,}" Dec 16 13:40:02.385058 containerd[1779]: time="2025-12-16T13:40:02.385026603Z" level=info msg="connecting to shim 6932abcc69ebb7e3dd31ad7720f9abb37eef316f7b252d725b98e66972a2555a" address="unix:///run/containerd/s/f20a01c57c7d55f2f6210331e3a0f4866f3ab7fbc17dba540bc68c57a8f92e1e" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:02.389257 containerd[1779]: time="2025-12-16T13:40:02.389214777Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-a-7f096d1947,Uid:28636bb0ff25a91b3ddd483b31735e1b,Namespace:kube-system,Attempt:0,}" Dec 16 13:40:02.410805 systemd[1]: Started cri-containerd-6932abcc69ebb7e3dd31ad7720f9abb37eef316f7b252d725b98e66972a2555a.scope - libcontainer container 6932abcc69ebb7e3dd31ad7720f9abb37eef316f7b252d725b98e66972a2555a. Dec 16 13:40:02.413058 containerd[1779]: time="2025-12-16T13:40:02.413026249Z" level=info msg="connecting to shim 99f6b13f3e6707e1e9d9ee2a7b4d2967131b90bef5366a7369a89022dc04ef34" address="unix:///run/containerd/s/c6ad5dc5a43cbb07989c2700f1f8f2a25dac783f22e92e4e9551f9678c1b5712" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:02.423438 containerd[1779]: time="2025-12-16T13:40:02.422784193Z" level=info msg="connecting to shim 43fc1367cb51f2a35f04018d18cf8662b84552019a6d3e3586c0a53a110e5962" address="unix:///run/containerd/s/ce506c5368c044f1040c711d785f84ddaf37c388bd90948411e203799d3e4610" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:02.436727 systemd[1]: Started cri-containerd-99f6b13f3e6707e1e9d9ee2a7b4d2967131b90bef5366a7369a89022dc04ef34.scope - libcontainer container 99f6b13f3e6707e1e9d9ee2a7b4d2967131b90bef5366a7369a89022dc04ef34. 
Dec 16 13:40:02.439001 systemd[1]: Started cri-containerd-43fc1367cb51f2a35f04018d18cf8662b84552019a6d3e3586c0a53a110e5962.scope - libcontainer container 43fc1367cb51f2a35f04018d18cf8662b84552019a6d3e3586c0a53a110e5962. Dec 16 13:40:02.460024 containerd[1779]: time="2025-12-16T13:40:02.459984375Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-a-7f096d1947,Uid:b8327394c4bbf31f869ef4b2b5575c90,Namespace:kube-system,Attempt:0,} returns sandbox id \"6932abcc69ebb7e3dd31ad7720f9abb37eef316f7b252d725b98e66972a2555a\"" Dec 16 13:40:02.464983 containerd[1779]: time="2025-12-16T13:40:02.464328622Z" level=info msg="CreateContainer within sandbox \"6932abcc69ebb7e3dd31ad7720f9abb37eef316f7b252d725b98e66972a2555a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 16 13:40:02.477156 containerd[1779]: time="2025-12-16T13:40:02.477122233Z" level=info msg="Container 9834abdb3cdfe3ce070baa5049c5ff831e9caec06fd6a60aad7b84dadaaa0783: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:40:02.486346 containerd[1779]: time="2025-12-16T13:40:02.486304936Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-a-7f096d1947,Uid:28636bb0ff25a91b3ddd483b31735e1b,Namespace:kube-system,Attempt:0,} returns sandbox id \"43fc1367cb51f2a35f04018d18cf8662b84552019a6d3e3586c0a53a110e5962\"" Dec 16 13:40:02.490237 containerd[1779]: time="2025-12-16T13:40:02.489960193Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-a-7f096d1947,Uid:a5de365130e76bebbd64bed8193f592b,Namespace:kube-system,Attempt:0,} returns sandbox id \"99f6b13f3e6707e1e9d9ee2a7b4d2967131b90bef5366a7369a89022dc04ef34\"" Dec 16 13:40:02.490898 containerd[1779]: time="2025-12-16T13:40:02.490880752Z" level=info msg="CreateContainer within sandbox \"6932abcc69ebb7e3dd31ad7720f9abb37eef316f7b252d725b98e66972a2555a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"9834abdb3cdfe3ce070baa5049c5ff831e9caec06fd6a60aad7b84dadaaa0783\"" Dec 16 13:40:02.491039 containerd[1779]: time="2025-12-16T13:40:02.491018417Z" level=info msg="CreateContainer within sandbox \"43fc1367cb51f2a35f04018d18cf8662b84552019a6d3e3586c0a53a110e5962\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 16 13:40:02.491658 containerd[1779]: time="2025-12-16T13:40:02.491631498Z" level=info msg="StartContainer for \"9834abdb3cdfe3ce070baa5049c5ff831e9caec06fd6a60aad7b84dadaaa0783\"" Dec 16 13:40:02.493521 containerd[1779]: time="2025-12-16T13:40:02.493497998Z" level=info msg="connecting to shim 9834abdb3cdfe3ce070baa5049c5ff831e9caec06fd6a60aad7b84dadaaa0783" address="unix:///run/containerd/s/f20a01c57c7d55f2f6210331e3a0f4866f3ab7fbc17dba540bc68c57a8f92e1e" protocol=ttrpc version=3 Dec 16 13:40:02.494345 containerd[1779]: time="2025-12-16T13:40:02.494319261Z" level=info msg="CreateContainer within sandbox \"99f6b13f3e6707e1e9d9ee2a7b4d2967131b90bef5366a7369a89022dc04ef34\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 16 13:40:02.500018 kubelet[2691]: E1216 13:40:02.499975 2691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.93:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-a-7f096d1947?timeout=10s\": dial tcp 10.0.21.93:6443: connect: connection refused" interval="800ms" Dec 16 13:40:02.502131 containerd[1779]: time="2025-12-16T13:40:02.502093238Z" level=info msg="Container 
bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:40:02.510104 containerd[1779]: time="2025-12-16T13:40:02.510054330Z" level=info msg="Container d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:40:02.514686 containerd[1779]: time="2025-12-16T13:40:02.514653851Z" level=info msg="CreateContainer within sandbox \"43fc1367cb51f2a35f04018d18cf8662b84552019a6d3e3586c0a53a110e5962\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05\"" Dec 16 13:40:02.515319 containerd[1779]: time="2025-12-16T13:40:02.515067331Z" level=info msg="StartContainer for \"bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05\"" Dec 16 13:40:02.516564 containerd[1779]: time="2025-12-16T13:40:02.516533403Z" level=info msg="connecting to shim bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05" address="unix:///run/containerd/s/ce506c5368c044f1040c711d785f84ddaf37c388bd90948411e203799d3e4610" protocol=ttrpc version=3 Dec 16 13:40:02.518194 containerd[1779]: time="2025-12-16T13:40:02.518157336Z" level=info msg="CreateContainer within sandbox \"99f6b13f3e6707e1e9d9ee2a7b4d2967131b90bef5366a7369a89022dc04ef34\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb\"" Dec 16 13:40:02.518581 containerd[1779]: time="2025-12-16T13:40:02.518546564Z" level=info msg="StartContainer for \"d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb\"" Dec 16 13:40:02.519358 containerd[1779]: time="2025-12-16T13:40:02.519337278Z" level=info msg="connecting to shim d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb" address="unix:///run/containerd/s/c6ad5dc5a43cbb07989c2700f1f8f2a25dac783f22e92e4e9551f9678c1b5712" protocol=ttrpc version=3 Dec 16 13:40:02.519766 systemd[1]: Started cri-containerd-9834abdb3cdfe3ce070baa5049c5ff831e9caec06fd6a60aad7b84dadaaa0783.scope - libcontainer container 9834abdb3cdfe3ce070baa5049c5ff831e9caec06fd6a60aad7b84dadaaa0783. Dec 16 13:40:02.552879 systemd[1]: Started cri-containerd-d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb.scope - libcontainer container d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb. Dec 16 13:40:02.556856 systemd[1]: Started cri-containerd-bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05.scope - libcontainer container bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05. 
Dec 16 13:40:02.592758 containerd[1779]: time="2025-12-16T13:40:02.592652824Z" level=info msg="StartContainer for \"9834abdb3cdfe3ce070baa5049c5ff831e9caec06fd6a60aad7b84dadaaa0783\" returns successfully" Dec 16 13:40:02.605502 containerd[1779]: time="2025-12-16T13:40:02.605434028Z" level=info msg="StartContainer for \"bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05\" returns successfully" Dec 16 13:40:02.605692 containerd[1779]: time="2025-12-16T13:40:02.605529753Z" level=info msg="StartContainer for \"d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb\" returns successfully" Dec 16 13:40:02.675132 kubelet[2691]: I1216 13:40:02.675100 2691 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.927690 kubelet[2691]: E1216 13:40:02.927501 2691 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-a-7f096d1947\" not found" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.929902 kubelet[2691]: E1216 13:40:02.929809 2691 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-a-7f096d1947\" not found" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:02.934817 kubelet[2691]: E1216 13:40:02.934784 2691 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-a-7f096d1947\" not found" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:03.645566 kubelet[2691]: E1216 13:40:03.645356 2691 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-2-a-7f096d1947\" not found" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:03.748112 kubelet[2691]: I1216 13:40:03.748046 2691 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:03.797293 kubelet[2691]: I1216 13:40:03.797236 2691 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:03.804280 kubelet[2691]: E1216 13:40:03.804218 2691 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-a-7f096d1947\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:03.804280 kubelet[2691]: I1216 13:40:03.804246 2691 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:03.805603 kubelet[2691]: E1216 13:40:03.805562 2691 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-a-7f096d1947\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:03.805603 kubelet[2691]: I1216 13:40:03.805590 2691 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:03.806904 kubelet[2691]: E1216 13:40:03.806871 2691 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-a-7f096d1947\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:03.888104 kubelet[2691]: I1216 13:40:03.888063 2691 apiserver.go:52] "Watching apiserver" Dec 16 13:40:03.897250 kubelet[2691]: I1216 13:40:03.897155 2691 
desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 16 13:40:03.937602 kubelet[2691]: I1216 13:40:03.937367 2691 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:03.937602 kubelet[2691]: I1216 13:40:03.937474 2691 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:03.939014 kubelet[2691]: E1216 13:40:03.938983 2691 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-a-7f096d1947\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:03.939088 kubelet[2691]: E1216 13:40:03.939045 2691 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-a-7f096d1947\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:04.015384 kubelet[2691]: I1216 13:40:04.015323 2691 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:04.017158 kubelet[2691]: E1216 13:40:04.017125 2691 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-a-7f096d1947\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:05.907672 systemd[1]: Reload requested from client PID 2985 ('systemctl') (unit session-9.scope)... Dec 16 13:40:05.907688 systemd[1]: Reloading... Dec 16 13:40:05.962589 zram_generator::config[3025]: No configuration found. Dec 16 13:40:06.155995 systemd[1]: Reloading finished in 248 ms. Dec 16 13:40:06.189758 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:40:06.197069 systemd[1]: kubelet.service: Deactivated successfully. Dec 16 13:40:06.197265 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:40:06.197311 systemd[1]: kubelet.service: Consumed 789ms CPU time, 136.8M memory peak. Dec 16 13:40:06.199296 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:40:06.325246 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:40:06.329524 (kubelet)[3080]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 13:40:06.365676 kubelet[3080]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 13:40:06.365676 kubelet[3080]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 13:40:06.365676 kubelet[3080]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
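The mirror-pod rejections at 13:40:03 ("no PriorityClass with name system-node-critical was found") are transient: system-node-critical and system-cluster-critical are built-in classes that the API server creates itself shortly after it starts serving, and by 13:40:07 below the same mirror pods fail only with "already exists". A sketch that reads the class back once the control plane is up, assuming the kubeadm admin kubeconfig at /etc/kubernetes/admin.conf:

```go
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed path: the admin kubeconfig kubeadm writes on the control plane.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// The built-in class the static pods' mirror pods require; the API server
	// bootstraps it at startup, which is why the errors above resolve on retry.
	pc, err := cs.SchedulingV1().PriorityClasses().Get(context.Background(),
		"system-node-critical", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(pc.Name, pc.Value) // system-node-critical 2000001000
}
```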
Dec 16 13:40:06.365974 kubelet[3080]: I1216 13:40:06.365746 3080 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 13:40:06.373324 kubelet[3080]: I1216 13:40:06.373296 3080 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Dec 16 13:40:06.374509 kubelet[3080]: I1216 13:40:06.373383 3080 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 13:40:06.374509 kubelet[3080]: I1216 13:40:06.373621 3080 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 13:40:06.374704 kubelet[3080]: I1216 13:40:06.374688 3080 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Dec 16 13:40:06.380348 kubelet[3080]: I1216 13:40:06.380311 3080 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 13:40:06.384808 kubelet[3080]: I1216 13:40:06.384788 3080 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 13:40:06.389906 kubelet[3080]: I1216 13:40:06.389860 3080 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Dec 16 13:40:06.390104 kubelet[3080]: I1216 13:40:06.390071 3080 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 13:40:06.390242 kubelet[3080]: I1216 13:40:06.390101 3080 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-a-7f096d1947","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 13:40:06.390323 kubelet[3080]: I1216 13:40:06.390248 3080 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 13:40:06.390323 kubelet[3080]: I1216 13:40:06.390276 3080 container_manager_linux.go:303] "Creating device plugin manager" Dec 16 13:40:06.390323 kubelet[3080]: I1216 13:40:06.390320 3080 state_mem.go:36] "Initialized new in-memory state store" Dec 16 13:40:06.390485 kubelet[3080]: 
I1216 13:40:06.390476 3080 kubelet.go:480] "Attempting to sync node with API server" Dec 16 13:40:06.390509 kubelet[3080]: I1216 13:40:06.390488 3080 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 13:40:06.390509 kubelet[3080]: I1216 13:40:06.390507 3080 kubelet.go:386] "Adding apiserver pod source" Dec 16 13:40:06.390557 kubelet[3080]: I1216 13:40:06.390518 3080 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 13:40:06.391279 kubelet[3080]: I1216 13:40:06.391260 3080 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 16 13:40:06.393214 kubelet[3080]: I1216 13:40:06.393175 3080 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 13:40:06.395169 kubelet[3080]: I1216 13:40:06.395148 3080 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 16 13:40:06.395227 kubelet[3080]: I1216 13:40:06.395189 3080 server.go:1289] "Started kubelet" Dec 16 13:40:06.395327 kubelet[3080]: I1216 13:40:06.395296 3080 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 13:40:06.395568 kubelet[3080]: I1216 13:40:06.395421 3080 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 13:40:06.395703 kubelet[3080]: I1216 13:40:06.395691 3080 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 13:40:06.398756 kubelet[3080]: I1216 13:40:06.398738 3080 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 13:40:06.401482 kubelet[3080]: I1216 13:40:06.401447 3080 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 16 13:40:06.402176 kubelet[3080]: I1216 13:40:06.402152 3080 server.go:317] "Adding debug handlers to kubelet server" Dec 16 13:40:06.402245 kubelet[3080]: I1216 13:40:06.402224 3080 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 16 13:40:06.402458 kubelet[3080]: I1216 13:40:06.402439 3080 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 13:40:06.403656 kubelet[3080]: I1216 13:40:06.403637 3080 factory.go:223] Registration of the systemd container factory successfully Dec 16 13:40:06.403734 kubelet[3080]: I1216 13:40:06.403717 3080 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 13:40:06.404293 kubelet[3080]: I1216 13:40:06.404282 3080 reconciler.go:26] "Reconciler: start to sync state" Dec 16 13:40:06.404322 kubelet[3080]: I1216 13:40:06.404304 3080 factory.go:223] Registration of the containerd container factory successfully Dec 16 13:40:06.404996 kubelet[3080]: E1216 13:40:06.404729 3080 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 13:40:06.407906 kubelet[3080]: I1216 13:40:06.407848 3080 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Dec 16 13:40:06.413022 kubelet[3080]: I1216 13:40:06.412989 3080 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Dec 16 13:40:06.413022 kubelet[3080]: I1216 13:40:06.413011 3080 status_manager.go:230] "Starting to sync pod status with apiserver" Dec 16 13:40:06.413130 kubelet[3080]: I1216 13:40:06.413028 3080 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 13:40:06.413130 kubelet[3080]: I1216 13:40:06.413035 3080 kubelet.go:2436] "Starting kubelet main sync loop" Dec 16 13:40:06.413130 kubelet[3080]: E1216 13:40:06.413070 3080 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 13:40:06.429364 kubelet[3080]: I1216 13:40:06.429332 3080 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 13:40:06.429364 kubelet[3080]: I1216 13:40:06.429346 3080 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 13:40:06.429364 kubelet[3080]: I1216 13:40:06.429361 3080 state_mem.go:36] "Initialized new in-memory state store" Dec 16 13:40:06.429522 kubelet[3080]: I1216 13:40:06.429476 3080 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 16 13:40:06.429522 kubelet[3080]: I1216 13:40:06.429483 3080 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 16 13:40:06.429522 kubelet[3080]: I1216 13:40:06.429497 3080 policy_none.go:49] "None policy: Start" Dec 16 13:40:06.429522 kubelet[3080]: I1216 13:40:06.429506 3080 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 16 13:40:06.429522 kubelet[3080]: I1216 13:40:06.429514 3080 state_mem.go:35] "Initializing new in-memory state store" Dec 16 13:40:06.429640 kubelet[3080]: I1216 13:40:06.429598 3080 state_mem.go:75] "Updated machine memory state" Dec 16 13:40:06.432904 kubelet[3080]: E1216 13:40:06.432795 3080 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 13:40:06.432974 kubelet[3080]: I1216 13:40:06.432951 3080 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 13:40:06.433006 kubelet[3080]: I1216 13:40:06.432961 3080 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 13:40:06.433130 kubelet[3080]: I1216 13:40:06.433116 3080 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 13:40:06.433638 kubelet[3080]: E1216 13:40:06.433621 3080 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Dec 16 13:40:06.514777 kubelet[3080]: I1216 13:40:06.514686 3080 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.515428 kubelet[3080]: I1216 13:40:06.514689 3080 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.515428 kubelet[3080]: I1216 13:40:06.515077 3080 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.536977 kubelet[3080]: I1216 13:40:06.536947 3080 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.544397 kubelet[3080]: I1216 13:40:06.544366 3080 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.544514 kubelet[3080]: I1216 13:40:06.544443 3080 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.605535 kubelet[3080]: I1216 13:40:06.605375 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/28636bb0ff25a91b3ddd483b31735e1b-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-a-7f096d1947\" (UID: \"28636bb0ff25a91b3ddd483b31735e1b\") " pod="kube-system/kube-scheduler-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.605535 kubelet[3080]: I1216 13:40:06.605405 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b8327394c4bbf31f869ef4b2b5575c90-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-a-7f096d1947\" (UID: \"b8327394c4bbf31f869ef4b2b5575c90\") " pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.605535 kubelet[3080]: I1216 13:40:06.605424 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a5de365130e76bebbd64bed8193f592b-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-a-7f096d1947\" (UID: \"a5de365130e76bebbd64bed8193f592b\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.605535 kubelet[3080]: I1216 13:40:06.605439 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a5de365130e76bebbd64bed8193f592b-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-a-7f096d1947\" (UID: \"a5de365130e76bebbd64bed8193f592b\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.605535 kubelet[3080]: I1216 13:40:06.605452 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b8327394c4bbf31f869ef4b2b5575c90-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-a-7f096d1947\" (UID: \"b8327394c4bbf31f869ef4b2b5575c90\") " pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.605761 kubelet[3080]: I1216 13:40:06.605487 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b8327394c4bbf31f869ef4b2b5575c90-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-a-7f096d1947\" (UID: \"b8327394c4bbf31f869ef4b2b5575c90\") " 
pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.605761 kubelet[3080]: I1216 13:40:06.605536 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a5de365130e76bebbd64bed8193f592b-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-a-7f096d1947\" (UID: \"a5de365130e76bebbd64bed8193f592b\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.605761 kubelet[3080]: I1216 13:40:06.605578 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a5de365130e76bebbd64bed8193f592b-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-a-7f096d1947\" (UID: \"a5de365130e76bebbd64bed8193f592b\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:06.605761 kubelet[3080]: I1216 13:40:06.605596 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a5de365130e76bebbd64bed8193f592b-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-a-7f096d1947\" (UID: \"a5de365130e76bebbd64bed8193f592b\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:07.391471 kubelet[3080]: I1216 13:40:07.391416 3080 apiserver.go:52] "Watching apiserver" Dec 16 13:40:07.402837 kubelet[3080]: I1216 13:40:07.402687 3080 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 16 13:40:07.420251 kubelet[3080]: I1216 13:40:07.420227 3080 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:07.420428 kubelet[3080]: I1216 13:40:07.420410 3080 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:07.420685 kubelet[3080]: I1216 13:40:07.420671 3080 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:07.428136 kubelet[3080]: E1216 13:40:07.428110 3080 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-a-7f096d1947\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:07.428835 kubelet[3080]: E1216 13:40:07.428809 3080 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-a-7f096d1947\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:07.428961 kubelet[3080]: E1216 13:40:07.428939 3080 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-a-7f096d1947\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-a-7f096d1947" Dec 16 13:40:07.443302 kubelet[3080]: I1216 13:40:07.442842 3080 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-a-7f096d1947" podStartSLOduration=1.44282189 podStartE2EDuration="1.44282189s" podCreationTimestamp="2025-12-16 13:40:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:40:07.442441691 +0000 UTC m=+1.109688770" watchObservedRunningTime="2025-12-16 13:40:07.44282189 +0000 UTC m=+1.110068948" Dec 16 13:40:07.463443 kubelet[3080]: I1216 
13:40:07.463297 3080 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" podStartSLOduration=1.4632787170000001 podStartE2EDuration="1.463278717s" podCreationTimestamp="2025-12-16 13:40:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:40:07.454886312 +0000 UTC m=+1.122133414" watchObservedRunningTime="2025-12-16 13:40:07.463278717 +0000 UTC m=+1.130525775" Dec 16 13:40:07.470847 kubelet[3080]: I1216 13:40:07.470791 3080 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-a-7f096d1947" podStartSLOduration=1.470776037 podStartE2EDuration="1.470776037s" podCreationTimestamp="2025-12-16 13:40:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:40:07.463240308 +0000 UTC m=+1.130487389" watchObservedRunningTime="2025-12-16 13:40:07.470776037 +0000 UTC m=+1.138023109" Dec 16 13:40:12.703404 kubelet[3080]: I1216 13:40:12.703366 3080 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 16 13:40:12.703734 containerd[1779]: time="2025-12-16T13:40:12.703646544Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 16 13:40:12.703906 kubelet[3080]: I1216 13:40:12.703798 3080 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 16 13:40:13.755233 systemd[1]: Created slice kubepods-besteffort-pod55272e55_9d43_489d_961d_ec282f3ee217.slice - libcontainer container kubepods-besteffort-pod55272e55_9d43_489d_961d_ec282f3ee217.slice. 
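The three 13:40:12 entries above show the kubelet handing the node's pod CIDR (192.168.0.0/24) to the container runtime over the CRI, with containerd answering that it will wait for a CNI config to be dropped in (Calico supplies one later in this log). A minimal Go sketch of the same UpdateRuntimeConfig call using the published cri-api client; the socket path is the stock containerd endpoint and is an assumption, since the log only ever prints per-pod shim sockets:

    package main

    import (
        "context"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        // Assumed CRI endpoint; not printed anywhere in this log.
        conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            panic(err)
        }
        defer conn.Close()

        ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
        defer cancel()

        // Mirrors "Updating runtime config through cri with podcidr" at 13:40:12.703366.
        rt := runtimeapi.NewRuntimeServiceClient(conn)
        _, err = rt.UpdateRuntimeConfig(ctx, &runtimeapi.UpdateRuntimeConfigRequest{
            RuntimeConfig: &runtimeapi.RuntimeConfig{
                NetworkConfig: &runtimeapi.NetworkConfig{PodCidr: "192.168.0.0/24"},
            },
        })
        if err != nil {
            panic(err)
        }
    }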
Dec 16 13:40:13.848578 kubelet[3080]: I1216 13:40:13.848504 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/55272e55-9d43-489d-961d-ec282f3ee217-kube-proxy\") pod \"kube-proxy-f7xg9\" (UID: \"55272e55-9d43-489d-961d-ec282f3ee217\") " pod="kube-system/kube-proxy-f7xg9" Dec 16 13:40:13.848578 kubelet[3080]: I1216 13:40:13.848572 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/55272e55-9d43-489d-961d-ec282f3ee217-xtables-lock\") pod \"kube-proxy-f7xg9\" (UID: \"55272e55-9d43-489d-961d-ec282f3ee217\") " pod="kube-system/kube-proxy-f7xg9" Dec 16 13:40:13.848972 kubelet[3080]: I1216 13:40:13.848640 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/55272e55-9d43-489d-961d-ec282f3ee217-lib-modules\") pod \"kube-proxy-f7xg9\" (UID: \"55272e55-9d43-489d-961d-ec282f3ee217\") " pod="kube-system/kube-proxy-f7xg9" Dec 16 13:40:13.848972 kubelet[3080]: I1216 13:40:13.848675 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxq9z\" (UniqueName: \"kubernetes.io/projected/55272e55-9d43-489d-961d-ec282f3ee217-kube-api-access-nxq9z\") pod \"kube-proxy-f7xg9\" (UID: \"55272e55-9d43-489d-961d-ec282f3ee217\") " pod="kube-system/kube-proxy-f7xg9" Dec 16 13:40:13.910762 systemd[1]: Created slice kubepods-besteffort-podf9282a71_1e62_4807_9b32_a4a8ec8b3bc3.slice - libcontainer container kubepods-besteffort-podf9282a71_1e62_4807_9b32_a4a8ec8b3bc3.slice. Dec 16 13:40:13.949954 kubelet[3080]: I1216 13:40:13.949679 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f9282a71-1e62-4807-9b32-a4a8ec8b3bc3-var-lib-calico\") pod \"tigera-operator-7dcd859c48-52gxm\" (UID: \"f9282a71-1e62-4807-9b32-a4a8ec8b3bc3\") " pod="tigera-operator/tigera-operator-7dcd859c48-52gxm" Dec 16 13:40:13.949954 kubelet[3080]: I1216 13:40:13.949742 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr4lh\" (UniqueName: \"kubernetes.io/projected/f9282a71-1e62-4807-9b32-a4a8ec8b3bc3-kube-api-access-hr4lh\") pod \"tigera-operator-7dcd859c48-52gxm\" (UID: \"f9282a71-1e62-4807-9b32-a4a8ec8b3bc3\") " pod="tigera-operator/tigera-operator-7dcd859c48-52gxm" Dec 16 13:40:14.076681 containerd[1779]: time="2025-12-16T13:40:14.076529412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-f7xg9,Uid:55272e55-9d43-489d-961d-ec282f3ee217,Namespace:kube-system,Attempt:0,}" Dec 16 13:40:14.098292 containerd[1779]: time="2025-12-16T13:40:14.098226759Z" level=info msg="connecting to shim edfd61f1bbc74e19fbe48b2a64329031ed7a66530739a3d76ea20d6b3165beae" address="unix:///run/containerd/s/f7d91d90ed4dd39b68fcbf2397c50a52cf4e78f934dc05f9d9d01d4be4210ba1" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:14.123847 systemd[1]: Started cri-containerd-edfd61f1bbc74e19fbe48b2a64329031ed7a66530739a3d76ea20d6b3165beae.scope - libcontainer container edfd61f1bbc74e19fbe48b2a64329031ed7a66530739a3d76ea20d6b3165beae. 
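The "connecting to shim ... namespace=k8s.io" entry above is containerd's CRI plugin attaching over ttrpc to the shim it just launched for the kube-proxy sandbox; all pod sandboxes and containers on this node live in containerd's k8s.io namespace. A short sketch that lists that namespace with the containerd Go client, again assuming the default socket path:

    package main

    import (
        "context"
        "fmt"

        containerd "github.com/containerd/containerd"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        // Assumed default containerd socket.
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            panic(err)
        }
        defer client.Close()

        // Pod-level objects are kept in the k8s.io namespace shown in the log.
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
        containers, err := client.Containers(ctx)
        if err != nil {
            panic(err)
        }
        for _, c := range containers {
            fmt.Println(c.ID())
        }
    }

Run against this node it would print the sandbox and container IDs seen in the log, such as edfd61f1bbc74e19fbe48b2a64329031ed7a66530739a3d76ea20d6b3165beae.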
Dec 16 13:40:14.147064 containerd[1779]: time="2025-12-16T13:40:14.147021257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-f7xg9,Uid:55272e55-9d43-489d-961d-ec282f3ee217,Namespace:kube-system,Attempt:0,} returns sandbox id \"edfd61f1bbc74e19fbe48b2a64329031ed7a66530739a3d76ea20d6b3165beae\"" Dec 16 13:40:14.151782 containerd[1779]: time="2025-12-16T13:40:14.151741015Z" level=info msg="CreateContainer within sandbox \"edfd61f1bbc74e19fbe48b2a64329031ed7a66530739a3d76ea20d6b3165beae\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 16 13:40:14.164122 containerd[1779]: time="2025-12-16T13:40:14.164080271Z" level=info msg="Container e6e605f500c772f48121fc517d58bff81eab8b7aa3219d02d412130af4da2908: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:40:14.173094 containerd[1779]: time="2025-12-16T13:40:14.173030849Z" level=info msg="CreateContainer within sandbox \"edfd61f1bbc74e19fbe48b2a64329031ed7a66530739a3d76ea20d6b3165beae\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"e6e605f500c772f48121fc517d58bff81eab8b7aa3219d02d412130af4da2908\"" Dec 16 13:40:14.173608 containerd[1779]: time="2025-12-16T13:40:14.173583952Z" level=info msg="StartContainer for \"e6e605f500c772f48121fc517d58bff81eab8b7aa3219d02d412130af4da2908\"" Dec 16 13:40:14.174703 containerd[1779]: time="2025-12-16T13:40:14.174683965Z" level=info msg="connecting to shim e6e605f500c772f48121fc517d58bff81eab8b7aa3219d02d412130af4da2908" address="unix:///run/containerd/s/f7d91d90ed4dd39b68fcbf2397c50a52cf4e78f934dc05f9d9d01d4be4210ba1" protocol=ttrpc version=3 Dec 16 13:40:14.201790 systemd[1]: Started cri-containerd-e6e605f500c772f48121fc517d58bff81eab8b7aa3219d02d412130af4da2908.scope - libcontainer container e6e605f500c772f48121fc517d58bff81eab8b7aa3219d02d412130af4da2908. Dec 16 13:40:14.213371 containerd[1779]: time="2025-12-16T13:40:14.213313350Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-52gxm,Uid:f9282a71-1e62-4807-9b32-a4a8ec8b3bc3,Namespace:tigera-operator,Attempt:0,}" Dec 16 13:40:14.235727 containerd[1779]: time="2025-12-16T13:40:14.235667613Z" level=info msg="connecting to shim a1f592c89c6404b0e83a3a88cde728deaf28654ee76060a5f4613f98ed440a46" address="unix:///run/containerd/s/298593836c23173c01e2ec85ee67d4b4e73cf47a88e76a8d6fbdb722e5d8d4d0" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:14.267799 systemd[1]: Started cri-containerd-a1f592c89c6404b0e83a3a88cde728deaf28654ee76060a5f4613f98ed440a46.scope - libcontainer container a1f592c89c6404b0e83a3a88cde728deaf28654ee76060a5f4613f98ed440a46. 
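CreateContainer followed by a separate StartContainer above is the CRI's deliberate two-step contract, and it is why a second "connecting to shim" line appears for the same shim socket. In containerd's native client the same split shows up as a container object plus a task; a purely illustrative sketch of that shape (on the live node the kube-proxy container already has a running task, so NewTask would return an error here):

    package main

    import (
        "context"

        containerd "github.com/containerd/containerd"
        "github.com/containerd/containerd/cio"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            panic(err)
        }
        defer client.Close()
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

        // Load the kube-proxy container created above by its full ID from the log.
        c, err := client.LoadContainer(ctx,
            "e6e605f500c772f48121fc517d58bff81eab8b7aa3219d02d412130af4da2908")
        if err != nil {
            panic(err)
        }

        // CreateContainer ~ container object exists; StartContainer ~ task created and started.
        task, err := c.NewTask(ctx, cio.NewCreator(cio.WithStdio))
        if err != nil {
            panic(err)
        }
        if err := task.Start(ctx); err != nil {
            panic(err)
        }
    }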
Dec 16 13:40:14.289177 containerd[1779]: time="2025-12-16T13:40:14.288281042Z" level=info msg="StartContainer for \"e6e605f500c772f48121fc517d58bff81eab8b7aa3219d02d412130af4da2908\" returns successfully" Dec 16 13:40:14.316203 containerd[1779]: time="2025-12-16T13:40:14.316141547Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-52gxm,Uid:f9282a71-1e62-4807-9b32-a4a8ec8b3bc3,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"a1f592c89c6404b0e83a3a88cde728deaf28654ee76060a5f4613f98ed440a46\"" Dec 16 13:40:14.317617 containerd[1779]: time="2025-12-16T13:40:14.317564241Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 16 13:40:14.452693 kubelet[3080]: I1216 13:40:14.452624 3080 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-f7xg9" podStartSLOduration=1.452598555 podStartE2EDuration="1.452598555s" podCreationTimestamp="2025-12-16 13:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:40:14.443434717 +0000 UTC m=+8.110681773" watchObservedRunningTime="2025-12-16 13:40:14.452598555 +0000 UTC m=+8.119845629" Dec 16 13:40:17.033275 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2194601440.mount: Deactivated successfully. Dec 16 13:40:17.887955 containerd[1779]: time="2025-12-16T13:40:17.887875404Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:17.889068 containerd[1779]: time="2025-12-16T13:40:17.889032950Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691" Dec 16 13:40:17.890532 containerd[1779]: time="2025-12-16T13:40:17.890494837Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:17.893266 containerd[1779]: time="2025-12-16T13:40:17.893216997Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:17.893747 containerd[1779]: time="2025-12-16T13:40:17.893726082Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 3.576129965s" Dec 16 13:40:17.893786 containerd[1779]: time="2025-12-16T13:40:17.893753006Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Dec 16 13:40:17.897482 containerd[1779]: time="2025-12-16T13:40:17.897449548Z" level=info msg="CreateContainer within sandbox \"a1f592c89c6404b0e83a3a88cde728deaf28654ee76060a5f4613f98ed440a46\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 16 13:40:17.908140 containerd[1779]: time="2025-12-16T13:40:17.907618401Z" level=info msg="Container d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:40:17.915953 containerd[1779]: time="2025-12-16T13:40:17.915900104Z" level=info 
msg="CreateContainer within sandbox \"a1f592c89c6404b0e83a3a88cde728deaf28654ee76060a5f4613f98ed440a46\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da\"" Dec 16 13:40:17.916344 containerd[1779]: time="2025-12-16T13:40:17.916317389Z" level=info msg="StartContainer for \"d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da\"" Dec 16 13:40:17.917505 containerd[1779]: time="2025-12-16T13:40:17.917076943Z" level=info msg="connecting to shim d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da" address="unix:///run/containerd/s/298593836c23173c01e2ec85ee67d4b4e73cf47a88e76a8d6fbdb722e5d8d4d0" protocol=ttrpc version=3 Dec 16 13:40:17.941795 systemd[1]: Started cri-containerd-d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da.scope - libcontainer container d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da. Dec 16 13:40:17.971930 containerd[1779]: time="2025-12-16T13:40:17.971859966Z" level=info msg="StartContainer for \"d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da\" returns successfully" Dec 16 13:40:18.452944 kubelet[3080]: I1216 13:40:18.452838 3080 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-52gxm" podStartSLOduration=1.8757401059999999 podStartE2EDuration="5.452811043s" podCreationTimestamp="2025-12-16 13:40:13 +0000 UTC" firstStartedPulling="2025-12-16 13:40:14.317277987 +0000 UTC m=+7.984525044" lastFinishedPulling="2025-12-16 13:40:17.894348917 +0000 UTC m=+11.561595981" observedRunningTime="2025-12-16 13:40:18.45280294 +0000 UTC m=+12.120050019" watchObservedRunningTime="2025-12-16 13:40:18.452811043 +0000 UTC m=+12.120058120" Dec 16 13:40:19.701514 systemd[1]: cri-containerd-d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da.scope: Deactivated successfully. Dec 16 13:40:19.703220 containerd[1779]: time="2025-12-16T13:40:19.703190401Z" level=info msg="received container exit event container_id:\"d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da\" id:\"d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da\" pid:3420 exit_status:1 exited_at:{seconds:1765892419 nanos:702822280}" Dec 16 13:40:19.725196 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da-rootfs.mount: Deactivated successfully. 
Dec 16 13:40:20.446204 kubelet[3080]: I1216 13:40:20.446159 3080 scope.go:117] "RemoveContainer" containerID="d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da" Dec 16 13:40:20.448617 containerd[1779]: time="2025-12-16T13:40:20.448307132Z" level=info msg="CreateContainer within sandbox \"a1f592c89c6404b0e83a3a88cde728deaf28654ee76060a5f4613f98ed440a46\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Dec 16 13:40:20.460311 containerd[1779]: time="2025-12-16T13:40:20.460248676Z" level=info msg="Container 3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:40:20.467504 containerd[1779]: time="2025-12-16T13:40:20.467459356Z" level=info msg="CreateContainer within sandbox \"a1f592c89c6404b0e83a3a88cde728deaf28654ee76060a5f4613f98ed440a46\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b\"" Dec 16 13:40:20.468325 containerd[1779]: time="2025-12-16T13:40:20.468302649Z" level=info msg="StartContainer for \"3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b\"" Dec 16 13:40:20.469040 containerd[1779]: time="2025-12-16T13:40:20.469014754Z" level=info msg="connecting to shim 3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b" address="unix:///run/containerd/s/298593836c23173c01e2ec85ee67d4b4e73cf47a88e76a8d6fbdb722e5d8d4d0" protocol=ttrpc version=3 Dec 16 13:40:20.494764 systemd[1]: Started cri-containerd-3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b.scope - libcontainer container 3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b. Dec 16 13:40:20.519216 containerd[1779]: time="2025-12-16T13:40:20.519177470Z" level=info msg="StartContainer for \"3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b\" returns successfully" Dec 16 13:40:22.640600 sudo[2066]: pam_unix(sudo:session): session closed for user root Dec 16 13:40:22.795079 sshd[2065]: Connection closed by 147.75.109.163 port 54794 Dec 16 13:40:22.795606 sshd-session[2062]: pam_unix(sshd:session): session closed for user core Dec 16 13:40:22.798959 systemd[1]: sshd@8-10.0.21.93:22-147.75.109.163:54794.service: Deactivated successfully. Dec 16 13:40:22.800604 systemd[1]: session-9.scope: Deactivated successfully. Dec 16 13:40:22.800760 systemd[1]: session-9.scope: Consumed 5.066s CPU time, 240.5M memory peak. Dec 16 13:40:22.801598 systemd-logind[1758]: Session 9 logged out. Waiting for processes to exit. Dec 16 13:40:22.802475 systemd-logind[1758]: Removed session 9. Dec 16 13:40:27.792202 systemd[1]: Created slice kubepods-besteffort-podfe31b04e_5762_4886_8a0d_2891e0d925de.slice - libcontainer container kubepods-besteffort-podfe31b04e_5762_4886_8a0d_2891e0d925de.slice. 
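RemoveContainer plus a fresh Attempt:1 above is the kubelet's restart path for the crashed operator container; had it kept exiting, further restarts would be spaced out exponentially. A toy sketch of that schedule, where the 10s initial delay, doubling factor, and 5m cap are kubelet defaults rather than values printed in this log:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Kubelet-style CrashLoopBackOff schedule: double from 10s, cap at 5m.
        backoff := 10 * time.Second
        for attempt := 1; attempt <= 6; attempt++ {
            fmt.Printf("attempt %d: wait %v before restart\n", attempt, backoff)
            backoff *= 2
            if backoff > 5*time.Minute {
                backoff = 5 * time.Minute
            }
        }
    }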
Dec 16 13:40:27.828823 kubelet[3080]: I1216 13:40:27.828775 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe31b04e-5762-4886-8a0d-2891e0d925de-tigera-ca-bundle\") pod \"calico-typha-69d9dd669-ck6rh\" (UID: \"fe31b04e-5762-4886-8a0d-2891e0d925de\") " pod="calico-system/calico-typha-69d9dd669-ck6rh" Dec 16 13:40:27.828823 kubelet[3080]: I1216 13:40:27.828816 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/fe31b04e-5762-4886-8a0d-2891e0d925de-typha-certs\") pod \"calico-typha-69d9dd669-ck6rh\" (UID: \"fe31b04e-5762-4886-8a0d-2891e0d925de\") " pod="calico-system/calico-typha-69d9dd669-ck6rh" Dec 16 13:40:27.829191 kubelet[3080]: I1216 13:40:27.828834 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhbcz\" (UniqueName: \"kubernetes.io/projected/fe31b04e-5762-4886-8a0d-2891e0d925de-kube-api-access-qhbcz\") pod \"calico-typha-69d9dd669-ck6rh\" (UID: \"fe31b04e-5762-4886-8a0d-2891e0d925de\") " pod="calico-system/calico-typha-69d9dd669-ck6rh" Dec 16 13:40:27.921429 systemd[1]: Created slice kubepods-besteffort-podf10d75dd_ab6f_4506_8ddc_e2aebecea5b5.slice - libcontainer container kubepods-besteffort-podf10d75dd_ab6f_4506_8ddc_e2aebecea5b5.slice. Dec 16 13:40:27.929056 kubelet[3080]: I1216 13:40:27.928967 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cjt7\" (UniqueName: \"kubernetes.io/projected/f10d75dd-ab6f-4506-8ddc-e2aebecea5b5-kube-api-access-9cjt7\") pod \"calico-node-tgbz2\" (UID: \"f10d75dd-ab6f-4506-8ddc-e2aebecea5b5\") " pod="calico-system/calico-node-tgbz2" Dec 16 13:40:27.929056 kubelet[3080]: I1216 13:40:27.929011 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/f10d75dd-ab6f-4506-8ddc-e2aebecea5b5-flexvol-driver-host\") pod \"calico-node-tgbz2\" (UID: \"f10d75dd-ab6f-4506-8ddc-e2aebecea5b5\") " pod="calico-system/calico-node-tgbz2" Dec 16 13:40:27.929056 kubelet[3080]: I1216 13:40:27.929026 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/f10d75dd-ab6f-4506-8ddc-e2aebecea5b5-policysync\") pod \"calico-node-tgbz2\" (UID: \"f10d75dd-ab6f-4506-8ddc-e2aebecea5b5\") " pod="calico-system/calico-node-tgbz2" Dec 16 13:40:27.929860 kubelet[3080]: I1216 13:40:27.929070 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f10d75dd-ab6f-4506-8ddc-e2aebecea5b5-var-lib-calico\") pod \"calico-node-tgbz2\" (UID: \"f10d75dd-ab6f-4506-8ddc-e2aebecea5b5\") " pod="calico-system/calico-node-tgbz2" Dec 16 13:40:27.929860 kubelet[3080]: I1216 13:40:27.929466 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/f10d75dd-ab6f-4506-8ddc-e2aebecea5b5-cni-log-dir\") pod \"calico-node-tgbz2\" (UID: \"f10d75dd-ab6f-4506-8ddc-e2aebecea5b5\") " pod="calico-system/calico-node-tgbz2" Dec 16 13:40:27.929860 kubelet[3080]: I1216 13:40:27.929543 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f10d75dd-ab6f-4506-8ddc-e2aebecea5b5-lib-modules\") pod \"calico-node-tgbz2\" (UID: \"f10d75dd-ab6f-4506-8ddc-e2aebecea5b5\") " pod="calico-system/calico-node-tgbz2" Dec 16 13:40:27.929860 kubelet[3080]: I1216 13:40:27.929567 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f10d75dd-ab6f-4506-8ddc-e2aebecea5b5-tigera-ca-bundle\") pod \"calico-node-tgbz2\" (UID: \"f10d75dd-ab6f-4506-8ddc-e2aebecea5b5\") " pod="calico-system/calico-node-tgbz2" Dec 16 13:40:27.929860 kubelet[3080]: I1216 13:40:27.929642 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/f10d75dd-ab6f-4506-8ddc-e2aebecea5b5-cni-bin-dir\") pod \"calico-node-tgbz2\" (UID: \"f10d75dd-ab6f-4506-8ddc-e2aebecea5b5\") " pod="calico-system/calico-node-tgbz2" Dec 16 13:40:27.930448 kubelet[3080]: I1216 13:40:27.929674 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/f10d75dd-ab6f-4506-8ddc-e2aebecea5b5-cni-net-dir\") pod \"calico-node-tgbz2\" (UID: \"f10d75dd-ab6f-4506-8ddc-e2aebecea5b5\") " pod="calico-system/calico-node-tgbz2" Dec 16 13:40:27.930448 kubelet[3080]: I1216 13:40:27.929690 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/f10d75dd-ab6f-4506-8ddc-e2aebecea5b5-var-run-calico\") pod \"calico-node-tgbz2\" (UID: \"f10d75dd-ab6f-4506-8ddc-e2aebecea5b5\") " pod="calico-system/calico-node-tgbz2" Dec 16 13:40:27.930448 kubelet[3080]: I1216 13:40:27.929733 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f10d75dd-ab6f-4506-8ddc-e2aebecea5b5-xtables-lock\") pod \"calico-node-tgbz2\" (UID: \"f10d75dd-ab6f-4506-8ddc-e2aebecea5b5\") " pod="calico-system/calico-node-tgbz2" Dec 16 13:40:27.930448 kubelet[3080]: I1216 13:40:27.929750 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/f10d75dd-ab6f-4506-8ddc-e2aebecea5b5-node-certs\") pod \"calico-node-tgbz2\" (UID: \"f10d75dd-ab6f-4506-8ddc-e2aebecea5b5\") " pod="calico-system/calico-node-tgbz2" Dec 16 13:40:28.031234 kubelet[3080]: E1216 13:40:28.031198 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.031234 kubelet[3080]: W1216 13:40:28.031221 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.031419 kubelet[3080]: E1216 13:40:28.031252 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:40:28.033271 kubelet[3080]: E1216 13:40:28.033245 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.033271 kubelet[3080]: W1216 13:40:28.033280 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.033271 kubelet[3080]: E1216 13:40:28.033298 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.042358 kubelet[3080]: E1216 13:40:28.042250 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.042358 kubelet[3080]: W1216 13:40:28.042293 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.042358 kubelet[3080]: E1216 13:40:28.042311 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.070527 kubelet[3080]: E1216 13:40:28.070314 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:40:28.097894 containerd[1779]: time="2025-12-16T13:40:28.096938257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-69d9dd669-ck6rh,Uid:fe31b04e-5762-4886-8a0d-2891e0d925de,Namespace:calico-system,Attempt:0,}" Dec 16 13:40:28.120866 containerd[1779]: time="2025-12-16T13:40:28.120805779Z" level=info msg="connecting to shim 961626ce4ffde7f7b7f5647a9de3ab5d709d4dad3df216691e14856e8191ce77" address="unix:///run/containerd/s/951e37ff36a1ff70efaed3cec8baf688146ca825b6178e208fe7a03f70e10f74" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:28.121363 kubelet[3080]: E1216 13:40:28.121256 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.121363 kubelet[3080]: W1216 13:40:28.121276 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.121363 kubelet[3080]: E1216 13:40:28.121295 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:40:28.130780 kubelet[3080]: E1216 13:40:28.130735 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.130780 kubelet[3080]: W1216 13:40:28.130757 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.130780 kubelet[3080]: E1216 13:40:28.130774 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
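Each E/W/E triplet in the 13:40:28 burst above is one failed FlexVolume probe: the kubelet finds the nodeagent~uds directory under its plugin path, execs the expected uds driver with "init", the binary does not exist yet (the flexvol-driver-host mount on calico-node above is where Calico later installs it), so the call produces no output and the empty string fails to parse as the driver's JSON status, hence "unexpected end of JSON input". A small Go repro of that call-and-parse sequence; the paths are copied from the log, while the DriverStatus shape is an assumption based on the FlexVolume convention:

    package main

    import (
        "encoding/json"
        "fmt"
        "os/exec"
    )

    // Assumed minimal shape of a FlexVolume driver reply; not taken from this log.
    type DriverStatus struct {
        Status  string `json:"status"`
        Message string `json:"message,omitempty"`
    }

    func main() {
        driver := "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"

        // With the driver binary absent the exec fails and out stays empty;
        // the kubelet reports this as the W1216 driver-call.go:149 line.
        out, err := exec.Command(driver, "init").CombinedOutput()
        if err != nil {
            fmt.Println("driver call failed:", err)
        }

        // Unmarshalling empty output yields "unexpected end of JSON input",
        // matching the E1216 driver-call.go:262 line.
        var st DriverStatus
        if err := json.Unmarshal(out, &st); err != nil {
            fmt.Println("failed to unmarshal output:", err)
        }
    }

The burst is therefore noise from repeated plugin re-probing rather than a volume failure, and it stops once the driver binary appears.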
Dec 16 13:40:28.130921 kubelet[3080]: I1216 13:40:28.130798 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/713c1552-24ab-4b60-9872-de4f52adb23b-registration-dir\") pod \"csi-node-driver-wzss6\" (UID: \"713c1552-24ab-4b60-9872-de4f52adb23b\") " pod="calico-system/csi-node-driver-wzss6" Dec 16 13:40:28.130992 kubelet[3080]: E1216 13:40:28.130974 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.130992 kubelet[3080]: W1216 13:40:28.130985 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.130992 kubelet[3080]: E1216 13:40:28.130991 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.131055 kubelet[3080]: I1216 13:40:28.131008 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2425\" (UniqueName: \"kubernetes.io/projected/713c1552-24ab-4b60-9872-de4f52adb23b-kube-api-access-t2425\") pod \"csi-node-driver-wzss6\" (UID: \"713c1552-24ab-4b60-9872-de4f52adb23b\") " pod="calico-system/csi-node-driver-wzss6" Dec 16 13:40:28.131231 kubelet[3080]: E1216 13:40:28.131214 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.131261 kubelet[3080]: W1216 13:40:28.131230 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.131261 kubelet[3080]: E1216 13:40:28.131244 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.131464 kubelet[3080]: E1216 13:40:28.131407 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.131464 kubelet[3080]: W1216 13:40:28.131419 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.131464 kubelet[3080]: E1216 13:40:28.131432 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.131632 kubelet[3080]: E1216 13:40:28.131622 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.131632 kubelet[3080]: W1216 13:40:28.131631 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.131679 kubelet[3080]: E1216 13:40:28.131638 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Dec 16 13:40:28.131679 kubelet[3080]: I1216 13:40:28.131667 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/713c1552-24ab-4b60-9872-de4f52adb23b-socket-dir\") pod \"csi-node-driver-wzss6\" (UID: \"713c1552-24ab-4b60-9872-de4f52adb23b\") " pod="calico-system/csi-node-driver-wzss6" Dec 16 13:40:28.131853 kubelet[3080]: E1216 13:40:28.131838 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.131853 kubelet[3080]: W1216 13:40:28.131847 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.131898 kubelet[3080]: E1216 13:40:28.131854 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.131898 kubelet[3080]: I1216 13:40:28.131887 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/713c1552-24ab-4b60-9872-de4f52adb23b-kubelet-dir\") pod \"csi-node-driver-wzss6\" (UID: \"713c1552-24ab-4b60-9872-de4f52adb23b\") " pod="calico-system/csi-node-driver-wzss6" Dec 16 13:40:28.132021 kubelet[3080]: E1216 13:40:28.132010 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.132042 kubelet[3080]: W1216 13:40:28.132022 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.132042 kubelet[3080]: E1216 13:40:28.132030 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.132168 kubelet[3080]: E1216 13:40:28.132160 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.132168 kubelet[3080]: W1216 13:40:28.132167 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.132209 kubelet[3080]: E1216 13:40:28.132173 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.132313 kubelet[3080]: E1216 13:40:28.132305 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.132313 kubelet[3080]: W1216 13:40:28.132312 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.132358 kubelet[3080]: E1216 13:40:28.132318 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:40:28.132444 kubelet[3080]: E1216 13:40:28.132437 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.132444 kubelet[3080]: W1216 13:40:28.132444 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.132483 kubelet[3080]: E1216 13:40:28.132450 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.132600 kubelet[3080]: E1216 13:40:28.132592 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.132600 kubelet[3080]: W1216 13:40:28.132600 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.132641 kubelet[3080]: E1216 13:40:28.132606 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.132641 kubelet[3080]: I1216 13:40:28.132627 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/713c1552-24ab-4b60-9872-de4f52adb23b-varrun\") pod \"csi-node-driver-wzss6\" (UID: \"713c1552-24ab-4b60-9872-de4f52adb23b\") " pod="calico-system/csi-node-driver-wzss6" Dec 16 13:40:28.133130 kubelet[3080]: E1216 13:40:28.132813 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.133130 kubelet[3080]: W1216 13:40:28.132828 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.133130 kubelet[3080]: E1216 13:40:28.132840 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.133130 kubelet[3080]: E1216 13:40:28.132974 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.133130 kubelet[3080]: W1216 13:40:28.132980 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.133130 kubelet[3080]: E1216 13:40:28.132986 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:40:28.133130 kubelet[3080]: E1216 13:40:28.133126 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.133130 kubelet[3080]: W1216 13:40:28.133132 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.133130 kubelet[3080]: E1216 13:40:28.133137 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.133337 kubelet[3080]: E1216 13:40:28.133260 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.133337 kubelet[3080]: W1216 13:40:28.133265 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.133337 kubelet[3080]: E1216 13:40:28.133271 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.154791 systemd[1]: Started cri-containerd-961626ce4ffde7f7b7f5647a9de3ab5d709d4dad3df216691e14856e8191ce77.scope - libcontainer container 961626ce4ffde7f7b7f5647a9de3ab5d709d4dad3df216691e14856e8191ce77. Dec 16 13:40:28.195452 containerd[1779]: time="2025-12-16T13:40:28.195400989Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-69d9dd669-ck6rh,Uid:fe31b04e-5762-4886-8a0d-2891e0d925de,Namespace:calico-system,Attempt:0,} returns sandbox id \"961626ce4ffde7f7b7f5647a9de3ab5d709d4dad3df216691e14856e8191ce77\"" Dec 16 13:40:28.196658 containerd[1779]: time="2025-12-16T13:40:28.196627569Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 16 13:40:28.226592 containerd[1779]: time="2025-12-16T13:40:28.226475678Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-tgbz2,Uid:f10d75dd-ab6f-4506-8ddc-e2aebecea5b5,Namespace:calico-system,Attempt:0,}" Dec 16 13:40:28.235510 kubelet[3080]: E1216 13:40:28.235465 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.235510 kubelet[3080]: W1216 13:40:28.235499 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.235668 kubelet[3080]: E1216 13:40:28.235531 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:40:28.239590 kubelet[3080]: E1216 13:40:28.239581 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.239590 kubelet[3080]: W1216 13:40:28.239589 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.239640 kubelet[3080]: E1216 13:40:28.239596 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Dec 16 13:40:28.239847 kubelet[3080]: E1216 13:40:28.239828 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.239873 kubelet[3080]: W1216 13:40:28.239844 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.239873 kubelet[3080]: E1216 13:40:28.239857 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.240035 kubelet[3080]: E1216 13:40:28.240026 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.240035 kubelet[3080]: W1216 13:40:28.240035 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.240073 kubelet[3080]: E1216 13:40:28.240043 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.240249 kubelet[3080]: E1216 13:40:28.240240 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.240272 kubelet[3080]: W1216 13:40:28.240249 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.240272 kubelet[3080]: E1216 13:40:28.240256 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.240424 kubelet[3080]: E1216 13:40:28.240415 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.240424 kubelet[3080]: W1216 13:40:28.240424 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.240468 kubelet[3080]: E1216 13:40:28.240430 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:40:28.246372 kubelet[3080]: E1216 13:40:28.246337 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:28.246372 kubelet[3080]: W1216 13:40:28.246363 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:28.246513 kubelet[3080]: E1216 13:40:28.246410 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
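The two failures above always arrive as a pair: on each plugin probe, kubelet execs the FlexVolume driver binary under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/ with the init argument and JSON-decodes whatever lands on stdout. Because the uds binary does not exist yet, the exec fails, stdout stays empty, and decoding the empty output yields encoding/json's "unexpected end of JSON input". A minimal standalone Go sketch reproducing both stdlib error strings (an illustration, not kubelet's actual driver-call.go; it assumes a bare command name "uds" that is absent from $PATH):

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus mirrors only the shape of a FlexVolume reply needed here;
// the field set is illustrative, not the full upstream struct.
type driverStatus struct {
	Status string `json:"status"`
}

func main() {
	// A bare name not on $PATH makes exec fail with exec.ErrNotFound,
	// i.e. "executable file not found in $PATH", and leaves out empty.
	out, err := exec.Command("uds", "init").Output()
	if err != nil {
		fmt.Println("driver call failed:", err)
	}

	var st driverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		// Unmarshal of empty input prints: unexpected end of JSON input
		fmt.Println("failed to unmarshal output:", err)
	}
}

Both printed errors match the pair logged by kubelet, which is why the messages repeat in lockstep until the driver binary appears.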
Dec 16 13:40:28.256065 containerd[1779]: time="2025-12-16T13:40:28.256023555Z" level=info msg="connecting to shim 966c8a7be74c14f6cf939cc4d2daa54cbbe8cf0f78cb4c059bfd6149c31cfeef" address="unix:///run/containerd/s/f86a0b195f8d78164b7683fe688c4a2f3b017a716e22c5831fb9b4086eebaea5" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:28.293844 systemd[1]: Started cri-containerd-966c8a7be74c14f6cf939cc4d2daa54cbbe8cf0f78cb4c059bfd6149c31cfeef.scope - libcontainer container 966c8a7be74c14f6cf939cc4d2daa54cbbe8cf0f78cb4c059bfd6149c31cfeef. Dec 16 13:40:28.318941 containerd[1779]: time="2025-12-16T13:40:28.318880767Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-tgbz2,Uid:f10d75dd-ab6f-4506-8ddc-e2aebecea5b5,Namespace:calico-system,Attempt:0,} returns sandbox id \"966c8a7be74c14f6cf939cc4d2daa54cbbe8cf0f78cb4c059bfd6149c31cfeef\"" Dec 16 13:40:29.414169 kubelet[3080]: E1216 13:40:29.414099 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:40:29.716027 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1605854708.mount: Deactivated successfully. Dec 16 13:40:30.586875 containerd[1779]: time="2025-12-16T13:40:30.586807614Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:30.588009 containerd[1779]: time="2025-12-16T13:40:30.587972081Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628" Dec 16 13:40:30.589594 containerd[1779]: time="2025-12-16T13:40:30.589558708Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:30.592380 containerd[1779]: time="2025-12-16T13:40:30.592341109Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:30.592823 containerd[1779]: time="2025-12-16T13:40:30.592799742Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.396137247s" Dec 16 13:40:30.592856 containerd[1779]: time="2025-12-16T13:40:30.592824760Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Dec 16 13:40:30.593595 containerd[1779]: time="2025-12-16T13:40:30.593567712Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 16 13:40:30.603879 containerd[1779]: time="2025-12-16T13:40:30.603832951Z" level=info msg="CreateContainer within sandbox \"961626ce4ffde7f7b7f5647a9de3ab5d709d4dad3df216691e14856e8191ce77\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 16 13:40:30.612600 containerd[1779]: time="2025-12-16T13:40:30.612526858Z"
level=info msg="Container edd6ff8e02bba7b3bf627ab1264fc4294d8424fb1abf8dde7cc11395e1ee92f5: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:40:30.622110 containerd[1779]: time="2025-12-16T13:40:30.622064670Z" level=info msg="CreateContainer within sandbox \"961626ce4ffde7f7b7f5647a9de3ab5d709d4dad3df216691e14856e8191ce77\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"edd6ff8e02bba7b3bf627ab1264fc4294d8424fb1abf8dde7cc11395e1ee92f5\"" Dec 16 13:40:30.622709 containerd[1779]: time="2025-12-16T13:40:30.622602140Z" level=info msg="StartContainer for \"edd6ff8e02bba7b3bf627ab1264fc4294d8424fb1abf8dde7cc11395e1ee92f5\"" Dec 16 13:40:30.623741 containerd[1779]: time="2025-12-16T13:40:30.623678049Z" level=info msg="connecting to shim edd6ff8e02bba7b3bf627ab1264fc4294d8424fb1abf8dde7cc11395e1ee92f5" address="unix:///run/containerd/s/951e37ff36a1ff70efaed3cec8baf688146ca825b6178e208fe7a03f70e10f74" protocol=ttrpc version=3 Dec 16 13:40:30.648766 systemd[1]: Started cri-containerd-edd6ff8e02bba7b3bf627ab1264fc4294d8424fb1abf8dde7cc11395e1ee92f5.scope - libcontainer container edd6ff8e02bba7b3bf627ab1264fc4294d8424fb1abf8dde7cc11395e1ee92f5. Dec 16 13:40:30.695118 containerd[1779]: time="2025-12-16T13:40:30.695066119Z" level=info msg="StartContainer for \"edd6ff8e02bba7b3bf627ab1264fc4294d8424fb1abf8dde7cc11395e1ee92f5\" returns successfully" Dec 16 13:40:31.414083 kubelet[3080]: E1216 13:40:31.414028 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:40:31.482073 kubelet[3080]: I1216 13:40:31.482022 3080 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-69d9dd669-ck6rh" podStartSLOduration=2.084983147 podStartE2EDuration="4.482009469s" podCreationTimestamp="2025-12-16 13:40:27 +0000 UTC" firstStartedPulling="2025-12-16 13:40:28.196426291 +0000 UTC m=+21.863673348" lastFinishedPulling="2025-12-16 13:40:30.593452613 +0000 UTC m=+24.260699670" observedRunningTime="2025-12-16 13:40:31.481962595 +0000 UTC m=+25.149209695" watchObservedRunningTime="2025-12-16 13:40:31.482009469 +0000 UTC m=+25.149256549" Dec 16 13:40:31.544157 kubelet[3080]: E1216 13:40:31.544103 3080 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:40:31.544157 kubelet[3080]: W1216 13:40:31.544132 3080 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:40:31.544157 kubelet[3080]: E1216 13:40:31.544153 3080 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" [the same FlexVolume probe error triple repeats 32 more times between Dec 16 13:40:31.544346 and Dec 16 13:40:31.563528] Dec 16 13:40:32.241736 containerd[1779]: time="2025-12-16T13:40:32.241664122Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:32.242894 containerd[1779]: time="2025-12-16T13:40:32.242871547Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754" Dec 16 13:40:32.244334 containerd[1779]: time="2025-12-16T13:40:32.244285449Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:32.247083 containerd[1779]: time="2025-12-16T13:40:32.246717536Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:32.247216 containerd[1779]: time="2025-12-16T13:40:32.247192642Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.653597079s" Dec 16 13:40:32.247359 containerd[1779]: time="2025-12-16T13:40:32.247268325Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Dec 16 13:40:32.251205 containerd[1779]: time="2025-12-16T13:40:32.251172838Z" level=info msg="CreateContainer within sandbox \"966c8a7be74c14f6cf939cc4d2daa54cbbe8cf0f78cb4c059bfd6149c31cfeef\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 13:40:32.260753 containerd[1779]: time="2025-12-16T13:40:32.260707072Z" level=info msg="Container d3e76ded9d5a74adc5d0894755fa827a06664b1222381b8ceaf422482314c602: CDI devices from CRI Config.CDIDevices: []"
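The flexvol-driver container being created above runs Calico's pod2daemon-flexvol image, whose job is to install the missing uds driver into the nodeagent~uds directory kubelet has been probing, which is why the probe errors stop shortly after it runs. By the FlexVolume convention a driver answers init with a one-line JSON status on stdout, where capabilities.attach=false tells kubelet no attach/detach controller calls are needed. A minimal Go sketch of a driver that satisfies just that probe (a hypothetical illustration of the convention; the real Calico driver implements more verbs than this):

package main

import (
	"encoding/json"
	"os"
)

// initResponse follows the FlexVolume init reply shape:
// {"status":"Success","capabilities":{"attach":false}}
type initResponse struct {
	Status       string `json:"status"`
	Capabilities struct {
		Attach bool `json:"attach"`
	} `json:"capabilities"`
}

func main() {
	if len(os.Args) > 1 && os.Args[1] == "init" {
		var r initResponse
		r.Status = "Success" // Attach defaults to false: node-local volume
		json.NewEncoder(os.Stdout).Encode(r)
		return
	}
	// Other FlexVolume verbs (mount, unmount, ...) are omitted from this sketch.
	json.NewEncoder(os.Stdout).Encode(map[string]string{"status": "Not supported"})
}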
Dec 16 13:40:32.270701 containerd[1779]: time="2025-12-16T13:40:32.270665627Z" level=info msg="CreateContainer within sandbox \"966c8a7be74c14f6cf939cc4d2daa54cbbe8cf0f78cb4c059bfd6149c31cfeef\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d3e76ded9d5a74adc5d0894755fa827a06664b1222381b8ceaf422482314c602\"" Dec 16 13:40:32.271260 containerd[1779]: time="2025-12-16T13:40:32.271227308Z" level=info msg="StartContainer for \"d3e76ded9d5a74adc5d0894755fa827a06664b1222381b8ceaf422482314c602\"" Dec 16 13:40:32.272523 containerd[1779]: time="2025-12-16T13:40:32.272487737Z" level=info msg="connecting to shim d3e76ded9d5a74adc5d0894755fa827a06664b1222381b8ceaf422482314c602" address="unix:///run/containerd/s/f86a0b195f8d78164b7683fe688c4a2f3b017a716e22c5831fb9b4086eebaea5" protocol=ttrpc version=3 Dec 16 13:40:32.299948 systemd[1]: Started cri-containerd-d3e76ded9d5a74adc5d0894755fa827a06664b1222381b8ceaf422482314c602.scope - libcontainer container d3e76ded9d5a74adc5d0894755fa827a06664b1222381b8ceaf422482314c602. Dec 16 13:40:32.385234 containerd[1779]: time="2025-12-16T13:40:32.385190318Z" level=info msg="StartContainer for \"d3e76ded9d5a74adc5d0894755fa827a06664b1222381b8ceaf422482314c602\" returns successfully" Dec 16 13:40:32.391232 systemd[1]: cri-containerd-d3e76ded9d5a74adc5d0894755fa827a06664b1222381b8ceaf422482314c602.scope: Deactivated successfully. Dec 16 13:40:32.393876 containerd[1779]: time="2025-12-16T13:40:32.393844488Z" level=info msg="received container exit event container_id:\"d3e76ded9d5a74adc5d0894755fa827a06664b1222381b8ceaf422482314c602\" id:\"d3e76ded9d5a74adc5d0894755fa827a06664b1222381b8ceaf422482314c602\" pid:3863 exited_at:{seconds:1765892432 nanos:393612864}" Dec 16 13:40:32.412750 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d3e76ded9d5a74adc5d0894755fa827a06664b1222381b8ceaf422482314c602-rootfs.mount: Deactivated successfully.
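The exit event above carries a protobuf-style epoch pair rather than a formatted time; decoding it with Go's time.Unix confirms it matches the surrounding journal timestamps:

package main

import (
	"fmt"
	"time"
)

func main() {
	// exited_at:{seconds:1765892432 nanos:393612864} from the event above
	exitedAt := time.Unix(1765892432, 393612864).UTC()
	fmt.Println(exitedAt.Format(time.RFC3339Nano))
	// prints 2025-12-16T13:40:32.393612864Z, agreeing with the
	// Dec 16 13:40:32.393876 journal stamp on the same entry.
}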
Dec 16 13:40:32.475740 containerd[1779]: time="2025-12-16T13:40:32.475706932Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 16 13:40:33.414358 kubelet[3080]: E1216 13:40:33.414300 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:40:35.413820 kubelet[3080]: E1216 13:40:35.413779 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:40:35.760776 containerd[1779]: time="2025-12-16T13:40:35.760669991Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:35.761825 containerd[1779]: time="2025-12-16T13:40:35.761788263Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859" Dec 16 13:40:35.763320 containerd[1779]: time="2025-12-16T13:40:35.763291926Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:35.765232 containerd[1779]: time="2025-12-16T13:40:35.765208404Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:35.765732 containerd[1779]: time="2025-12-16T13:40:35.765710291Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 3.289968645s" Dec 16 13:40:35.765762 containerd[1779]: time="2025-12-16T13:40:35.765737980Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Dec 16 13:40:35.769220 containerd[1779]: time="2025-12-16T13:40:35.769188331Z" level=info msg="CreateContainer within sandbox \"966c8a7be74c14f6cf939cc4d2daa54cbbe8cf0f78cb4c059bfd6149c31cfeef\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 16 13:40:35.779614 containerd[1779]: time="2025-12-16T13:40:35.779541490Z" level=info msg="Container 9bc408590bf0f500ea271e2b87c062c73e9b469ea2d0fd7dbaa2b6092d283582: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:40:35.790576 containerd[1779]: time="2025-12-16T13:40:35.789815091Z" level=info msg="CreateContainer within sandbox \"966c8a7be74c14f6cf939cc4d2daa54cbbe8cf0f78cb4c059bfd6149c31cfeef\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9bc408590bf0f500ea271e2b87c062c73e9b469ea2d0fd7dbaa2b6092d283582\"" Dec 16 13:40:35.790874 containerd[1779]: time="2025-12-16T13:40:35.790676467Z" level=info msg="StartContainer for \"9bc408590bf0f500ea271e2b87c062c73e9b469ea2d0fd7dbaa2b6092d283582\"" Dec 16 13:40:35.793467 
containerd[1779]: time="2025-12-16T13:40:35.793436144Z" level=info msg="connecting to shim 9bc408590bf0f500ea271e2b87c062c73e9b469ea2d0fd7dbaa2b6092d283582" address="unix:///run/containerd/s/f86a0b195f8d78164b7683fe688c4a2f3b017a716e22c5831fb9b4086eebaea5" protocol=ttrpc version=3 Dec 16 13:40:35.818730 systemd[1]: Started cri-containerd-9bc408590bf0f500ea271e2b87c062c73e9b469ea2d0fd7dbaa2b6092d283582.scope - libcontainer container 9bc408590bf0f500ea271e2b87c062c73e9b469ea2d0fd7dbaa2b6092d283582. Dec 16 13:40:35.902924 containerd[1779]: time="2025-12-16T13:40:35.902870656Z" level=info msg="StartContainer for \"9bc408590bf0f500ea271e2b87c062c73e9b469ea2d0fd7dbaa2b6092d283582\" returns successfully" Dec 16 13:40:36.280348 systemd[1]: cri-containerd-9bc408590bf0f500ea271e2b87c062c73e9b469ea2d0fd7dbaa2b6092d283582.scope: Deactivated successfully. Dec 16 13:40:36.280604 systemd[1]: cri-containerd-9bc408590bf0f500ea271e2b87c062c73e9b469ea2d0fd7dbaa2b6092d283582.scope: Consumed 488ms CPU time, 192.6M memory peak, 171.3M written to disk. Dec 16 13:40:36.281255 containerd[1779]: time="2025-12-16T13:40:36.281226407Z" level=info msg="received container exit event container_id:\"9bc408590bf0f500ea271e2b87c062c73e9b469ea2d0fd7dbaa2b6092d283582\" id:\"9bc408590bf0f500ea271e2b87c062c73e9b469ea2d0fd7dbaa2b6092d283582\" pid:3925 exited_at:{seconds:1765892436 nanos:281046606}" Dec 16 13:40:36.300743 kubelet[3080]: I1216 13:40:36.299992 3080 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Dec 16 13:40:36.300297 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9bc408590bf0f500ea271e2b87c062c73e9b469ea2d0fd7dbaa2b6092d283582-rootfs.mount: Deactivated successfully. Dec 16 13:40:36.385795 systemd[1]: Created slice kubepods-burstable-pod94bfd095_3b1b_4404_872c_700cc2c230a9.slice - libcontainer container kubepods-burstable-pod94bfd095_3b1b_4404_872c_700cc2c230a9.slice. Dec 16 13:40:36.394531 systemd[1]: Created slice kubepods-besteffort-pod4477497f_a609_4791_b674_25df11e8ec73.slice - libcontainer container kubepods-besteffort-pod4477497f_a609_4791_b674_25df11e8ec73.slice. Dec 16 13:40:36.399613 systemd[1]: Created slice kubepods-besteffort-pod1fd68d8f_7c37_4d7e_a053_da6704f9d286.slice - libcontainer container kubepods-besteffort-pod1fd68d8f_7c37_4d7e_a053_da6704f9d286.slice. Dec 16 13:40:36.405464 systemd[1]: Created slice kubepods-burstable-pod22edb7e4_dff8_4395_80d0_f36295b5be55.slice - libcontainer container kubepods-burstable-pod22edb7e4_dff8_4395_80d0_f36295b5be55.slice. Dec 16 13:40:36.411929 systemd[1]: Created slice kubepods-besteffort-podbc92ca59_7a99_4554_9e6f_b60493200978.slice - libcontainer container kubepods-besteffort-podbc92ca59_7a99_4554_9e6f_b60493200978.slice. Dec 16 13:40:36.417387 systemd[1]: Created slice kubepods-besteffort-pod59d04f45_d51d_4c79_b4da_d0334682cd90.slice - libcontainer container kubepods-besteffort-pod59d04f45_d51d_4c79_b4da_d0334682cd90.slice. Dec 16 13:40:36.420648 systemd[1]: Created slice kubepods-besteffort-podae6d0914_be51_4e9e_9abc_d81494f00693.slice - libcontainer container kubepods-besteffort-podae6d0914_be51_4e9e_9abc_d81494f00693.slice. 
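The slice names systemd just created follow the naming pattern used by kubelet's systemd cgroup driver and visible directly in the log: a kubepods- prefix, the pod's QoS class (besteffort or burstable here; by the same convention Guaranteed pods omit the QoS segment), and the pod UID with dashes mapped to underscores. A small Go sketch of that mapping (illustrative of the pattern, not kubelet's actual code):

package main

import (
	"fmt"
	"strings"
)

// podSlice renders kubepods-<qos>-pod<uid with '-' replaced by '_'>.slice.
func podSlice(qos, uid string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qos, strings.ReplaceAll(uid, "-", "_"))
}

func main() {
	fmt.Println(podSlice("burstable", "94bfd095-3b1b-4404-872c-700cc2c230a9"))
	// prints kubepods-burstable-pod94bfd095_3b1b_4404_872c_700cc2c230a9.slice,
	// the first slice created in the log entries above.
}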
Dec 16 13:40:36.485746 containerd[1779]: time="2025-12-16T13:40:36.485699743Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 13:40:36.497428 kubelet[3080]: I1216 13:40:36.497367 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdnqc\" (UniqueName: \"kubernetes.io/projected/4477497f-a609-4791-b674-25df11e8ec73-kube-api-access-sdnqc\") pod \"calico-kube-controllers-5d9cf7b5c4-4vfj5\" (UID: \"4477497f-a609-4791-b674-25df11e8ec73\") " pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" Dec 16 13:40:36.497428 kubelet[3080]: I1216 13:40:36.497408 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22edb7e4-dff8-4395-80d0-f36295b5be55-config-volume\") pod \"coredns-674b8bbfcf-qpg2m\" (UID: \"22edb7e4-dff8-4395-80d0-f36295b5be55\") " pod="kube-system/coredns-674b8bbfcf-qpg2m" Dec 16 13:40:36.497824 kubelet[3080]: I1216 13:40:36.497452 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94bfd095-3b1b-4404-872c-700cc2c230a9-config-volume\") pod \"coredns-674b8bbfcf-hpqzp\" (UID: \"94bfd095-3b1b-4404-872c-700cc2c230a9\") " pod="kube-system/coredns-674b8bbfcf-hpqzp" Dec 16 13:40:36.497824 kubelet[3080]: I1216 13:40:36.497477 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc92ca59-7a99-4554-9e6f-b60493200978-config\") pod \"goldmane-666569f655-zxbn9\" (UID: \"bc92ca59-7a99-4554-9e6f-b60493200978\") " pod="calico-system/goldmane-666569f655-zxbn9" Dec 16 13:40:36.497824 kubelet[3080]: I1216 13:40:36.497492 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns7vw\" (UniqueName: \"kubernetes.io/projected/1fd68d8f-7c37-4d7e-a053-da6704f9d286-kube-api-access-ns7vw\") pod \"whisker-5b6dc97d4d-lggqm\" (UID: \"1fd68d8f-7c37-4d7e-a053-da6704f9d286\") " pod="calico-system/whisker-5b6dc97d4d-lggqm" Dec 16 13:40:36.497824 kubelet[3080]: I1216 13:40:36.497508 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2grc9\" (UniqueName: \"kubernetes.io/projected/59d04f45-d51d-4c79-b4da-d0334682cd90-kube-api-access-2grc9\") pod \"calico-apiserver-574757c556-w6l7d\" (UID: \"59d04f45-d51d-4c79-b4da-d0334682cd90\") " pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" Dec 16 13:40:36.497824 kubelet[3080]: I1216 13:40:36.497526 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ae6d0914-be51-4e9e-9abc-d81494f00693-calico-apiserver-certs\") pod \"calico-apiserver-574757c556-b26rp\" (UID: \"ae6d0914-be51-4e9e-9abc-d81494f00693\") " pod="calico-apiserver/calico-apiserver-574757c556-b26rp" Dec 16 13:40:36.497944 kubelet[3080]: I1216 13:40:36.497541 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1fd68d8f-7c37-4d7e-a053-da6704f9d286-whisker-backend-key-pair\") pod \"whisker-5b6dc97d4d-lggqm\" (UID: \"1fd68d8f-7c37-4d7e-a053-da6704f9d286\") " pod="calico-system/whisker-5b6dc97d4d-lggqm" Dec 16 13:40:36.497944 kubelet[3080]: I1216 13:40:36.497578 3080 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkcfk\" (UniqueName: \"kubernetes.io/projected/22edb7e4-dff8-4395-80d0-f36295b5be55-kube-api-access-pkcfk\") pod \"coredns-674b8bbfcf-qpg2m\" (UID: \"22edb7e4-dff8-4395-80d0-f36295b5be55\") " pod="kube-system/coredns-674b8bbfcf-qpg2m" Dec 16 13:40:36.497944 kubelet[3080]: I1216 13:40:36.497599 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc92ca59-7a99-4554-9e6f-b60493200978-goldmane-ca-bundle\") pod \"goldmane-666569f655-zxbn9\" (UID: \"bc92ca59-7a99-4554-9e6f-b60493200978\") " pod="calico-system/goldmane-666569f655-zxbn9" Dec 16 13:40:36.497944 kubelet[3080]: I1216 13:40:36.497614 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/bc92ca59-7a99-4554-9e6f-b60493200978-goldmane-key-pair\") pod \"goldmane-666569f655-zxbn9\" (UID: \"bc92ca59-7a99-4554-9e6f-b60493200978\") " pod="calico-system/goldmane-666569f655-zxbn9" Dec 16 13:40:36.497944 kubelet[3080]: I1216 13:40:36.497629 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbfcp\" (UniqueName: \"kubernetes.io/projected/bc92ca59-7a99-4554-9e6f-b60493200978-kube-api-access-cbfcp\") pod \"goldmane-666569f655-zxbn9\" (UID: \"bc92ca59-7a99-4554-9e6f-b60493200978\") " pod="calico-system/goldmane-666569f655-zxbn9" Dec 16 13:40:36.498047 kubelet[3080]: I1216 13:40:36.497645 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88k89\" (UniqueName: \"kubernetes.io/projected/ae6d0914-be51-4e9e-9abc-d81494f00693-kube-api-access-88k89\") pod \"calico-apiserver-574757c556-b26rp\" (UID: \"ae6d0914-be51-4e9e-9abc-d81494f00693\") " pod="calico-apiserver/calico-apiserver-574757c556-b26rp" Dec 16 13:40:36.498047 kubelet[3080]: I1216 13:40:36.497660 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4477497f-a609-4791-b674-25df11e8ec73-tigera-ca-bundle\") pod \"calico-kube-controllers-5d9cf7b5c4-4vfj5\" (UID: \"4477497f-a609-4791-b674-25df11e8ec73\") " pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" Dec 16 13:40:36.498047 kubelet[3080]: I1216 13:40:36.497676 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fd68d8f-7c37-4d7e-a053-da6704f9d286-whisker-ca-bundle\") pod \"whisker-5b6dc97d4d-lggqm\" (UID: \"1fd68d8f-7c37-4d7e-a053-da6704f9d286\") " pod="calico-system/whisker-5b6dc97d4d-lggqm" Dec 16 13:40:36.498047 kubelet[3080]: I1216 13:40:36.497689 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/59d04f45-d51d-4c79-b4da-d0334682cd90-calico-apiserver-certs\") pod \"calico-apiserver-574757c556-w6l7d\" (UID: \"59d04f45-d51d-4c79-b4da-d0334682cd90\") " pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" Dec 16 13:40:36.498047 kubelet[3080]: I1216 13:40:36.497702 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wslnf\" (UniqueName: 
\"kubernetes.io/projected/94bfd095-3b1b-4404-872c-700cc2c230a9-kube-api-access-wslnf\") pod \"coredns-674b8bbfcf-hpqzp\" (UID: \"94bfd095-3b1b-4404-872c-700cc2c230a9\") " pod="kube-system/coredns-674b8bbfcf-hpqzp" Dec 16 13:40:36.691261 containerd[1779]: time="2025-12-16T13:40:36.691225934Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hpqzp,Uid:94bfd095-3b1b-4404-872c-700cc2c230a9,Namespace:kube-system,Attempt:0,}" Dec 16 13:40:36.697922 containerd[1779]: time="2025-12-16T13:40:36.697885257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d9cf7b5c4-4vfj5,Uid:4477497f-a609-4791-b674-25df11e8ec73,Namespace:calico-system,Attempt:0,}" Dec 16 13:40:36.702854 containerd[1779]: time="2025-12-16T13:40:36.702819017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b6dc97d4d-lggqm,Uid:1fd68d8f-7c37-4d7e-a053-da6704f9d286,Namespace:calico-system,Attempt:0,}" Dec 16 13:40:36.709687 containerd[1779]: time="2025-12-16T13:40:36.709649953Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qpg2m,Uid:22edb7e4-dff8-4395-80d0-f36295b5be55,Namespace:kube-system,Attempt:0,}" Dec 16 13:40:36.716160 containerd[1779]: time="2025-12-16T13:40:36.715895651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-zxbn9,Uid:bc92ca59-7a99-4554-9e6f-b60493200978,Namespace:calico-system,Attempt:0,}" Dec 16 13:40:36.720862 containerd[1779]: time="2025-12-16T13:40:36.719876147Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-574757c556-w6l7d,Uid:59d04f45-d51d-4c79-b4da-d0334682cd90,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:40:36.725809 containerd[1779]: time="2025-12-16T13:40:36.725766701Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-574757c556-b26rp,Uid:ae6d0914-be51-4e9e-9abc-d81494f00693,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:40:36.746165 containerd[1779]: time="2025-12-16T13:40:36.746120673Z" level=error msg="Failed to destroy network for sandbox \"d2936da3b58d0127460b24e88a347fb9e9abaa1319e448e32052c42addb622c8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.748476 containerd[1779]: time="2025-12-16T13:40:36.748436091Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hpqzp,Uid:94bfd095-3b1b-4404-872c-700cc2c230a9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d2936da3b58d0127460b24e88a347fb9e9abaa1319e448e32052c42addb622c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.748697 kubelet[3080]: E1216 13:40:36.748657 3080 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d2936da3b58d0127460b24e88a347fb9e9abaa1319e448e32052c42addb622c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.748745 kubelet[3080]: E1216 13:40:36.748729 3080 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"d2936da3b58d0127460b24e88a347fb9e9abaa1319e448e32052c42addb622c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-hpqzp" Dec 16 13:40:36.748771 kubelet[3080]: E1216 13:40:36.748755 3080 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d2936da3b58d0127460b24e88a347fb9e9abaa1319e448e32052c42addb622c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-hpqzp" Dec 16 13:40:36.748826 kubelet[3080]: E1216 13:40:36.748804 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-hpqzp_kube-system(94bfd095-3b1b-4404-872c-700cc2c230a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-hpqzp_kube-system(94bfd095-3b1b-4404-872c-700cc2c230a9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d2936da3b58d0127460b24e88a347fb9e9abaa1319e448e32052c42addb622c8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-hpqzp" podUID="94bfd095-3b1b-4404-872c-700cc2c230a9" Dec 16 13:40:36.752881 containerd[1779]: time="2025-12-16T13:40:36.752849423Z" level=error msg="Failed to destroy network for sandbox \"cbc11c7305a2262e6be64f1d037eacef5e5654a6829c6b4070e5bb09c649228f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.755135 containerd[1779]: time="2025-12-16T13:40:36.755099704Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d9cf7b5c4-4vfj5,Uid:4477497f-a609-4791-b674-25df11e8ec73,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbc11c7305a2262e6be64f1d037eacef5e5654a6829c6b4070e5bb09c649228f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.755313 kubelet[3080]: E1216 13:40:36.755278 3080 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbc11c7305a2262e6be64f1d037eacef5e5654a6829c6b4070e5bb09c649228f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.755375 kubelet[3080]: E1216 13:40:36.755340 3080 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbc11c7305a2262e6be64f1d037eacef5e5654a6829c6b4070e5bb09c649228f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" Dec 16 13:40:36.755375 kubelet[3080]: 
E1216 13:40:36.755360 3080 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbc11c7305a2262e6be64f1d037eacef5e5654a6829c6b4070e5bb09c649228f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" Dec 16 13:40:36.755486 kubelet[3080]: E1216 13:40:36.755407 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5d9cf7b5c4-4vfj5_calico-system(4477497f-a609-4791-b674-25df11e8ec73)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5d9cf7b5c4-4vfj5_calico-system(4477497f-a609-4791-b674-25df11e8ec73)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cbc11c7305a2262e6be64f1d037eacef5e5654a6829c6b4070e5bb09c649228f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:40:36.763697 containerd[1779]: time="2025-12-16T13:40:36.763650986Z" level=error msg="Failed to destroy network for sandbox \"aa6d98ca7e1b1031aedd6355bfb2ca13fae7b75a0b085f6fef1b35f07064fe1a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.765355 containerd[1779]: time="2025-12-16T13:40:36.765315678Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b6dc97d4d-lggqm,Uid:1fd68d8f-7c37-4d7e-a053-da6704f9d286,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa6d98ca7e1b1031aedd6355bfb2ca13fae7b75a0b085f6fef1b35f07064fe1a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.765560 kubelet[3080]: E1216 13:40:36.765505 3080 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa6d98ca7e1b1031aedd6355bfb2ca13fae7b75a0b085f6fef1b35f07064fe1a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.765605 kubelet[3080]: E1216 13:40:36.765578 3080 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa6d98ca7e1b1031aedd6355bfb2ca13fae7b75a0b085f6fef1b35f07064fe1a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5b6dc97d4d-lggqm" Dec 16 13:40:36.765605 kubelet[3080]: E1216 13:40:36.765599 3080 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa6d98ca7e1b1031aedd6355bfb2ca13fae7b75a0b085f6fef1b35f07064fe1a\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5b6dc97d4d-lggqm" Dec 16 13:40:36.765653 containerd[1779]: time="2025-12-16T13:40:36.765580420Z" level=error msg="Failed to destroy network for sandbox \"befe01ff2b358644776fa05b626db197459f82a579542aa59c4e9f37632bf8c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.765676 kubelet[3080]: E1216 13:40:36.765648 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5b6dc97d4d-lggqm_calico-system(1fd68d8f-7c37-4d7e-a053-da6704f9d286)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5b6dc97d4d-lggqm_calico-system(1fd68d8f-7c37-4d7e-a053-da6704f9d286)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aa6d98ca7e1b1031aedd6355bfb2ca13fae7b75a0b085f6fef1b35f07064fe1a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5b6dc97d4d-lggqm" podUID="1fd68d8f-7c37-4d7e-a053-da6704f9d286" Dec 16 13:40:36.769727 containerd[1779]: time="2025-12-16T13:40:36.769690226Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qpg2m,Uid:22edb7e4-dff8-4395-80d0-f36295b5be55,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"befe01ff2b358644776fa05b626db197459f82a579542aa59c4e9f37632bf8c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.769908 kubelet[3080]: E1216 13:40:36.769875 3080 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"befe01ff2b358644776fa05b626db197459f82a579542aa59c4e9f37632bf8c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.769954 kubelet[3080]: E1216 13:40:36.769926 3080 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"befe01ff2b358644776fa05b626db197459f82a579542aa59c4e9f37632bf8c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-qpg2m" Dec 16 13:40:36.769954 kubelet[3080]: E1216 13:40:36.769946 3080 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"befe01ff2b358644776fa05b626db197459f82a579542aa59c4e9f37632bf8c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-qpg2m" Dec 16 13:40:36.770010 kubelet[3080]: E1216 13:40:36.769990 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-674b8bbfcf-qpg2m_kube-system(22edb7e4-dff8-4395-80d0-f36295b5be55)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-qpg2m_kube-system(22edb7e4-dff8-4395-80d0-f36295b5be55)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"befe01ff2b358644776fa05b626db197459f82a579542aa59c4e9f37632bf8c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-qpg2m" podUID="22edb7e4-dff8-4395-80d0-f36295b5be55" Dec 16 13:40:36.772203 containerd[1779]: time="2025-12-16T13:40:36.772129547Z" level=error msg="Failed to destroy network for sandbox \"68f5243fb545057b0d69d01c3b948f62e3473fec56dfb363f42c3c0933016baa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.774816 containerd[1779]: time="2025-12-16T13:40:36.774778714Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-574757c556-w6l7d,Uid:59d04f45-d51d-4c79-b4da-d0334682cd90,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"68f5243fb545057b0d69d01c3b948f62e3473fec56dfb363f42c3c0933016baa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.775124 kubelet[3080]: E1216 13:40:36.775092 3080 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"68f5243fb545057b0d69d01c3b948f62e3473fec56dfb363f42c3c0933016baa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.775176 kubelet[3080]: E1216 13:40:36.775138 3080 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"68f5243fb545057b0d69d01c3b948f62e3473fec56dfb363f42c3c0933016baa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" Dec 16 13:40:36.775176 kubelet[3080]: E1216 13:40:36.775156 3080 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"68f5243fb545057b0d69d01c3b948f62e3473fec56dfb363f42c3c0933016baa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" Dec 16 13:40:36.775232 kubelet[3080]: E1216 13:40:36.775210 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-574757c556-w6l7d_calico-apiserver(59d04f45-d51d-4c79-b4da-d0334682cd90)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-574757c556-w6l7d_calico-apiserver(59d04f45-d51d-4c79-b4da-d0334682cd90)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"68f5243fb545057b0d69d01c3b948f62e3473fec56dfb363f42c3c0933016baa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:40:36.775313 containerd[1779]: time="2025-12-16T13:40:36.775289353Z" level=error msg="Failed to destroy network for sandbox \"414306897061f005bb5ce2bce5247ae0f66eaa572d9f90a88f5e4b95e9791f60\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.777094 containerd[1779]: time="2025-12-16T13:40:36.777049827Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-zxbn9,Uid:bc92ca59-7a99-4554-9e6f-b60493200978,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"414306897061f005bb5ce2bce5247ae0f66eaa572d9f90a88f5e4b95e9791f60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.777217 kubelet[3080]: E1216 13:40:36.777194 3080 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"414306897061f005bb5ce2bce5247ae0f66eaa572d9f90a88f5e4b95e9791f60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.777256 kubelet[3080]: E1216 13:40:36.777233 3080 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"414306897061f005bb5ce2bce5247ae0f66eaa572d9f90a88f5e4b95e9791f60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-zxbn9" Dec 16 13:40:36.777280 kubelet[3080]: E1216 13:40:36.777259 3080 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"414306897061f005bb5ce2bce5247ae0f66eaa572d9f90a88f5e4b95e9791f60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-zxbn9" Dec 16 13:40:36.777316 kubelet[3080]: E1216 13:40:36.777298 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-zxbn9_calico-system(bc92ca59-7a99-4554-9e6f-b60493200978)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-zxbn9_calico-system(bc92ca59-7a99-4554-9e6f-b60493200978)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"414306897061f005bb5ce2bce5247ae0f66eaa572d9f90a88f5e4b95e9791f60\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-zxbn9" 
podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:40:36.785426 containerd[1779]: time="2025-12-16T13:40:36.785389993Z" level=error msg="Failed to destroy network for sandbox \"7eefafa4f7829c3498a1817dda1077b277522a72751b27960d8645d92777418f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.787515 containerd[1779]: time="2025-12-16T13:40:36.787484686Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-574757c556-b26rp,Uid:ae6d0914-be51-4e9e-9abc-d81494f00693,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7eefafa4f7829c3498a1817dda1077b277522a72751b27960d8645d92777418f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.789576 kubelet[3080]: E1216 13:40:36.787679 3080 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7eefafa4f7829c3498a1817dda1077b277522a72751b27960d8645d92777418f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:36.789576 kubelet[3080]: E1216 13:40:36.787728 3080 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7eefafa4f7829c3498a1817dda1077b277522a72751b27960d8645d92777418f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" Dec 16 13:40:36.789576 kubelet[3080]: E1216 13:40:36.787750 3080 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7eefafa4f7829c3498a1817dda1077b277522a72751b27960d8645d92777418f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" Dec 16 13:40:36.789715 kubelet[3080]: E1216 13:40:36.787795 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-574757c556-b26rp_calico-apiserver(ae6d0914-be51-4e9e-9abc-d81494f00693)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-574757c556-b26rp_calico-apiserver(ae6d0914-be51-4e9e-9abc-d81494f00693)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7eefafa4f7829c3498a1817dda1077b277522a72751b27960d8645d92777418f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:40:36.792273 systemd[1]: run-netns-cni\x2d1aa44c8f\x2d5969\x2dc73b\x2d07b4\x2d7fd4e78f6b90.mount: Deactivated successfully. 
Dec 16 13:40:37.417844 systemd[1]: Created slice kubepods-besteffort-pod713c1552_24ab_4b60_9872_de4f52adb23b.slice - libcontainer container kubepods-besteffort-pod713c1552_24ab_4b60_9872_de4f52adb23b.slice. Dec 16 13:40:37.419666 containerd[1779]: time="2025-12-16T13:40:37.419637974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wzss6,Uid:713c1552-24ab-4b60-9872-de4f52adb23b,Namespace:calico-system,Attempt:0,}" Dec 16 13:40:37.461995 containerd[1779]: time="2025-12-16T13:40:37.461943106Z" level=error msg="Failed to destroy network for sandbox \"c988464929f578dba402e60e4306bedeeedcf39fb3d2148f117212b72af12c6f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:37.463758 systemd[1]: run-netns-cni\x2d3e0997c0\x2df6ef\x2d9e2e\x2d4ab7\x2de8001c698cef.mount: Deactivated successfully. Dec 16 13:40:37.464163 containerd[1779]: time="2025-12-16T13:40:37.464069828Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wzss6,Uid:713c1552-24ab-4b60-9872-de4f52adb23b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c988464929f578dba402e60e4306bedeeedcf39fb3d2148f117212b72af12c6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:37.464332 kubelet[3080]: E1216 13:40:37.464294 3080 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c988464929f578dba402e60e4306bedeeedcf39fb3d2148f117212b72af12c6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:40:37.464376 kubelet[3080]: E1216 13:40:37.464347 3080 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c988464929f578dba402e60e4306bedeeedcf39fb3d2148f117212b72af12c6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-wzss6" Dec 16 13:40:37.464376 kubelet[3080]: E1216 13:40:37.464371 3080 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c988464929f578dba402e60e4306bedeeedcf39fb3d2148f117212b72af12c6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-wzss6" Dec 16 13:40:37.464453 kubelet[3080]: E1216 13:40:37.464432 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-wzss6_calico-system(713c1552-24ab-4b60-9872-de4f52adb23b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-wzss6_calico-system(713c1552-24ab-4b60-9872-de4f52adb23b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c988464929f578dba402e60e4306bedeeedcf39fb3d2148f117212b72af12c6f\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:40:43.111008 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2613172075.mount: Deactivated successfully. Dec 16 13:40:43.130846 containerd[1779]: time="2025-12-16T13:40:43.130795903Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:43.131906 containerd[1779]: time="2025-12-16T13:40:43.131874492Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Dec 16 13:40:43.133331 containerd[1779]: time="2025-12-16T13:40:43.133290590Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:43.135244 containerd[1779]: time="2025-12-16T13:40:43.135213770Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:40:43.135598 containerd[1779]: time="2025-12-16T13:40:43.135573533Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 6.649825838s" Dec 16 13:40:43.135634 containerd[1779]: time="2025-12-16T13:40:43.135603825Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Dec 16 13:40:43.155215 containerd[1779]: time="2025-12-16T13:40:43.155170398Z" level=info msg="CreateContainer within sandbox \"966c8a7be74c14f6cf939cc4d2daa54cbbe8cf0f78cb4c059bfd6149c31cfeef\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 13:40:43.166119 containerd[1779]: time="2025-12-16T13:40:43.166027948Z" level=info msg="Container dcbf3b680371b142aff21b6c85d4d7b82c775843bd989262fbf04e2a684ae9c8: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:40:43.177664 containerd[1779]: time="2025-12-16T13:40:43.177563104Z" level=info msg="CreateContainer within sandbox \"966c8a7be74c14f6cf939cc4d2daa54cbbe8cf0f78cb4c059bfd6149c31cfeef\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"dcbf3b680371b142aff21b6c85d4d7b82c775843bd989262fbf04e2a684ae9c8\"" Dec 16 13:40:43.178235 containerd[1779]: time="2025-12-16T13:40:43.178156111Z" level=info msg="StartContainer for \"dcbf3b680371b142aff21b6c85d4d7b82c775843bd989262fbf04e2a684ae9c8\"" Dec 16 13:40:43.179368 containerd[1779]: time="2025-12-16T13:40:43.179345698Z" level=info msg="connecting to shim dcbf3b680371b142aff21b6c85d4d7b82c775843bd989262fbf04e2a684ae9c8" address="unix:///run/containerd/s/f86a0b195f8d78164b7683fe688c4a2f3b017a716e22c5831fb9b4086eebaea5" protocol=ttrpc version=3 Dec 16 13:40:43.211798 systemd[1]: Started cri-containerd-dcbf3b680371b142aff21b6c85d4d7b82c775843bd989262fbf04e2a684ae9c8.scope - libcontainer container dcbf3b680371b142aff21b6c85d4d7b82c775843bd989262fbf04e2a684ae9c8. 
Dec 16 13:40:43.296650 containerd[1779]: time="2025-12-16T13:40:43.296600096Z" level=info msg="StartContainer for \"dcbf3b680371b142aff21b6c85d4d7b82c775843bd989262fbf04e2a684ae9c8\" returns successfully" Dec 16 13:40:43.370926 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 16 13:40:43.371049 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Dec 16 13:40:43.515569 kubelet[3080]: I1216 13:40:43.515233 3080 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-tgbz2" podStartSLOduration=1.698081319 podStartE2EDuration="16.515215957s" podCreationTimestamp="2025-12-16 13:40:27 +0000 UTC" firstStartedPulling="2025-12-16 13:40:28.319874794 +0000 UTC m=+21.987121851" lastFinishedPulling="2025-12-16 13:40:43.137009433 +0000 UTC m=+36.804256489" observedRunningTime="2025-12-16 13:40:43.515178006 +0000 UTC m=+37.182425085" watchObservedRunningTime="2025-12-16 13:40:43.515215957 +0000 UTC m=+37.182463014" Dec 16 13:40:43.542395 kubelet[3080]: I1216 13:40:43.542342 3080 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1fd68d8f-7c37-4d7e-a053-da6704f9d286-whisker-backend-key-pair\") pod \"1fd68d8f-7c37-4d7e-a053-da6704f9d286\" (UID: \"1fd68d8f-7c37-4d7e-a053-da6704f9d286\") " Dec 16 13:40:43.542395 kubelet[3080]: I1216 13:40:43.542380 3080 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fd68d8f-7c37-4d7e-a053-da6704f9d286-whisker-ca-bundle\") pod \"1fd68d8f-7c37-4d7e-a053-da6704f9d286\" (UID: \"1fd68d8f-7c37-4d7e-a053-da6704f9d286\") " Dec 16 13:40:43.542395 kubelet[3080]: I1216 13:40:43.542405 3080 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns7vw\" (UniqueName: \"kubernetes.io/projected/1fd68d8f-7c37-4d7e-a053-da6704f9d286-kube-api-access-ns7vw\") pod \"1fd68d8f-7c37-4d7e-a053-da6704f9d286\" (UID: \"1fd68d8f-7c37-4d7e-a053-da6704f9d286\") " Dec 16 13:40:43.543116 kubelet[3080]: I1216 13:40:43.543083 3080 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fd68d8f-7c37-4d7e-a053-da6704f9d286-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "1fd68d8f-7c37-4d7e-a053-da6704f9d286" (UID: "1fd68d8f-7c37-4d7e-a053-da6704f9d286"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 13:40:43.545000 kubelet[3080]: I1216 13:40:43.544967 3080 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fd68d8f-7c37-4d7e-a053-da6704f9d286-kube-api-access-ns7vw" (OuterVolumeSpecName: "kube-api-access-ns7vw") pod "1fd68d8f-7c37-4d7e-a053-da6704f9d286" (UID: "1fd68d8f-7c37-4d7e-a053-da6704f9d286"). InnerVolumeSpecName "kube-api-access-ns7vw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 13:40:43.545000 kubelet[3080]: I1216 13:40:43.544983 3080 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd68d8f-7c37-4d7e-a053-da6704f9d286-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "1fd68d8f-7c37-4d7e-a053-da6704f9d286" (UID: "1fd68d8f-7c37-4d7e-a053-da6704f9d286"). InnerVolumeSpecName "whisker-backend-key-pair".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 13:40:43.643223 kubelet[3080]: I1216 13:40:43.643127 3080 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ns7vw\" (UniqueName: \"kubernetes.io/projected/1fd68d8f-7c37-4d7e-a053-da6704f9d286-kube-api-access-ns7vw\") on node \"ci-4459-2-2-a-7f096d1947\" DevicePath \"\"" Dec 16 13:40:43.643223 kubelet[3080]: I1216 13:40:43.643155 3080 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1fd68d8f-7c37-4d7e-a053-da6704f9d286-whisker-backend-key-pair\") on node \"ci-4459-2-2-a-7f096d1947\" DevicePath \"\"" Dec 16 13:40:43.643223 kubelet[3080]: I1216 13:40:43.643164 3080 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1fd68d8f-7c37-4d7e-a053-da6704f9d286-whisker-ca-bundle\") on node \"ci-4459-2-2-a-7f096d1947\" DevicePath \"\"" Dec 16 13:40:43.802460 systemd[1]: Removed slice kubepods-besteffort-pod1fd68d8f_7c37_4d7e_a053_da6704f9d286.slice - libcontainer container kubepods-besteffort-pod1fd68d8f_7c37_4d7e_a053_da6704f9d286.slice. Dec 16 13:40:43.856049 systemd[1]: Created slice kubepods-besteffort-pod938ae2d1_4432_47b7_a9d3_a5acd90ddc02.slice - libcontainer container kubepods-besteffort-pod938ae2d1_4432_47b7_a9d3_a5acd90ddc02.slice. Dec 16 13:40:43.945155 kubelet[3080]: I1216 13:40:43.945041 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/938ae2d1-4432-47b7-a9d3-a5acd90ddc02-whisker-ca-bundle\") pod \"whisker-7bdfb97df8-fdm7d\" (UID: \"938ae2d1-4432-47b7-a9d3-a5acd90ddc02\") " pod="calico-system/whisker-7bdfb97df8-fdm7d" Dec 16 13:40:43.945155 kubelet[3080]: I1216 13:40:43.945091 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn2wh\" (UniqueName: \"kubernetes.io/projected/938ae2d1-4432-47b7-a9d3-a5acd90ddc02-kube-api-access-vn2wh\") pod \"whisker-7bdfb97df8-fdm7d\" (UID: \"938ae2d1-4432-47b7-a9d3-a5acd90ddc02\") " pod="calico-system/whisker-7bdfb97df8-fdm7d" Dec 16 13:40:43.945155 kubelet[3080]: I1216 13:40:43.945115 3080 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/938ae2d1-4432-47b7-a9d3-a5acd90ddc02-whisker-backend-key-pair\") pod \"whisker-7bdfb97df8-fdm7d\" (UID: \"938ae2d1-4432-47b7-a9d3-a5acd90ddc02\") " pod="calico-system/whisker-7bdfb97df8-fdm7d" Dec 16 13:40:44.113489 systemd[1]: var-lib-kubelet-pods-1fd68d8f\x2d7c37\x2d4d7e\x2da053\x2dda6704f9d286-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dns7vw.mount: Deactivated successfully. Dec 16 13:40:44.113586 systemd[1]: var-lib-kubelet-pods-1fd68d8f\x2d7c37\x2d4d7e\x2da053\x2dda6704f9d286-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Dec 16 13:40:44.159059 containerd[1779]: time="2025-12-16T13:40:44.159019638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bdfb97df8-fdm7d,Uid:938ae2d1-4432-47b7-a9d3-a5acd90ddc02,Namespace:calico-system,Attempt:0,}" Dec 16 13:40:44.246166 systemd-networkd[1684]: cali6fb1f73abc7: Link UP Dec 16 13:40:44.246327 systemd-networkd[1684]: cali6fb1f73abc7: Gained carrier Dec 16 13:40:44.257888 containerd[1779]: 2025-12-16 13:40:44.178 [INFO][4382] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 13:40:44.257888 containerd[1779]: 2025-12-16 13:40:44.188 [INFO][4382] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-eth0 whisker-7bdfb97df8- calico-system 938ae2d1-4432-47b7-a9d3-a5acd90ddc02 890 0 2025-12-16 13:40:43 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7bdfb97df8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-a-7f096d1947 whisker-7bdfb97df8-fdm7d eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali6fb1f73abc7 [] [] }} ContainerID="9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" Namespace="calico-system" Pod="whisker-7bdfb97df8-fdm7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-" Dec 16 13:40:44.257888 containerd[1779]: 2025-12-16 13:40:44.188 [INFO][4382] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" Namespace="calico-system" Pod="whisker-7bdfb97df8-fdm7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-eth0" Dec 16 13:40:44.257888 containerd[1779]: 2025-12-16 13:40:44.208 [INFO][4397] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" HandleID="k8s-pod-network.9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" Workload="ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-eth0" Dec 16 13:40:44.258109 containerd[1779]: 2025-12-16 13:40:44.209 [INFO][4397] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" HandleID="k8s-pod-network.9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" Workload="ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e670), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-a-7f096d1947", "pod":"whisker-7bdfb97df8-fdm7d", "timestamp":"2025-12-16 13:40:44.208947339 +0000 UTC"}, Hostname:"ci-4459-2-2-a-7f096d1947", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:40:44.258109 containerd[1779]: 2025-12-16 13:40:44.209 [INFO][4397] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:40:44.258109 containerd[1779]: 2025-12-16 13:40:44.209 [INFO][4397] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:40:44.258109 containerd[1779]: 2025-12-16 13:40:44.209 [INFO][4397] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-a-7f096d1947' Dec 16 13:40:44.258109 containerd[1779]: 2025-12-16 13:40:44.215 [INFO][4397] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:44.258109 containerd[1779]: 2025-12-16 13:40:44.219 [INFO][4397] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:44.258109 containerd[1779]: 2025-12-16 13:40:44.222 [INFO][4397] ipam/ipam.go 511: Trying affinity for 192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:44.258109 containerd[1779]: 2025-12-16 13:40:44.224 [INFO][4397] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:44.258109 containerd[1779]: 2025-12-16 13:40:44.225 [INFO][4397] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:44.258308 containerd[1779]: 2025-12-16 13:40:44.225 [INFO][4397] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.32.0/26 handle="k8s-pod-network.9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:44.258308 containerd[1779]: 2025-12-16 13:40:44.226 [INFO][4397] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5 Dec 16 13:40:44.258308 containerd[1779]: 2025-12-16 13:40:44.229 [INFO][4397] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.32.0/26 handle="k8s-pod-network.9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:44.258308 containerd[1779]: 2025-12-16 13:40:44.235 [INFO][4397] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.32.1/26] block=192.168.32.0/26 handle="k8s-pod-network.9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:44.258308 containerd[1779]: 2025-12-16 13:40:44.235 [INFO][4397] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.1/26] handle="k8s-pod-network.9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:44.258308 containerd[1779]: 2025-12-16 13:40:44.235 [INFO][4397] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:40:44.258308 containerd[1779]: 2025-12-16 13:40:44.235 [INFO][4397] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.32.1/26] IPv6=[] ContainerID="9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" HandleID="k8s-pod-network.9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" Workload="ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-eth0" Dec 16 13:40:44.258443 containerd[1779]: 2025-12-16 13:40:44.238 [INFO][4382] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" Namespace="calico-system" Pod="whisker-7bdfb97df8-fdm7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-eth0", GenerateName:"whisker-7bdfb97df8-", Namespace:"calico-system", SelfLink:"", UID:"938ae2d1-4432-47b7-a9d3-a5acd90ddc02", ResourceVersion:"890", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bdfb97df8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"", Pod:"whisker-7bdfb97df8-fdm7d", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.32.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali6fb1f73abc7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:44.258443 containerd[1779]: 2025-12-16 13:40:44.238 [INFO][4382] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.1/32] ContainerID="9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" Namespace="calico-system" Pod="whisker-7bdfb97df8-fdm7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-eth0" Dec 16 13:40:44.258511 containerd[1779]: 2025-12-16 13:40:44.238 [INFO][4382] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6fb1f73abc7 ContainerID="9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" Namespace="calico-system" Pod="whisker-7bdfb97df8-fdm7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-eth0" Dec 16 13:40:44.258511 containerd[1779]: 2025-12-16 13:40:44.246 [INFO][4382] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" Namespace="calico-system" Pod="whisker-7bdfb97df8-fdm7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-eth0" Dec 16 13:40:44.258564 containerd[1779]: 2025-12-16 13:40:44.246 [INFO][4382] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" 
Namespace="calico-system" Pod="whisker-7bdfb97df8-fdm7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-eth0", GenerateName:"whisker-7bdfb97df8-", Namespace:"calico-system", SelfLink:"", UID:"938ae2d1-4432-47b7-a9d3-a5acd90ddc02", ResourceVersion:"890", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bdfb97df8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5", Pod:"whisker-7bdfb97df8-fdm7d", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.32.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali6fb1f73abc7", MAC:"62:9b:d4:0b:28:64", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:44.258622 containerd[1779]: 2025-12-16 13:40:44.256 [INFO][4382] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" Namespace="calico-system" Pod="whisker-7bdfb97df8-fdm7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-whisker--7bdfb97df8--fdm7d-eth0" Dec 16 13:40:44.283110 containerd[1779]: time="2025-12-16T13:40:44.283071981Z" level=info msg="connecting to shim 9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5" address="unix:///run/containerd/s/581bd68782d3ac4bf8a8f2460ccb8ba87ea2c8323e43c55f47a5dd9ebbf3f5ee" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:44.315776 systemd[1]: Started cri-containerd-9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5.scope - libcontainer container 9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5. 
Dec 16 13:40:44.356736 containerd[1779]: time="2025-12-16T13:40:44.356697024Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bdfb97df8-fdm7d,Uid:938ae2d1-4432-47b7-a9d3-a5acd90ddc02,Namespace:calico-system,Attempt:0,} returns sandbox id \"9f8fbc178c9616b51d5085937e0d910e093526eb8786551a99f8cc9e15f6b8f5\"" Dec 16 13:40:44.357896 containerd[1779]: time="2025-12-16T13:40:44.357869280Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:40:44.415412 kubelet[3080]: I1216 13:40:44.415375 3080 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fd68d8f-7c37-4d7e-a053-da6704f9d286" path="/var/lib/kubelet/pods/1fd68d8f-7c37-4d7e-a053-da6704f9d286/volumes" Dec 16 13:40:44.713275 containerd[1779]: time="2025-12-16T13:40:44.713230189Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:40:44.714864 containerd[1779]: time="2025-12-16T13:40:44.714814655Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:40:44.714960 containerd[1779]: time="2025-12-16T13:40:44.714879622Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:40:44.716120 kubelet[3080]: E1216 13:40:44.715059 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:40:44.716120 kubelet[3080]: E1216 13:40:44.715973 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:40:44.718563 kubelet[3080]: E1216 13:40:44.716992 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:f859dfb58e5e4f5cbc43012fb92832ac,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vn2wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7bdfb97df8-fdm7d_calico-system(938ae2d1-4432-47b7-a9d3-a5acd90ddc02): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:40:44.720387 containerd[1779]: time="2025-12-16T13:40:44.720357175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:40:45.013487 systemd-networkd[1684]: vxlan.calico: Link UP Dec 16 13:40:45.013495 systemd-networkd[1684]: vxlan.calico: Gained carrier Dec 16 13:40:45.051653 containerd[1779]: time="2025-12-16T13:40:45.051591376Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:40:45.053594 containerd[1779]: time="2025-12-16T13:40:45.053512419Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:40:45.053695 containerd[1779]: time="2025-12-16T13:40:45.053590797Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:40:45.053843 kubelet[3080]: E1216 13:40:45.053795 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:40:45.053889 kubelet[3080]: E1216 13:40:45.053848 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:40:45.054004 kubelet[3080]: E1216 13:40:45.053965 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn2wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7bdfb97df8-fdm7d_calico-system(938ae2d1-4432-47b7-a9d3-a5acd90ddc02): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:40:45.055985 kubelet[3080]: E1216 13:40:45.055926 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:40:45.503683 kubelet[3080]: E1216 13:40:45.503640 3080 pod_workers.go:1301] "Error syncing pod, skipping" 
err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:40:46.274832 systemd-networkd[1684]: cali6fb1f73abc7: Gained IPv6LL Dec 16 13:40:46.658683 systemd-networkd[1684]: vxlan.calico: Gained IPv6LL Dec 16 13:40:47.414657 containerd[1779]: time="2025-12-16T13:40:47.414607348Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-574757c556-b26rp,Uid:ae6d0914-be51-4e9e-9abc-d81494f00693,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:40:47.514756 systemd-networkd[1684]: cali6f4a2f74418: Link UP Dec 16 13:40:47.515066 systemd-networkd[1684]: cali6f4a2f74418: Gained carrier Dec 16 13:40:47.526674 containerd[1779]: 2025-12-16 13:40:47.449 [INFO][4727] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-eth0 calico-apiserver-574757c556- calico-apiserver ae6d0914-be51-4e9e-9abc-d81494f00693 825 0 2025-12-16 13:40:23 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:574757c556 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-a-7f096d1947 calico-apiserver-574757c556-b26rp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali6f4a2f74418 [] [] }} ContainerID="7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-b26rp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-" Dec 16 13:40:47.526674 containerd[1779]: 2025-12-16 13:40:47.449 [INFO][4727] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-b26rp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-eth0" Dec 16 13:40:47.526674 containerd[1779]: 2025-12-16 13:40:47.474 [INFO][4745] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" HandleID="k8s-pod-network.7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" Workload="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-eth0" Dec 16 13:40:47.526895 containerd[1779]: 2025-12-16 13:40:47.474 [INFO][4745] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" 
HandleID="k8s-pod-network.7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" Workload="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002df070), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-a-7f096d1947", "pod":"calico-apiserver-574757c556-b26rp", "timestamp":"2025-12-16 13:40:47.474248253 +0000 UTC"}, Hostname:"ci-4459-2-2-a-7f096d1947", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:40:47.526895 containerd[1779]: 2025-12-16 13:40:47.474 [INFO][4745] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:40:47.526895 containerd[1779]: 2025-12-16 13:40:47.474 [INFO][4745] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 13:40:47.526895 containerd[1779]: 2025-12-16 13:40:47.474 [INFO][4745] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-a-7f096d1947' Dec 16 13:40:47.526895 containerd[1779]: 2025-12-16 13:40:47.481 [INFO][4745] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:47.526895 containerd[1779]: 2025-12-16 13:40:47.485 [INFO][4745] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:47.526895 containerd[1779]: 2025-12-16 13:40:47.490 [INFO][4745] ipam/ipam.go 511: Trying affinity for 192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:47.526895 containerd[1779]: 2025-12-16 13:40:47.493 [INFO][4745] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:47.526895 containerd[1779]: 2025-12-16 13:40:47.496 [INFO][4745] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:47.527123 containerd[1779]: 2025-12-16 13:40:47.496 [INFO][4745] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.32.0/26 handle="k8s-pod-network.7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:47.527123 containerd[1779]: 2025-12-16 13:40:47.498 [INFO][4745] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8 Dec 16 13:40:47.527123 containerd[1779]: 2025-12-16 13:40:47.503 [INFO][4745] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.32.0/26 handle="k8s-pod-network.7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:47.527123 containerd[1779]: 2025-12-16 13:40:47.510 [INFO][4745] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.32.2/26] block=192.168.32.0/26 handle="k8s-pod-network.7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:47.527123 containerd[1779]: 2025-12-16 13:40:47.511 [INFO][4745] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.2/26] handle="k8s-pod-network.7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:47.527123 containerd[1779]: 2025-12-16 13:40:47.511 [INFO][4745] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:40:47.527123 containerd[1779]: 2025-12-16 13:40:47.511 [INFO][4745] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.32.2/26] IPv6=[] ContainerID="7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" HandleID="k8s-pod-network.7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" Workload="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-eth0" Dec 16 13:40:47.527255 containerd[1779]: 2025-12-16 13:40:47.513 [INFO][4727] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-b26rp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-eth0", GenerateName:"calico-apiserver-574757c556-", Namespace:"calico-apiserver", SelfLink:"", UID:"ae6d0914-be51-4e9e-9abc-d81494f00693", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"574757c556", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"", Pod:"calico-apiserver-574757c556-b26rp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.32.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6f4a2f74418", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:47.527307 containerd[1779]: 2025-12-16 13:40:47.513 [INFO][4727] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.2/32] ContainerID="7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-b26rp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-eth0" Dec 16 13:40:47.527307 containerd[1779]: 2025-12-16 13:40:47.513 [INFO][4727] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6f4a2f74418 ContainerID="7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-b26rp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-eth0" Dec 16 13:40:47.527307 containerd[1779]: 2025-12-16 13:40:47.515 [INFO][4727] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-b26rp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-eth0" Dec 16 13:40:47.527365 containerd[1779]: 2025-12-16 13:40:47.515 
[INFO][4727] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-b26rp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-eth0", GenerateName:"calico-apiserver-574757c556-", Namespace:"calico-apiserver", SelfLink:"", UID:"ae6d0914-be51-4e9e-9abc-d81494f00693", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"574757c556", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8", Pod:"calico-apiserver-574757c556-b26rp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.32.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6f4a2f74418", MAC:"da:93:5e:32:7d:ae", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:47.527423 containerd[1779]: 2025-12-16 13:40:47.525 [INFO][4727] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-b26rp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--b26rp-eth0" Dec 16 13:40:47.551697 containerd[1779]: time="2025-12-16T13:40:47.551642617Z" level=info msg="connecting to shim 7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8" address="unix:///run/containerd/s/b9df207f383960513c2576c50174340c260bc7ee1a01325f60b96e593db98be9" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:47.577782 systemd[1]: Started cri-containerd-7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8.scope - libcontainer container 7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8. 
Dec 16 13:40:47.620993 containerd[1779]: time="2025-12-16T13:40:47.620943698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-574757c556-b26rp,Uid:ae6d0914-be51-4e9e-9abc-d81494f00693,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"7b7c53129beac9fc1b3dcaa92a75f4793fb1bcf2820ad39bff400c3e9ca685f8\"" Dec 16 13:40:47.622357 containerd[1779]: time="2025-12-16T13:40:47.622138102Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:40:47.943944 containerd[1779]: time="2025-12-16T13:40:47.943882631Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:40:47.945501 containerd[1779]: time="2025-12-16T13:40:47.945456510Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:40:47.945583 containerd[1779]: time="2025-12-16T13:40:47.945529209Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:40:47.945755 kubelet[3080]: E1216 13:40:47.945687 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:40:47.945755 kubelet[3080]: E1216 13:40:47.945744 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:40:47.946344 kubelet[3080]: E1216 13:40:47.945871 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-88k89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-574757c556-b26rp_calico-apiserver(ae6d0914-be51-4e9e-9abc-d81494f00693): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:40:47.947084 kubelet[3080]: E1216 13:40:47.947054 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:40:48.509797 kubelet[3080]: E1216 13:40:48.509763 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:40:48.834710 systemd-networkd[1684]: cali6f4a2f74418: Gained IPv6LL Dec 16 13:40:49.415801 containerd[1779]: time="2025-12-16T13:40:49.414852551Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wzss6,Uid:713c1552-24ab-4b60-9872-de4f52adb23b,Namespace:calico-system,Attempt:0,}" Dec 16 13:40:49.415801 containerd[1779]: time="2025-12-16T13:40:49.415303668Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-zxbn9,Uid:bc92ca59-7a99-4554-9e6f-b60493200978,Namespace:calico-system,Attempt:0,}" Dec 16 13:40:49.415801 containerd[1779]: time="2025-12-16T13:40:49.415320989Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-574757c556-w6l7d,Uid:59d04f45-d51d-4c79-b4da-d0334682cd90,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:40:49.511363 kubelet[3080]: E1216 13:40:49.511324 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound 
desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:40:49.523404 systemd-networkd[1684]: calif6d16bb442d: Link UP Dec 16 13:40:49.523804 systemd-networkd[1684]: calif6d16bb442d: Gained carrier Dec 16 13:40:49.538571 containerd[1779]: 2025-12-16 13:40:49.460 [INFO][4814] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-eth0 csi-node-driver- calico-system 713c1552-24ab-4b60-9872-de4f52adb23b 709 0 2025-12-16 13:40:28 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-2-a-7f096d1947 csi-node-driver-wzss6 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif6d16bb442d [] [] }} ContainerID="f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" Namespace="calico-system" Pod="csi-node-driver-wzss6" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-" Dec 16 13:40:49.538571 containerd[1779]: 2025-12-16 13:40:49.460 [INFO][4814] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" Namespace="calico-system" Pod="csi-node-driver-wzss6" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-eth0" Dec 16 13:40:49.538571 containerd[1779]: 2025-12-16 13:40:49.485 [INFO][4864] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" HandleID="k8s-pod-network.f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" Workload="ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-eth0" Dec 16 13:40:49.538845 containerd[1779]: 2025-12-16 13:40:49.485 [INFO][4864] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" HandleID="k8s-pod-network.f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" Workload="ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003b40e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-a-7f096d1947", "pod":"csi-node-driver-wzss6", "timestamp":"2025-12-16 13:40:49.485459109 +0000 UTC"}, Hostname:"ci-4459-2-2-a-7f096d1947", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:40:49.538845 containerd[1779]: 2025-12-16 13:40:49.485 [INFO][4864] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:40:49.538845 containerd[1779]: 2025-12-16 13:40:49.485 [INFO][4864] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:40:49.538845 containerd[1779]: 2025-12-16 13:40:49.485 [INFO][4864] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-a-7f096d1947' Dec 16 13:40:49.538845 containerd[1779]: 2025-12-16 13:40:49.493 [INFO][4864] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.538845 containerd[1779]: 2025-12-16 13:40:49.498 [INFO][4864] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.538845 containerd[1779]: 2025-12-16 13:40:49.503 [INFO][4864] ipam/ipam.go 511: Trying affinity for 192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.538845 containerd[1779]: 2025-12-16 13:40:49.505 [INFO][4864] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.538845 containerd[1779]: 2025-12-16 13:40:49.507 [INFO][4864] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.539067 containerd[1779]: 2025-12-16 13:40:49.507 [INFO][4864] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.32.0/26 handle="k8s-pod-network.f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.539067 containerd[1779]: 2025-12-16 13:40:49.508 [INFO][4864] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277 Dec 16 13:40:49.539067 containerd[1779]: 2025-12-16 13:40:49.513 [INFO][4864] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.32.0/26 handle="k8s-pod-network.f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.539067 containerd[1779]: 2025-12-16 13:40:49.519 [INFO][4864] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.32.3/26] block=192.168.32.0/26 handle="k8s-pod-network.f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.539067 containerd[1779]: 2025-12-16 13:40:49.520 [INFO][4864] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.3/26] handle="k8s-pod-network.f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.539067 containerd[1779]: 2025-12-16 13:40:49.520 [INFO][4864] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:40:49.539067 containerd[1779]: 2025-12-16 13:40:49.520 [INFO][4864] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.32.3/26] IPv6=[] ContainerID="f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" HandleID="k8s-pod-network.f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" Workload="ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-eth0" Dec 16 13:40:49.539193 containerd[1779]: 2025-12-16 13:40:49.521 [INFO][4814] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" Namespace="calico-system" Pod="csi-node-driver-wzss6" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"713c1552-24ab-4b60-9872-de4f52adb23b", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"", Pod:"csi-node-driver-wzss6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.32.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif6d16bb442d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:49.539242 containerd[1779]: 2025-12-16 13:40:49.521 [INFO][4814] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.3/32] ContainerID="f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" Namespace="calico-system" Pod="csi-node-driver-wzss6" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-eth0" Dec 16 13:40:49.539242 containerd[1779]: 2025-12-16 13:40:49.521 [INFO][4814] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif6d16bb442d ContainerID="f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" Namespace="calico-system" Pod="csi-node-driver-wzss6" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-eth0" Dec 16 13:40:49.539242 containerd[1779]: 2025-12-16 13:40:49.524 [INFO][4814] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" Namespace="calico-system" Pod="csi-node-driver-wzss6" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-eth0" Dec 16 13:40:49.539310 containerd[1779]: 2025-12-16 13:40:49.524 [INFO][4814] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" Namespace="calico-system" Pod="csi-node-driver-wzss6" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"713c1552-24ab-4b60-9872-de4f52adb23b", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277", Pod:"csi-node-driver-wzss6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.32.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif6d16bb442d", MAC:"3a:5a:8d:d4:7a:15", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:49.539359 containerd[1779]: 2025-12-16 13:40:49.535 [INFO][4814] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" Namespace="calico-system" Pod="csi-node-driver-wzss6" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-csi--node--driver--wzss6-eth0" Dec 16 13:40:49.564343 containerd[1779]: time="2025-12-16T13:40:49.564288395Z" level=info msg="connecting to shim f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277" address="unix:///run/containerd/s/2fc3c3fb72086344371b6f8928423b0943c729371a8dba45862f409648e13861" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:49.593786 systemd[1]: Started cri-containerd-f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277.scope - libcontainer container f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277. 
Dec 16 13:40:49.620416 containerd[1779]: time="2025-12-16T13:40:49.620377423Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wzss6,Uid:713c1552-24ab-4b60-9872-de4f52adb23b,Namespace:calico-system,Attempt:0,} returns sandbox id \"f3a291bf162f045d49b5a6401f6f7be3bfafda59ffe8cd769e0edc67aa4fa277\"" Dec 16 13:40:49.621622 containerd[1779]: time="2025-12-16T13:40:49.621560900Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:40:49.629619 systemd-networkd[1684]: calic51d4dfe24f: Link UP Dec 16 13:40:49.630894 systemd-networkd[1684]: calic51d4dfe24f: Gained carrier Dec 16 13:40:49.647915 containerd[1779]: 2025-12-16 13:40:49.461 [INFO][4836] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-eth0 calico-apiserver-574757c556- calico-apiserver 59d04f45-d51d-4c79-b4da-d0334682cd90 826 0 2025-12-16 13:40:23 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:574757c556 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-a-7f096d1947 calico-apiserver-574757c556-w6l7d eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic51d4dfe24f [] [] }} ContainerID="970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-w6l7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-" Dec 16 13:40:49.647915 containerd[1779]: 2025-12-16 13:40:49.461 [INFO][4836] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-w6l7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-eth0" Dec 16 13:40:49.647915 containerd[1779]: 2025-12-16 13:40:49.486 [INFO][4866] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" HandleID="k8s-pod-network.970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" Workload="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-eth0" Dec 16 13:40:49.648118 containerd[1779]: 2025-12-16 13:40:49.486 [INFO][4866] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" HandleID="k8s-pod-network.970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" Workload="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001a5650), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-a-7f096d1947", "pod":"calico-apiserver-574757c556-w6l7d", "timestamp":"2025-12-16 13:40:49.486383653 +0000 UTC"}, Hostname:"ci-4459-2-2-a-7f096d1947", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:40:49.648118 containerd[1779]: 2025-12-16 13:40:49.486 [INFO][4866] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Dec 16 13:40:49.648118 containerd[1779]: 2025-12-16 13:40:49.520 [INFO][4866] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 13:40:49.648118 containerd[1779]: 2025-12-16 13:40:49.520 [INFO][4866] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-a-7f096d1947' Dec 16 13:40:49.648118 containerd[1779]: 2025-12-16 13:40:49.594 [INFO][4866] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.648118 containerd[1779]: 2025-12-16 13:40:49.600 [INFO][4866] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.648118 containerd[1779]: 2025-12-16 13:40:49.605 [INFO][4866] ipam/ipam.go 511: Trying affinity for 192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.648118 containerd[1779]: 2025-12-16 13:40:49.608 [INFO][4866] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.648118 containerd[1779]: 2025-12-16 13:40:49.610 [INFO][4866] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.648303 containerd[1779]: 2025-12-16 13:40:49.610 [INFO][4866] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.32.0/26 handle="k8s-pod-network.970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.648303 containerd[1779]: 2025-12-16 13:40:49.612 [INFO][4866] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0 Dec 16 13:40:49.648303 containerd[1779]: 2025-12-16 13:40:49.617 [INFO][4866] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.32.0/26 handle="k8s-pod-network.970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.648303 containerd[1779]: 2025-12-16 13:40:49.625 [INFO][4866] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.32.4/26] block=192.168.32.0/26 handle="k8s-pod-network.970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.648303 containerd[1779]: 2025-12-16 13:40:49.625 [INFO][4866] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.4/26] handle="k8s-pod-network.970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:49.648303 containerd[1779]: 2025-12-16 13:40:49.625 [INFO][4866] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:40:49.648303 containerd[1779]: 2025-12-16 13:40:49.625 [INFO][4866] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.32.4/26] IPv6=[] ContainerID="970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" HandleID="k8s-pod-network.970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" Workload="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-eth0" Dec 16 13:40:49.648436 containerd[1779]: 2025-12-16 13:40:49.627 [INFO][4836] cni-plugin/k8s.go 418: Populated endpoint ContainerID="970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-w6l7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-eth0", GenerateName:"calico-apiserver-574757c556-", Namespace:"calico-apiserver", SelfLink:"", UID:"59d04f45-d51d-4c79-b4da-d0334682cd90", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"574757c556", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"", Pod:"calico-apiserver-574757c556-w6l7d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.32.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic51d4dfe24f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:49.648490 containerd[1779]: 2025-12-16 13:40:49.627 [INFO][4836] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.4/32] ContainerID="970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-w6l7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-eth0" Dec 16 13:40:49.648490 containerd[1779]: 2025-12-16 13:40:49.627 [INFO][4836] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic51d4dfe24f ContainerID="970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-w6l7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-eth0" Dec 16 13:40:49.648490 containerd[1779]: 2025-12-16 13:40:49.630 [INFO][4836] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-w6l7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-eth0" Dec 16 13:40:49.648568 containerd[1779]: 2025-12-16 13:40:49.632 
[INFO][4836] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-w6l7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-eth0", GenerateName:"calico-apiserver-574757c556-", Namespace:"calico-apiserver", SelfLink:"", UID:"59d04f45-d51d-4c79-b4da-d0334682cd90", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"574757c556", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0", Pod:"calico-apiserver-574757c556-w6l7d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.32.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic51d4dfe24f", MAC:"ea:67:09:c4:6b:b3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:49.648627 containerd[1779]: 2025-12-16 13:40:49.646 [INFO][4836] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" Namespace="calico-apiserver" Pod="calico-apiserver-574757c556-w6l7d" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--apiserver--574757c556--w6l7d-eth0" Dec 16 13:40:49.676272 containerd[1779]: time="2025-12-16T13:40:49.676159239Z" level=info msg="connecting to shim 970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0" address="unix:///run/containerd/s/ce6bff6971818ad763cd3c3dfea418a1d0d7073f4a49d463d660399c8308a782" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:49.698200 systemd[1]: Started cri-containerd-970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0.scope - libcontainer container 970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0. 
Dec 16 13:40:49.729063 systemd-networkd[1684]: cali713a9a1a194: Link UP Dec 16 13:40:49.729248 systemd-networkd[1684]: cali713a9a1a194: Gained carrier Dec 16 13:40:49.747205 containerd[1779]: time="2025-12-16T13:40:49.747169439Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-574757c556-w6l7d,Uid:59d04f45-d51d-4c79-b4da-d0334682cd90,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"970831f8a0b38d8be612780104f81f0121c5c11bfd949f85554be0d9da0627a0\"" Dec 16 13:40:49.965294 containerd[1779]: time="2025-12-16T13:40:49.965134049Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:40:49.967298 containerd[1779]: time="2025-12-16T13:40:49.967229923Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:40:49.967398 containerd[1779]: time="2025-12-16T13:40:49.967324324Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:40:49.967490 kubelet[3080]: E1216 13:40:49.967449 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:40:49.967537 kubelet[3080]: E1216 13:40:49.967493 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:40:49.967799 kubelet[3080]: E1216 13:40:49.967711 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2425,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-wzss6_calico-system(713c1552-24ab-4b60-9872-de4f52adb23b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:40:49.967910 containerd[1779]: time="2025-12-16T13:40:49.967852152Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:40:50.300619 containerd[1779]: time="2025-12-16T13:40:50.300485678Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:40:50.302180 containerd[1779]: time="2025-12-16T13:40:50.302127171Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:40:50.302286 containerd[1779]: time="2025-12-16T13:40:50.302171929Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:40:50.302335 kubelet[3080]: E1216 13:40:50.302303 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:40:50.302373 kubelet[3080]: E1216 13:40:50.302349 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:40:50.302650 kubelet[3080]: E1216 13:40:50.302587 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2grc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-574757c556-w6l7d_calico-apiserver(59d04f45-d51d-4c79-b4da-d0334682cd90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:40:50.302789 containerd[1779]: time="2025-12-16T13:40:50.302753411Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:40:50.304096 kubelet[3080]: E1216 13:40:50.304050 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:40:50.394507 containerd[1779]: 2025-12-16 13:40:49.464 [INFO][4820] 
cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-eth0 goldmane-666569f655- calico-system bc92ca59-7a99-4554-9e6f-b60493200978 828 0 2025-12-16 13:40:25 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-a-7f096d1947 goldmane-666569f655-zxbn9 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali713a9a1a194 [] [] }} ContainerID="d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" Namespace="calico-system" Pod="goldmane-666569f655-zxbn9" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-" Dec 16 13:40:50.394507 containerd[1779]: 2025-12-16 13:40:49.464 [INFO][4820] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" Namespace="calico-system" Pod="goldmane-666569f655-zxbn9" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-eth0" Dec 16 13:40:50.394507 containerd[1779]: 2025-12-16 13:40:49.494 [INFO][4876] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" HandleID="k8s-pod-network.d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" Workload="ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-eth0" Dec 16 13:40:50.394721 containerd[1779]: 2025-12-16 13:40:49.494 [INFO][4876] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" HandleID="k8s-pod-network.d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" Workload="ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002df590), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-a-7f096d1947", "pod":"goldmane-666569f655-zxbn9", "timestamp":"2025-12-16 13:40:49.494193449 +0000 UTC"}, Hostname:"ci-4459-2-2-a-7f096d1947", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:40:50.394721 containerd[1779]: 2025-12-16 13:40:49.494 [INFO][4876] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:40:50.394721 containerd[1779]: 2025-12-16 13:40:49.625 [INFO][4876] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:40:50.394721 containerd[1779]: 2025-12-16 13:40:49.625 [INFO][4876] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-a-7f096d1947' Dec 16 13:40:50.394721 containerd[1779]: 2025-12-16 13:40:49.694 [INFO][4876] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.394721 containerd[1779]: 2025-12-16 13:40:49.702 [INFO][4876] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.394721 containerd[1779]: 2025-12-16 13:40:49.707 [INFO][4876] ipam/ipam.go 511: Trying affinity for 192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.394721 containerd[1779]: 2025-12-16 13:40:49.709 [INFO][4876] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.394721 containerd[1779]: 2025-12-16 13:40:49.711 [INFO][4876] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.394914 containerd[1779]: 2025-12-16 13:40:49.711 [INFO][4876] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.32.0/26 handle="k8s-pod-network.d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.394914 containerd[1779]: 2025-12-16 13:40:49.712 [INFO][4876] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec Dec 16 13:40:50.394914 containerd[1779]: 2025-12-16 13:40:49.716 [INFO][4876] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.32.0/26 handle="k8s-pod-network.d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.394914 containerd[1779]: 2025-12-16 13:40:49.725 [INFO][4876] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.32.5/26] block=192.168.32.0/26 handle="k8s-pod-network.d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.394914 containerd[1779]: 2025-12-16 13:40:49.725 [INFO][4876] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.5/26] handle="k8s-pod-network.d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.394914 containerd[1779]: 2025-12-16 13:40:49.725 [INFO][4876] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:40:50.394914 containerd[1779]: 2025-12-16 13:40:49.725 [INFO][4876] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.32.5/26] IPv6=[] ContainerID="d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" HandleID="k8s-pod-network.d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" Workload="ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-eth0" Dec 16 13:40:50.395047 containerd[1779]: 2025-12-16 13:40:49.726 [INFO][4820] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" Namespace="calico-system" Pod="goldmane-666569f655-zxbn9" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"bc92ca59-7a99-4554-9e6f-b60493200978", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"", Pod:"goldmane-666569f655-zxbn9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.32.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali713a9a1a194", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:50.395101 containerd[1779]: 2025-12-16 13:40:49.727 [INFO][4820] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.5/32] ContainerID="d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" Namespace="calico-system" Pod="goldmane-666569f655-zxbn9" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-eth0" Dec 16 13:40:50.395101 containerd[1779]: 2025-12-16 13:40:49.727 [INFO][4820] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali713a9a1a194 ContainerID="d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" Namespace="calico-system" Pod="goldmane-666569f655-zxbn9" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-eth0" Dec 16 13:40:50.395101 containerd[1779]: 2025-12-16 13:40:49.729 [INFO][4820] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" Namespace="calico-system" Pod="goldmane-666569f655-zxbn9" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-eth0" Dec 16 13:40:50.395163 containerd[1779]: 2025-12-16 13:40:49.729 [INFO][4820] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" 
Namespace="calico-system" Pod="goldmane-666569f655-zxbn9" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"bc92ca59-7a99-4554-9e6f-b60493200978", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec", Pod:"goldmane-666569f655-zxbn9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.32.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali713a9a1a194", MAC:"32:c1:4e:94:31:8a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:50.395217 containerd[1779]: 2025-12-16 13:40:50.392 [INFO][4820] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" Namespace="calico-system" Pod="goldmane-666569f655-zxbn9" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-goldmane--666569f655--zxbn9-eth0" Dec 16 13:40:50.414524 containerd[1779]: time="2025-12-16T13:40:50.414218084Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d9cf7b5c4-4vfj5,Uid:4477497f-a609-4791-b674-25df11e8ec73,Namespace:calico-system,Attempt:0,}" Dec 16 13:40:50.414524 containerd[1779]: time="2025-12-16T13:40:50.414223206Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qpg2m,Uid:22edb7e4-dff8-4395-80d0-f36295b5be55,Namespace:kube-system,Attempt:0,}" Dec 16 13:40:50.414524 containerd[1779]: time="2025-12-16T13:40:50.414327211Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hpqzp,Uid:94bfd095-3b1b-4404-872c-700cc2c230a9,Namespace:kube-system,Attempt:0,}" Dec 16 13:40:50.417931 containerd[1779]: time="2025-12-16T13:40:50.417894073Z" level=info msg="connecting to shim d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec" address="unix:///run/containerd/s/75cad38fa8e0bff35ac729756c84a54fb1469320814dd3c2bbc0ae70f29445b7" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:50.443784 systemd[1]: Started cri-containerd-d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec.scope - libcontainer container d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec. 
Dec 16 13:40:50.495721 containerd[1779]: time="2025-12-16T13:40:50.495687584Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-zxbn9,Uid:bc92ca59-7a99-4554-9e6f-b60493200978,Namespace:calico-system,Attempt:0,} returns sandbox id \"d593178e01f9d09dea0493ac4a227101fc53e7f3c1335d185e1268c9d72971ec\"" Dec 16 13:40:50.514827 kubelet[3080]: E1216 13:40:50.514794 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:40:50.527656 systemd-networkd[1684]: calic8ec1e199e6: Link UP Dec 16 13:40:50.528357 systemd-networkd[1684]: calic8ec1e199e6: Gained carrier Dec 16 13:40:50.542507 containerd[1779]: 2025-12-16 13:40:50.458 [INFO][5054] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-eth0 calico-kube-controllers-5d9cf7b5c4- calico-system 4477497f-a609-4791-b674-25df11e8ec73 823 0 2025-12-16 13:40:28 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5d9cf7b5c4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-a-7f096d1947 calico-kube-controllers-5d9cf7b5c4-4vfj5 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic8ec1e199e6 [] [] }} ContainerID="35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" Namespace="calico-system" Pod="calico-kube-controllers-5d9cf7b5c4-4vfj5" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-" Dec 16 13:40:50.542507 containerd[1779]: 2025-12-16 13:40:50.459 [INFO][5054] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" Namespace="calico-system" Pod="calico-kube-controllers-5d9cf7b5c4-4vfj5" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-eth0" Dec 16 13:40:50.542507 containerd[1779]: 2025-12-16 13:40:50.483 [INFO][5119] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" HandleID="k8s-pod-network.35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" Workload="ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-eth0" Dec 16 13:40:50.542734 containerd[1779]: 2025-12-16 13:40:50.483 [INFO][5119] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" HandleID="k8s-pod-network.35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" Workload="ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0006821c0), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4459-2-2-a-7f096d1947", "pod":"calico-kube-controllers-5d9cf7b5c4-4vfj5", "timestamp":"2025-12-16 13:40:50.483729256 +0000 UTC"}, Hostname:"ci-4459-2-2-a-7f096d1947", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:40:50.542734 containerd[1779]: 2025-12-16 13:40:50.483 [INFO][5119] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:40:50.542734 containerd[1779]: 2025-12-16 13:40:50.483 [INFO][5119] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 13:40:50.542734 containerd[1779]: 2025-12-16 13:40:50.484 [INFO][5119] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-a-7f096d1947' Dec 16 13:40:50.542734 containerd[1779]: 2025-12-16 13:40:50.490 [INFO][5119] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.542734 containerd[1779]: 2025-12-16 13:40:50.498 [INFO][5119] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.542734 containerd[1779]: 2025-12-16 13:40:50.504 [INFO][5119] ipam/ipam.go 511: Trying affinity for 192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.542734 containerd[1779]: 2025-12-16 13:40:50.505 [INFO][5119] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.542734 containerd[1779]: 2025-12-16 13:40:50.507 [INFO][5119] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.542922 containerd[1779]: 2025-12-16 13:40:50.507 [INFO][5119] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.32.0/26 handle="k8s-pod-network.35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.542922 containerd[1779]: 2025-12-16 13:40:50.509 [INFO][5119] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88 Dec 16 13:40:50.542922 containerd[1779]: 2025-12-16 13:40:50.514 [INFO][5119] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.32.0/26 handle="k8s-pod-network.35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.542922 containerd[1779]: 2025-12-16 13:40:50.522 [INFO][5119] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.32.6/26] block=192.168.32.0/26 handle="k8s-pod-network.35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.542922 containerd[1779]: 2025-12-16 13:40:50.522 [INFO][5119] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.6/26] handle="k8s-pod-network.35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.542922 containerd[1779]: 2025-12-16 13:40:50.522 [INFO][5119] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:40:50.542922 containerd[1779]: 2025-12-16 13:40:50.522 [INFO][5119] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.32.6/26] IPv6=[] ContainerID="35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" HandleID="k8s-pod-network.35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" Workload="ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-eth0" Dec 16 13:40:50.543057 containerd[1779]: 2025-12-16 13:40:50.524 [INFO][5054] cni-plugin/k8s.go 418: Populated endpoint ContainerID="35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" Namespace="calico-system" Pod="calico-kube-controllers-5d9cf7b5c4-4vfj5" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-eth0", GenerateName:"calico-kube-controllers-5d9cf7b5c4-", Namespace:"calico-system", SelfLink:"", UID:"4477497f-a609-4791-b674-25df11e8ec73", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5d9cf7b5c4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"", Pod:"calico-kube-controllers-5d9cf7b5c4-4vfj5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.32.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic8ec1e199e6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:50.543111 containerd[1779]: 2025-12-16 13:40:50.524 [INFO][5054] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.6/32] ContainerID="35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" Namespace="calico-system" Pod="calico-kube-controllers-5d9cf7b5c4-4vfj5" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-eth0" Dec 16 13:40:50.543111 containerd[1779]: 2025-12-16 13:40:50.524 [INFO][5054] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic8ec1e199e6 ContainerID="35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" Namespace="calico-system" Pod="calico-kube-controllers-5d9cf7b5c4-4vfj5" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-eth0" Dec 16 13:40:50.543111 containerd[1779]: 2025-12-16 13:40:50.529 [INFO][5054] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" Namespace="calico-system" Pod="calico-kube-controllers-5d9cf7b5c4-4vfj5" 
WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-eth0" Dec 16 13:40:50.543174 containerd[1779]: 2025-12-16 13:40:50.530 [INFO][5054] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" Namespace="calico-system" Pod="calico-kube-controllers-5d9cf7b5c4-4vfj5" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-eth0", GenerateName:"calico-kube-controllers-5d9cf7b5c4-", Namespace:"calico-system", SelfLink:"", UID:"4477497f-a609-4791-b674-25df11e8ec73", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5d9cf7b5c4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88", Pod:"calico-kube-controllers-5d9cf7b5c4-4vfj5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.32.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic8ec1e199e6", MAC:"4e:b8:7e:e1:47:e8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:50.543222 containerd[1779]: 2025-12-16 13:40:50.540 [INFO][5054] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" Namespace="calico-system" Pod="calico-kube-controllers-5d9cf7b5c4-4vfj5" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-calico--kube--controllers--5d9cf7b5c4--4vfj5-eth0" Dec 16 13:40:50.568697 containerd[1779]: time="2025-12-16T13:40:50.568594409Z" level=info msg="connecting to shim 35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88" address="unix:///run/containerd/s/41acaa9a001158a3fd8f41e0f9233cc9f0ec4b9834570340d1792e6420151967" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:50.594739 systemd[1]: Started cri-containerd-35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88.scope - libcontainer container 35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88. 
Dec 16 13:40:50.624206 systemd-networkd[1684]: cali486c52c7bb6: Link UP Dec 16 13:40:50.626071 systemd-networkd[1684]: cali486c52c7bb6: Gained carrier Dec 16 13:40:50.632013 containerd[1779]: time="2025-12-16T13:40:50.631979428Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:40:50.634055 containerd[1779]: time="2025-12-16T13:40:50.633984891Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:40:50.634055 containerd[1779]: time="2025-12-16T13:40:50.634025661Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:40:50.634208 kubelet[3080]: E1216 13:40:50.634178 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:40:50.634249 kubelet[3080]: E1216 13:40:50.634220 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:40:50.634444 kubelet[3080]: E1216 13:40:50.634405 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2425,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-wzss6_calico-system(713c1552-24ab-4b60-9872-de4f52adb23b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:40:50.634696 containerd[1779]: time="2025-12-16T13:40:50.634677822Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:40:50.635700 kubelet[3080]: E1216 13:40:50.635664 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:40:50.639148 containerd[1779]: 2025-12-16 13:40:50.461 [INFO][5073] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-eth0 coredns-674b8bbfcf- kube-system 22edb7e4-dff8-4395-80d0-f36295b5be55 827 0 2025-12-16 13:40:13 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf 
projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-a-7f096d1947 coredns-674b8bbfcf-qpg2m eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali486c52c7bb6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" Namespace="kube-system" Pod="coredns-674b8bbfcf-qpg2m" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-" Dec 16 13:40:50.639148 containerd[1779]: 2025-12-16 13:40:50.462 [INFO][5073] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" Namespace="kube-system" Pod="coredns-674b8bbfcf-qpg2m" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-eth0" Dec 16 13:40:50.639148 containerd[1779]: 2025-12-16 13:40:50.491 [INFO][5126] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" HandleID="k8s-pod-network.69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" Workload="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-eth0" Dec 16 13:40:50.639309 containerd[1779]: 2025-12-16 13:40:50.492 [INFO][5126] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" HandleID="k8s-pod-network.69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" Workload="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e490), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-a-7f096d1947", "pod":"coredns-674b8bbfcf-qpg2m", "timestamp":"2025-12-16 13:40:50.491598146 +0000 UTC"}, Hostname:"ci-4459-2-2-a-7f096d1947", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:40:50.639309 containerd[1779]: 2025-12-16 13:40:50.492 [INFO][5126] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:40:50.639309 containerd[1779]: 2025-12-16 13:40:50.522 [INFO][5126] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:40:50.639309 containerd[1779]: 2025-12-16 13:40:50.522 [INFO][5126] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-a-7f096d1947' Dec 16 13:40:50.639309 containerd[1779]: 2025-12-16 13:40:50.592 [INFO][5126] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.639309 containerd[1779]: 2025-12-16 13:40:50.600 [INFO][5126] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.639309 containerd[1779]: 2025-12-16 13:40:50.603 [INFO][5126] ipam/ipam.go 511: Trying affinity for 192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.639309 containerd[1779]: 2025-12-16 13:40:50.605 [INFO][5126] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.639309 containerd[1779]: 2025-12-16 13:40:50.607 [INFO][5126] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.639490 containerd[1779]: 2025-12-16 13:40:50.607 [INFO][5126] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.32.0/26 handle="k8s-pod-network.69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.639490 containerd[1779]: 2025-12-16 13:40:50.608 [INFO][5126] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba Dec 16 13:40:50.639490 containerd[1779]: 2025-12-16 13:40:50.612 [INFO][5126] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.32.0/26 handle="k8s-pod-network.69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.639490 containerd[1779]: 2025-12-16 13:40:50.619 [INFO][5126] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.32.7/26] block=192.168.32.0/26 handle="k8s-pod-network.69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.639490 containerd[1779]: 2025-12-16 13:40:50.619 [INFO][5126] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.7/26] handle="k8s-pod-network.69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.639490 containerd[1779]: 2025-12-16 13:40:50.619 [INFO][5126] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
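[Editor's note] The ErrImagePull entries above all fail the same way: the tag ghcr.io/flatcar/calico/*:v3.30.4 does not exist in the registry, so reference resolution 404s before any layer bytes are fetched ("fetch failed after status: 404 Not Found", bytes read=93). A minimal sketch reproducing the resolution step with the same containerd client — the same Pull path the CRI uses, minus the CRI plumbing:

// Sketch: attempt the pull that kubelet is retrying above. A missing tag
// fails at reference resolution with NotFound, which kubelet wraps into
// ErrImagePull and, after repeated failures, ImagePullBackOff.
package main

import (
	"context"
	"fmt"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	_, err = client.Pull(ctx, "ghcr.io/flatcar/calico/goldmane:v3.30.4",
		containerd.WithPullUnpack)
	if err != nil {
		// Expected here: "failed to resolve reference ...: not found".
		fmt.Println("pull failed:", err)
	}
}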
Dec 16 13:40:50.639490 containerd[1779]: 2025-12-16 13:40:50.619 [INFO][5126] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.32.7/26] IPv6=[] ContainerID="69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" HandleID="k8s-pod-network.69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" Workload="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-eth0" Dec 16 13:40:50.640060 containerd[1779]: 2025-12-16 13:40:50.622 [INFO][5073] cni-plugin/k8s.go 418: Populated endpoint ContainerID="69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" Namespace="kube-system" Pod="coredns-674b8bbfcf-qpg2m" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"22edb7e4-dff8-4395-80d0-f36295b5be55", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"", Pod:"coredns-674b8bbfcf-qpg2m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.32.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali486c52c7bb6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:50.640060 containerd[1779]: 2025-12-16 13:40:50.622 [INFO][5073] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.7/32] ContainerID="69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" Namespace="kube-system" Pod="coredns-674b8bbfcf-qpg2m" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-eth0" Dec 16 13:40:50.640060 containerd[1779]: 2025-12-16 13:40:50.623 [INFO][5073] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali486c52c7bb6 ContainerID="69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" Namespace="kube-system" Pod="coredns-674b8bbfcf-qpg2m" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-eth0" Dec 16 13:40:50.640060 containerd[1779]: 2025-12-16 13:40:50.625 [INFO][5073] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-qpg2m" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-eth0" Dec 16 13:40:50.640060 containerd[1779]: 2025-12-16 13:40:50.626 [INFO][5073] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" Namespace="kube-system" Pod="coredns-674b8bbfcf-qpg2m" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"22edb7e4-dff8-4395-80d0-f36295b5be55", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba", Pod:"coredns-674b8bbfcf-qpg2m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.32.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali486c52c7bb6", MAC:"d2:4e:5c:30:97:58", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:50.640060 containerd[1779]: 2025-12-16 13:40:50.636 [INFO][5073] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" Namespace="kube-system" Pod="coredns-674b8bbfcf-qpg2m" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--qpg2m-eth0" Dec 16 13:40:50.648446 containerd[1779]: time="2025-12-16T13:40:50.648413944Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d9cf7b5c4-4vfj5,Uid:4477497f-a609-4791-b674-25df11e8ec73,Namespace:calico-system,Attempt:0,} returns sandbox id \"35c716965e7e0fe823388da1f3c4eeef7edf65a19ec797d1433df20dd148fe88\"" Dec 16 13:40:50.669906 containerd[1779]: time="2025-12-16T13:40:50.669863095Z" level=info msg="connecting to shim 69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba" address="unix:///run/containerd/s/9d35241b3506d48142a1b8d35a1fc42faa4d0a5bb2d1687ecec1eccc4aae3321" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:50.690719 systemd-networkd[1684]: calif6d16bb442d: Gained IPv6LL Dec 16 13:40:50.694781 systemd[1]: Started 
cri-containerd-69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba.scope - libcontainer container 69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba. Dec 16 13:40:50.727607 systemd-networkd[1684]: cali515928b8264: Link UP Dec 16 13:40:50.729922 systemd-networkd[1684]: cali515928b8264: Gained carrier Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.464 [INFO][5085] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-eth0 coredns-674b8bbfcf- kube-system 94bfd095-3b1b-4404-872c-700cc2c230a9 822 0 2025-12-16 13:40:13 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-a-7f096d1947 coredns-674b8bbfcf-hpqzp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali515928b8264 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-hpqzp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-" Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.464 [INFO][5085] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-hpqzp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-eth0" Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.493 [INFO][5127] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" HandleID="k8s-pod-network.e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" Workload="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-eth0" Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.493 [INFO][5127] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" HandleID="k8s-pod-network.e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" Workload="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000139e60), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-a-7f096d1947", "pod":"coredns-674b8bbfcf-hpqzp", "timestamp":"2025-12-16 13:40:50.4934886 +0000 UTC"}, Hostname:"ci-4459-2-2-a-7f096d1947", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.493 [INFO][5127] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.619 [INFO][5127] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.619 [INFO][5127] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-a-7f096d1947' Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.694 [INFO][5127] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.701 [INFO][5127] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.705 [INFO][5127] ipam/ipam.go 511: Trying affinity for 192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.706 [INFO][5127] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.708 [INFO][5127] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.0/26 host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.708 [INFO][5127] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.32.0/26 handle="k8s-pod-network.e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.709 [INFO][5127] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.714 [INFO][5127] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.32.0/26 handle="k8s-pod-network.e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.722 [INFO][5127] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.32.8/26] block=192.168.32.0/26 handle="k8s-pod-network.e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.722 [INFO][5127] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.8/26] handle="k8s-pod-network.e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" host="ci-4459-2-2-a-7f096d1947" Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.722 [INFO][5127] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
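[Editor's note] The four sandboxes drew sequential addresses (.5, .6, .7, .8) from the node's single affine block 192.168.32.0/26, which spans 192.168.32.0 through 192.168.32.63. A quick sanity check of that containment using Go's standard net/netip package:

// Sketch: verify the IPs claimed above fall inside the node's affine
// IPAM block, 192.168.32.0/26 (64 addresses: .0 through .63).
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.32.0/26")
	for _, s := range []string{"192.168.32.5", "192.168.32.6", "192.168.32.7", "192.168.32.8"} {
		fmt.Println(s, "in", block, "=", block.Contains(netip.MustParseAddr(s)))
	}
}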
Dec 16 13:40:50.744235 containerd[1779]: 2025-12-16 13:40:50.722 [INFO][5127] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.32.8/26] IPv6=[] ContainerID="e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" HandleID="k8s-pod-network.e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" Workload="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-eth0" Dec 16 13:40:50.744783 containerd[1779]: 2025-12-16 13:40:50.724 [INFO][5085] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-hpqzp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"94bfd095-3b1b-4404-872c-700cc2c230a9", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"", Pod:"coredns-674b8bbfcf-hpqzp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.32.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali515928b8264", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:50.744783 containerd[1779]: 2025-12-16 13:40:50.725 [INFO][5085] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.8/32] ContainerID="e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-hpqzp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-eth0" Dec 16 13:40:50.744783 containerd[1779]: 2025-12-16 13:40:50.725 [INFO][5085] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali515928b8264 ContainerID="e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-hpqzp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-eth0" Dec 16 13:40:50.744783 containerd[1779]: 2025-12-16 13:40:50.730 [INFO][5085] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-hpqzp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-eth0" Dec 16 13:40:50.744783 containerd[1779]: 2025-12-16 13:40:50.730 [INFO][5085] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-hpqzp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"94bfd095-3b1b-4404-872c-700cc2c230a9", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 40, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-a-7f096d1947", ContainerID:"e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d", Pod:"coredns-674b8bbfcf-hpqzp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.32.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali515928b8264", MAC:"f2:9a:99:4f:bb:41", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:40:50.744783 containerd[1779]: 2025-12-16 13:40:50.741 [INFO][5085] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" Namespace="kube-system" Pod="coredns-674b8bbfcf-hpqzp" WorkloadEndpoint="ci--4459--2--2--a--7f096d1947-k8s-coredns--674b8bbfcf--hpqzp-eth0" Dec 16 13:40:50.744783 containerd[1779]: time="2025-12-16T13:40:50.744645020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qpg2m,Uid:22edb7e4-dff8-4395-80d0-f36295b5be55,Namespace:kube-system,Attempt:0,} returns sandbox id \"69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba\"" Dec 16 13:40:50.755409 containerd[1779]: time="2025-12-16T13:40:50.755325064Z" level=info msg="CreateContainer within sandbox \"69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 13:40:50.770922 containerd[1779]: time="2025-12-16T13:40:50.770866415Z" level=info msg="Container ef372280e9acdb00475994ed8ef7cb8d76a2755f98f5b46a558f42e03cb392f3: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:40:50.782881 
containerd[1779]: time="2025-12-16T13:40:50.782838138Z" level=info msg="CreateContainer within sandbox \"69a95e8a119bd90453ebcb4657c619ebb4f19dd9e739824de76cb8b0dd782fba\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ef372280e9acdb00475994ed8ef7cb8d76a2755f98f5b46a558f42e03cb392f3\"" Dec 16 13:40:50.783771 containerd[1779]: time="2025-12-16T13:40:50.783749278Z" level=info msg="StartContainer for \"ef372280e9acdb00475994ed8ef7cb8d76a2755f98f5b46a558f42e03cb392f3\"" Dec 16 13:40:50.784062 containerd[1779]: time="2025-12-16T13:40:50.784000183Z" level=info msg="connecting to shim e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d" address="unix:///run/containerd/s/ebe333d0249a0f87ddb7b4bf46a1d98074cc40e7a8a910376766f8cdf20f07a7" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:40:50.784446 containerd[1779]: time="2025-12-16T13:40:50.784426537Z" level=info msg="connecting to shim ef372280e9acdb00475994ed8ef7cb8d76a2755f98f5b46a558f42e03cb392f3" address="unix:///run/containerd/s/9d35241b3506d48142a1b8d35a1fc42faa4d0a5bb2d1687ecec1eccc4aae3321" protocol=ttrpc version=3 Dec 16 13:40:50.803789 systemd[1]: Started cri-containerd-ef372280e9acdb00475994ed8ef7cb8d76a2755f98f5b46a558f42e03cb392f3.scope - libcontainer container ef372280e9acdb00475994ed8ef7cb8d76a2755f98f5b46a558f42e03cb392f3. Dec 16 13:40:50.806649 systemd[1]: Started cri-containerd-e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d.scope - libcontainer container e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d. Dec 16 13:40:50.830558 containerd[1779]: time="2025-12-16T13:40:50.830465022Z" level=info msg="StartContainer for \"ef372280e9acdb00475994ed8ef7cb8d76a2755f98f5b46a558f42e03cb392f3\" returns successfully" Dec 16 13:40:50.851017 containerd[1779]: time="2025-12-16T13:40:50.850960281Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hpqzp,Uid:94bfd095-3b1b-4404-872c-700cc2c230a9,Namespace:kube-system,Attempt:0,} returns sandbox id \"e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d\"" Dec 16 13:40:50.857014 containerd[1779]: time="2025-12-16T13:40:50.856955036Z" level=info msg="CreateContainer within sandbox \"e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 13:40:50.869028 containerd[1779]: time="2025-12-16T13:40:50.868506296Z" level=info msg="Container d0e0183d335774319dbf25a377be70d7258072c5e6a82231188aeb6c088cd0a2: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:40:50.875465 containerd[1779]: time="2025-12-16T13:40:50.875431369Z" level=info msg="CreateContainer within sandbox \"e76a65ee8cfdc76533245956bb6480bc0c4ab37fbba8eeaf8cd57d00a04c7e2d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d0e0183d335774319dbf25a377be70d7258072c5e6a82231188aeb6c088cd0a2\"" Dec 16 13:40:50.876076 containerd[1779]: time="2025-12-16T13:40:50.876050408Z" level=info msg="StartContainer for \"d0e0183d335774319dbf25a377be70d7258072c5e6a82231188aeb6c088cd0a2\"" Dec 16 13:40:50.876821 containerd[1779]: time="2025-12-16T13:40:50.876800509Z" level=info msg="connecting to shim d0e0183d335774319dbf25a377be70d7258072c5e6a82231188aeb6c088cd0a2" address="unix:///run/containerd/s/ebe333d0249a0f87ddb7b4bf46a1d98074cc40e7a8a910376766f8cdf20f07a7" protocol=ttrpc version=3 Dec 16 13:40:50.901788 systemd[1]: Started cri-containerd-d0e0183d335774319dbf25a377be70d7258072c5e6a82231188aeb6c088cd0a2.scope - libcontainer container 
d0e0183d335774319dbf25a377be70d7258072c5e6a82231188aeb6c088cd0a2. Dec 16 13:40:50.931292 containerd[1779]: time="2025-12-16T13:40:50.931242343Z" level=info msg="StartContainer for \"d0e0183d335774319dbf25a377be70d7258072c5e6a82231188aeb6c088cd0a2\" returns successfully" Dec 16 13:40:50.967378 containerd[1779]: time="2025-12-16T13:40:50.967312605Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:40:50.969615 containerd[1779]: time="2025-12-16T13:40:50.969470067Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:40:50.970179 containerd[1779]: time="2025-12-16T13:40:50.969674841Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:40:50.970220 kubelet[3080]: E1216 13:40:50.969809 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:40:50.970220 kubelet[3080]: E1216 13:40:50.969853 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:40:50.970296 containerd[1779]: time="2025-12-16T13:40:50.970233408Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:40:50.970671 kubelet[3080]: E1216 13:40:50.970598 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cbfcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-zxbn9_calico-system(bc92ca59-7a99-4554-9e6f-b60493200978): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:40:50.971902 kubelet[3080]: E1216 13:40:50.971849 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:40:51.202861 systemd-networkd[1684]: 
cali713a9a1a194: Gained IPv6LL Dec 16 13:40:51.302497 containerd[1779]: time="2025-12-16T13:40:51.302415118Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:40:51.304451 containerd[1779]: time="2025-12-16T13:40:51.304397115Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:40:51.304508 containerd[1779]: time="2025-12-16T13:40:51.304439883Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:40:51.304707 kubelet[3080]: E1216 13:40:51.304662 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:40:51.304770 kubelet[3080]: E1216 13:40:51.304719 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:40:51.304925 kubelet[3080]: E1216 13:40:51.304876 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sdnqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d9cf7b5c4-4vfj5_calico-system(4477497f-a609-4791-b674-25df11e8ec73): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:40:51.306097 kubelet[3080]: E1216 13:40:51.306047 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:40:51.520687 kubelet[3080]: E1216 13:40:51.520587 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:40:51.521752 kubelet[3080]: E1216 13:40:51.520784 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:40:51.521752 kubelet[3080]: E1216 13:40:51.521158 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:40:51.523909 kubelet[3080]: E1216 13:40:51.523819 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:40:51.530566 kubelet[3080]: I1216 13:40:51.530351 3080 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-hpqzp" podStartSLOduration=38.530335243 podStartE2EDuration="38.530335243s" podCreationTimestamp="2025-12-16 13:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:40:51.5286939 +0000 UTC m=+45.195940979" watchObservedRunningTime="2025-12-16 13:40:51.530335243 +0000 UTC m=+45.197582314" Dec 16 13:40:51.571791 kubelet[3080]: I1216 13:40:51.571737 3080 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-qpg2m" podStartSLOduration=38.571722025 podStartE2EDuration="38.571722025s" podCreationTimestamp="2025-12-16 13:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:40:51.571671569 +0000 UTC m=+45.238918645" watchObservedRunningTime="2025-12-16 13:40:51.571722025 +0000 UTC m=+45.238969096" Dec 16 13:40:51.588084 systemd-networkd[1684]: calic51d4dfe24f: Gained IPv6LL Dec 16 13:40:51.906771 systemd-networkd[1684]: cali515928b8264: Gained IPv6LL Dec 16 13:40:52.226882 systemd-networkd[1684]: cali486c52c7bb6: Gained IPv6LL Dec 16 13:40:52.290720 systemd-networkd[1684]: calic8ec1e199e6: Gained IPv6LL Dec 16 13:40:52.525170 kubelet[3080]: E1216 13:40:52.525012 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:40:58.415980 containerd[1779]: time="2025-12-16T13:40:58.415938605Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 
13:40:58.755457 containerd[1779]: time="2025-12-16T13:40:58.755280722Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:40:58.757599 containerd[1779]: time="2025-12-16T13:40:58.757521850Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:40:58.757703 containerd[1779]: time="2025-12-16T13:40:58.757604704Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:40:58.757905 kubelet[3080]: E1216 13:40:58.757850 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:40:58.758194 kubelet[3080]: E1216 13:40:58.757913 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:40:58.758275 kubelet[3080]: E1216 13:40:58.758213 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:f859dfb58e5e4f5cbc43012fb92832ac,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vn2wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7bdfb97df8-fdm7d_calico-system(938ae2d1-4432-47b7-a9d3-a5acd90ddc02): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:40:58.760623 containerd[1779]: time="2025-12-16T13:40:58.760157365Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:40:59.107435 containerd[1779]: time="2025-12-16T13:40:59.107214751Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:40:59.109751 containerd[1779]: time="2025-12-16T13:40:59.109714660Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:40:59.110158 containerd[1779]: time="2025-12-16T13:40:59.109783482Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:40:59.110246 kubelet[3080]: E1216 13:40:59.109927 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:40:59.110246 kubelet[3080]: E1216 13:40:59.109977 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:40:59.110246 kubelet[3080]: E1216 13:40:59.110101 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn2wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7bdfb97df8-fdm7d_calico-system(938ae2d1-4432-47b7-a9d3-a5acd90ddc02): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:40:59.111355 kubelet[3080]: E1216 13:40:59.111314 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:41:03.415247 containerd[1779]: time="2025-12-16T13:41:03.415175167Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:41:03.756579 containerd[1779]: time="2025-12-16T13:41:03.756421500Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:41:03.758085 containerd[1779]: time="2025-12-16T13:41:03.758025981Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:41:03.758178 containerd[1779]: time="2025-12-16T13:41:03.758111804Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:41:03.758342 kubelet[3080]: E1216 13:41:03.758294 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:41:03.758665 kubelet[3080]: E1216 13:41:03.758347 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:41:03.758665 kubelet[3080]: E1216 13:41:03.758593 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2grc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-574757c556-w6l7d_calico-apiserver(59d04f45-d51d-4c79-b4da-d0334682cd90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:03.758775 containerd[1779]: time="2025-12-16T13:41:03.758608955Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:41:03.759983 kubelet[3080]: E1216 13:41:03.759947 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:41:04.109695 containerd[1779]: time="2025-12-16T13:41:04.109506592Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:41:04.111243 containerd[1779]: time="2025-12-16T13:41:04.111176625Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:41:04.111327 containerd[1779]: time="2025-12-16T13:41:04.111253862Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:41:04.111449 kubelet[3080]: E1216 13:41:04.111408 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:41:04.111497 kubelet[3080]: E1216 13:41:04.111454 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:41:04.111666 kubelet[3080]: E1216 13:41:04.111597 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-88k89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-574757c556-b26rp_calico-apiserver(ae6d0914-be51-4e9e-9abc-d81494f00693): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:04.112828 kubelet[3080]: E1216 13:41:04.112781 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:41:04.415207 containerd[1779]: time="2025-12-16T13:41:04.415170076Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:41:04.750653 containerd[1779]: time="2025-12-16T13:41:04.750520647Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:41:04.752300 containerd[1779]: time="2025-12-16T13:41:04.752255176Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:41:04.752364 
containerd[1779]: time="2025-12-16T13:41:04.752298409Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:41:04.752467 kubelet[3080]: E1216 13:41:04.752433 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:41:04.752507 kubelet[3080]: E1216 13:41:04.752480 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:41:04.752657 kubelet[3080]: E1216 13:41:04.752618 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sdnqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d9cf7b5c4-4vfj5_calico-system(4477497f-a609-4791-b674-25df11e8ec73): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:04.753865 kubelet[3080]: E1216 13:41:04.753813 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:41:05.414684 containerd[1779]: time="2025-12-16T13:41:05.414643699Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:41:05.744207 containerd[1779]: time="2025-12-16T13:41:05.744035652Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:41:05.745908 containerd[1779]: time="2025-12-16T13:41:05.745833888Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:41:05.745908 containerd[1779]: time="2025-12-16T13:41:05.745877527Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:41:05.746282 kubelet[3080]: E1216 13:41:05.746212 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:41:05.746674 kubelet[3080]: E1216 13:41:05.746657 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:41:05.746864 kubelet[3080]: E1216 13:41:05.746821 3080 kuberuntime_manager.go:1358] "Unhandled Error" 
err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2425,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-wzss6_calico-system(713c1552-24ab-4b60-9872-de4f52adb23b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:05.748975 containerd[1779]: time="2025-12-16T13:41:05.748765445Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:41:06.093199 containerd[1779]: time="2025-12-16T13:41:06.093030518Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:41:06.094839 containerd[1779]: time="2025-12-16T13:41:06.094788715Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:41:06.094963 containerd[1779]: time="2025-12-16T13:41:06.094860997Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:41:06.095100 kubelet[3080]: E1216 13:41:06.095052 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:41:06.095149 kubelet[3080]: E1216 13:41:06.095114 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:41:06.095299 kubelet[3080]: E1216 13:41:06.095258 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2425,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-wzss6_calico-system(713c1552-24ab-4b60-9872-de4f52adb23b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:06.096526 kubelet[3080]: E1216 13:41:06.096465 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve 
reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:41:06.415221 containerd[1779]: time="2025-12-16T13:41:06.415166761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:41:06.764101 containerd[1779]: time="2025-12-16T13:41:06.763970146Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:41:06.765724 containerd[1779]: time="2025-12-16T13:41:06.765628605Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:41:06.765724 containerd[1779]: time="2025-12-16T13:41:06.765666427Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:41:06.765870 kubelet[3080]: E1216 13:41:06.765817 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:41:06.766109 kubelet[3080]: E1216 13:41:06.765869 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:41:06.766109 kubelet[3080]: E1216 13:41:06.766009 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cbfcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-zxbn9_calico-system(bc92ca59-7a99-4554-9e6f-b60493200978): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:06.767318 kubelet[3080]: E1216 13:41:06.767278 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:41:13.416166 kubelet[3080]: E1216 
13:41:13.416106 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:41:15.415051 kubelet[3080]: E1216 13:41:15.414986 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:41:18.415262 kubelet[3080]: E1216 13:41:18.415193 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:41:19.414964 kubelet[3080]: E1216 13:41:19.414909 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:41:20.414673 kubelet[3080]: E1216 13:41:20.414573 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" 
podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:41:21.414953 kubelet[3080]: E1216 13:41:21.414855 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:41:24.417927 containerd[1779]: time="2025-12-16T13:41:24.417841328Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:41:24.773429 containerd[1779]: time="2025-12-16T13:41:24.773301449Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:41:24.775544 containerd[1779]: time="2025-12-16T13:41:24.775477024Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:41:24.775799 containerd[1779]: time="2025-12-16T13:41:24.775610065Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:41:24.775843 kubelet[3080]: E1216 13:41:24.775772 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:41:24.776094 kubelet[3080]: E1216 13:41:24.775841 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:41:24.776094 kubelet[3080]: E1216 13:41:24.775987 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:f859dfb58e5e4f5cbc43012fb92832ac,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vn2wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7bdfb97df8-fdm7d_calico-system(938ae2d1-4432-47b7-a9d3-a5acd90ddc02): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:24.778111 containerd[1779]: time="2025-12-16T13:41:24.778069805Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:41:25.120706 containerd[1779]: time="2025-12-16T13:41:25.120651091Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:41:25.122922 containerd[1779]: time="2025-12-16T13:41:25.122730160Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:41:25.122922 containerd[1779]: time="2025-12-16T13:41:25.122816297Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:41:25.123063 kubelet[3080]: E1216 13:41:25.122999 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:41:25.123108 kubelet[3080]: E1216 13:41:25.123080 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:41:25.123288 kubelet[3080]: E1216 13:41:25.123250 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn2wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7bdfb97df8-fdm7d_calico-system(938ae2d1-4432-47b7-a9d3-a5acd90ddc02): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:25.124500 kubelet[3080]: E1216 13:41:25.124451 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:41:28.416463 containerd[1779]: time="2025-12-16T13:41:28.416232889Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:41:28.763547 containerd[1779]: time="2025-12-16T13:41:28.763238353Z" level=info msg="fetch failed after status: 404 Not Found" 
host=ghcr.io Dec 16 13:41:28.765171 containerd[1779]: time="2025-12-16T13:41:28.765098680Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:41:28.765171 containerd[1779]: time="2025-12-16T13:41:28.765143502Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:41:28.765381 kubelet[3080]: E1216 13:41:28.765316 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:41:28.765381 kubelet[3080]: E1216 13:41:28.765360 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:41:28.765743 kubelet[3080]: E1216 13:41:28.765524 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sdnqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d9cf7b5c4-4vfj5_calico-system(4477497f-a609-4791-b674-25df11e8ec73): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:28.767004 kubelet[3080]: E1216 13:41:28.766744 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:41:30.416677 containerd[1779]: time="2025-12-16T13:41:30.416645728Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:41:30.771831 containerd[1779]: time="2025-12-16T13:41:30.771690465Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:41:30.773643 containerd[1779]: time="2025-12-16T13:41:30.773573440Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:41:30.773643 containerd[1779]: time="2025-12-16T13:41:30.773649582Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:41:30.773839 kubelet[3080]: E1216 13:41:30.773784 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:41:30.774153 kubelet[3080]: E1216 13:41:30.773839 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:41:30.774153 kubelet[3080]: 
E1216 13:41:30.773966 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2grc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-574757c556-w6l7d_calico-apiserver(59d04f45-d51d-4c79-b4da-d0334682cd90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:30.775177 kubelet[3080]: E1216 13:41:30.775136 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:41:32.415836 containerd[1779]: time="2025-12-16T13:41:32.415773212Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:41:32.785692 containerd[1779]: time="2025-12-16T13:41:32.785580679Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:41:32.787227 containerd[1779]: time="2025-12-16T13:41:32.787167436Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:41:32.787326 containerd[1779]: time="2025-12-16T13:41:32.787245719Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:41:32.787418 kubelet[3080]: E1216 13:41:32.787373 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:41:32.787714 kubelet[3080]: E1216 13:41:32.787423 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:41:32.787714 kubelet[3080]: E1216 13:41:32.787534 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2425,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-wzss6_calico-system(713c1552-24ab-4b60-9872-de4f52adb23b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:32.789458 containerd[1779]: time="2025-12-16T13:41:32.789386192Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:41:33.120131 containerd[1779]: time="2025-12-16T13:41:33.120073251Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:41:33.121806 containerd[1779]: time="2025-12-16T13:41:33.121772467Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:41:33.121882 containerd[1779]: time="2025-12-16T13:41:33.121843597Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:41:33.122049 kubelet[3080]: E1216 13:41:33.122004 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:41:33.122089 kubelet[3080]: E1216 13:41:33.122051 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:41:33.122199 kubelet[3080]: E1216 13:41:33.122158 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2425,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-wzss6_calico-system(713c1552-24ab-4b60-9872-de4f52adb23b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:33.124268 kubelet[3080]: E1216 13:41:33.124223 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:41:33.415289 containerd[1779]: time="2025-12-16T13:41:33.415182866Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:41:33.771083 containerd[1779]: time="2025-12-16T13:41:33.770963749Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:41:33.772627 containerd[1779]: time="2025-12-16T13:41:33.772581251Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:41:33.772719 containerd[1779]: time="2025-12-16T13:41:33.772622247Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:41:33.772820 kubelet[3080]: E1216 13:41:33.772770 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:41:33.772862 kubelet[3080]: E1216 13:41:33.772819 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:41:33.773090 kubelet[3080]: E1216 13:41:33.773049 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-88k89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-574757c556-b26rp_calico-apiserver(ae6d0914-be51-4e9e-9abc-d81494f00693): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:33.773208 containerd[1779]: time="2025-12-16T13:41:33.773081839Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:41:33.774309 kubelet[3080]: E1216 13:41:33.774246 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:41:34.128640 containerd[1779]: time="2025-12-16T13:41:34.128580603Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:41:34.130284 containerd[1779]: time="2025-12-16T13:41:34.130227734Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:41:34.130376 containerd[1779]: time="2025-12-16T13:41:34.130304905Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:41:34.130476 kubelet[3080]: E1216 13:41:34.130417 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:41:34.130476 kubelet[3080]: E1216 13:41:34.130465 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:41:34.130814 kubelet[3080]: E1216 13:41:34.130602 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cbfcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-zxbn9_calico-system(bc92ca59-7a99-4554-9e6f-b60493200978): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:41:34.131809 kubelet[3080]: E1216 13:41:34.131765 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:41:36.415287 kubelet[3080]: E1216 
13:41:36.415242 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:41:41.414904 kubelet[3080]: E1216 13:41:41.414844 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:41:45.414519 kubelet[3080]: E1216 13:41:45.414484 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:41:47.414288 kubelet[3080]: E1216 13:41:47.414213 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:41:47.414884 kubelet[3080]: E1216 13:41:47.414842 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:41:48.417779 kubelet[3080]: E1216 13:41:48.417733 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:41:49.414646 kubelet[3080]: E1216 13:41:49.414609 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:41:52.414821 kubelet[3080]: E1216 13:41:52.414765 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:41:58.414742 kubelet[3080]: E1216 13:41:58.414689 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:42:00.414812 kubelet[3080]: E1216 13:42:00.414757 3080 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:42:00.415378 kubelet[3080]: E1216 13:42:00.415342 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:42:01.415156 kubelet[3080]: E1216 13:42:01.415094 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:42:02.415125 kubelet[3080]: E1216 13:42:02.415042 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:42:04.415837 kubelet[3080]: E1216 13:42:04.415778 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:42:12.417896 kubelet[3080]: E1216 13:42:12.417844 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:42:12.420873 kubelet[3080]: E1216 13:42:12.418580 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:42:14.415270 containerd[1779]: time="2025-12-16T13:42:14.415181955Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:42:14.790861 containerd[1779]: time="2025-12-16T13:42:14.790740951Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:42:14.792649 containerd[1779]: time="2025-12-16T13:42:14.792602020Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:42:14.792723 containerd[1779]: time="2025-12-16T13:42:14.792619440Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:42:14.792902 kubelet[3080]: E1216 13:42:14.792860 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:42:14.793154 kubelet[3080]: E1216 13:42:14.792911 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:42:14.793349 kubelet[3080]: E1216 13:42:14.793143 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2grc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-574757c556-w6l7d_calico-apiserver(59d04f45-d51d-4c79-b4da-d0334682cd90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:42:14.793454 containerd[1779]: time="2025-12-16T13:42:14.793416520Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:42:14.794535 kubelet[3080]: E1216 13:42:14.794479 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:42:15.142338 containerd[1779]: time="2025-12-16T13:42:15.142269681Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:42:15.143960 containerd[1779]: time="2025-12-16T13:42:15.143910245Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:42:15.143960 containerd[1779]: time="2025-12-16T13:42:15.143948453Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:42:15.144170 kubelet[3080]: E1216 13:42:15.144129 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:42:15.144216 kubelet[3080]: E1216 13:42:15.144177 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:42:15.144333 kubelet[3080]: E1216 13:42:15.144300 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:f859dfb58e5e4f5cbc43012fb92832ac,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vn2wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7bdfb97df8-fdm7d_calico-system(938ae2d1-4432-47b7-a9d3-a5acd90ddc02): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:42:15.146365 containerd[1779]: time="2025-12-16T13:42:15.146313375Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:42:15.482247 containerd[1779]: time="2025-12-16T13:42:15.482065068Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:42:15.483843 containerd[1779]: time="2025-12-16T13:42:15.483815674Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:42:15.483901 containerd[1779]: time="2025-12-16T13:42:15.483888004Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:42:15.484083 kubelet[3080]: E1216 13:42:15.484048 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:42:15.484130 kubelet[3080]: E1216 13:42:15.484100 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:42:15.484598 kubelet[3080]: E1216 13:42:15.484544 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn2wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7bdfb97df8-fdm7d_calico-system(938ae2d1-4432-47b7-a9d3-a5acd90ddc02): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:42:15.485775 kubelet[3080]: E1216 13:42:15.485736 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:42:16.415308 containerd[1779]: time="2025-12-16T13:42:16.415261684Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:42:16.940890 containerd[1779]: time="2025-12-16T13:42:16.940824955Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:42:16.942660 containerd[1779]: time="2025-12-16T13:42:16.942584573Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:42:16.942757 containerd[1779]: time="2025-12-16T13:42:16.942663371Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:42:16.942870 kubelet[3080]: E1216 13:42:16.942825 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:42:16.943080 kubelet[3080]: E1216 13:42:16.942882 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:42:16.943080 kubelet[3080]: E1216 13:42:16.943004 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2425,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-wzss6_calico-system(713c1552-24ab-4b60-9872-de4f52adb23b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:42:16.944746 containerd[1779]: time="2025-12-16T13:42:16.944711903Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:42:17.296005 containerd[1779]: time="2025-12-16T13:42:17.295882020Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:42:17.297877 containerd[1779]: time="2025-12-16T13:42:17.297806897Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:42:17.298028 containerd[1779]: time="2025-12-16T13:42:17.297911897Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:42:17.298060 kubelet[3080]: E1216 13:42:17.298007 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:42:17.298060 kubelet[3080]: E1216 13:42:17.298049 3080 
kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:42:17.298289 kubelet[3080]: E1216 13:42:17.298232 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2425,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-wzss6_calico-system(713c1552-24ab-4b60-9872-de4f52adb23b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:42:17.299477 kubelet[3080]: E1216 13:42:17.299427 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: 
not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:42:19.415572 containerd[1779]: time="2025-12-16T13:42:19.415484482Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:42:19.754304 containerd[1779]: time="2025-12-16T13:42:19.754083357Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:42:19.756451 containerd[1779]: time="2025-12-16T13:42:19.756341684Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:42:19.756451 containerd[1779]: time="2025-12-16T13:42:19.756377795Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:42:19.756621 kubelet[3080]: E1216 13:42:19.756587 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:42:19.756907 kubelet[3080]: E1216 13:42:19.756649 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:42:19.756907 kubelet[3080]: E1216 13:42:19.756851 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sdnqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d9cf7b5c4-4vfj5_calico-system(4477497f-a609-4791-b674-25df11e8ec73): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:42:19.758972 kubelet[3080]: E1216 13:42:19.758928 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:42:24.415358 containerd[1779]: time="2025-12-16T13:42:24.415238747Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:42:24.760356 containerd[1779]: time="2025-12-16T13:42:24.760226786Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:42:24.761876 containerd[1779]: time="2025-12-16T13:42:24.761818421Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:42:24.761969 containerd[1779]: time="2025-12-16T13:42:24.761903251Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:42:24.762107 kubelet[3080]: E1216 13:42:24.762066 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:42:24.762400 kubelet[3080]: E1216 13:42:24.762123 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:42:24.762400 kubelet[3080]: E1216 13:42:24.762287 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cbfcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-zxbn9_calico-system(bc92ca59-7a99-4554-9e6f-b60493200978): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:42:24.763512 kubelet[3080]: E1216 13:42:24.763479 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:42:27.414536 kubelet[3080]: E1216 13:42:27.414485 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:42:27.415698 kubelet[3080]: E1216 13:42:27.415317 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:42:27.415799 containerd[1779]: time="2025-12-16T13:42:27.414739087Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:42:27.759450 containerd[1779]: time="2025-12-16T13:42:27.759110493Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:42:27.760970 containerd[1779]: time="2025-12-16T13:42:27.760875751Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:42:27.761075 containerd[1779]: time="2025-12-16T13:42:27.760935750Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:42:27.761190 kubelet[3080]: E1216 13:42:27.761145 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:42:27.761234 kubelet[3080]: E1216 13:42:27.761198 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:42:27.761397 kubelet[3080]: E1216 13:42:27.761353 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-88k89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-574757c556-b26rp_calico-apiserver(ae6d0914-be51-4e9e-9abc-d81494f00693): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:42:27.762731 kubelet[3080]: E1216 13:42:27.762662 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:42:31.415046 kubelet[3080]: E1216 13:42:31.414978 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:42:32.415305 kubelet[3080]: E1216 13:42:32.415200 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:42:36.415165 kubelet[3080]: E1216 13:42:36.415093 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:42:38.417162 kubelet[3080]: E1216 13:42:38.417106 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:42:39.415351 kubelet[3080]: E1216 13:42:39.415284 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:42:40.415801 kubelet[3080]: E1216 13:42:40.415733 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:42:44.414782 kubelet[3080]: E1216 13:42:44.414732 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:42:45.125683 systemd[1]: Started sshd@9-10.0.21.93:22-147.75.109.163:52432.service - OpenSSH per-connection server daemon (147.75.109.163:52432). Dec 16 13:42:45.415778 kubelet[3080]: E1216 13:42:45.415626 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:42:46.106569 sshd[5604]: Accepted publickey for core from 147.75.109.163 port 52432 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:42:46.107529 sshd-session[5604]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:42:46.112004 systemd-logind[1758]: New session 10 of user core. Dec 16 13:42:46.128945 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 16 13:42:46.836226 sshd[5607]: Connection closed by 147.75.109.163 port 52432 Dec 16 13:42:46.836614 sshd-session[5604]: pam_unix(sshd:session): session closed for user core Dec 16 13:42:46.839467 systemd[1]: sshd@9-10.0.21.93:22-147.75.109.163:52432.service: Deactivated successfully. Dec 16 13:42:46.841792 systemd-logind[1758]: Session 10 logged out. 
Waiting for processes to exit. Dec 16 13:42:46.841902 systemd[1]: session-10.scope: Deactivated successfully. Dec 16 13:42:46.843172 systemd-logind[1758]: Removed session 10. Dec 16 13:42:47.417572 kubelet[3080]: E1216 13:42:47.416792 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:42:50.415477 kubelet[3080]: E1216 13:42:50.415316 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:42:51.415435 kubelet[3080]: E1216 13:42:51.415396 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:42:52.005761 systemd[1]: Started sshd@10-10.0.21.93:22-147.75.109.163:52444.service - OpenSSH per-connection server daemon (147.75.109.163:52444). 
Dec 16 13:42:52.415661 kubelet[3080]: E1216 13:42:52.415619 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:42:52.986806 sshd[5633]: Accepted publickey for core from 147.75.109.163 port 52444 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:42:52.988145 sshd-session[5633]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:42:52.992868 systemd-logind[1758]: New session 11 of user core. Dec 16 13:42:53.008122 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 16 13:42:53.709474 sshd[5636]: Connection closed by 147.75.109.163 port 52444 Dec 16 13:42:53.709903 sshd-session[5633]: pam_unix(sshd:session): session closed for user core Dec 16 13:42:53.713254 systemd[1]: sshd@10-10.0.21.93:22-147.75.109.163:52444.service: Deactivated successfully. Dec 16 13:42:53.714820 systemd[1]: session-11.scope: Deactivated successfully. Dec 16 13:42:53.715509 systemd-logind[1758]: Session 11 logged out. Waiting for processes to exit. Dec 16 13:42:53.716373 systemd-logind[1758]: Removed session 11. Dec 16 13:42:53.877984 systemd[1]: Started sshd@11-10.0.21.93:22-147.75.109.163:42934.service - OpenSSH per-connection server daemon (147.75.109.163:42934). Dec 16 13:42:54.831083 sshd[5654]: Accepted publickey for core from 147.75.109.163 port 42934 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:42:54.832215 sshd-session[5654]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:42:54.836156 systemd-logind[1758]: New session 12 of user core. Dec 16 13:42:54.852751 systemd[1]: Started session-12.scope - Session 12 of User core. Dec 16 13:42:56.369437 sshd[5657]: Connection closed by 147.75.109.163 port 42934 Dec 16 13:42:56.369871 sshd-session[5654]: pam_unix(sshd:session): session closed for user core Dec 16 13:42:56.372864 systemd[1]: sshd@11-10.0.21.93:22-147.75.109.163:42934.service: Deactivated successfully. Dec 16 13:42:56.374994 systemd[1]: session-12.scope: Deactivated successfully. Dec 16 13:42:56.376298 systemd-logind[1758]: Session 12 logged out. Waiting for processes to exit. Dec 16 13:42:56.377345 systemd-logind[1758]: Removed session 12. Dec 16 13:42:56.552092 systemd[1]: Started sshd@12-10.0.21.93:22-147.75.109.163:42938.service - OpenSSH per-connection server daemon (147.75.109.163:42938). Dec 16 13:42:57.598147 sshd[5672]: Accepted publickey for core from 147.75.109.163 port 42938 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:42:57.599379 sshd-session[5672]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:42:57.605160 systemd-logind[1758]: New session 13 of user core. Dec 16 13:42:57.619835 systemd[1]: Started session-13.scope - Session 13 of User core. 
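Note: every pull above fails with the registry returning 404 for the tag itself, so this is a missing tag under ghcr.io/flatcar/calico/*, not an authentication or network error on the node. A minimal sketch for confirming that directly against the OCI distribution API follows; the file name and the anonymous-token flow are assumptions for illustration, not tooling from this system.

// check_tag.go - hedged sketch: asks ghcr.io whether a tag resolves at all.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

func main() {
	repo, tag := "flatcar/calico/csi", "v3.30.4" // reference taken from the kubelet errors above

	// ghcr.io hands out anonymous bearer tokens for public pulls (assumption: repo is public).
	tr, err := http.Get("https://ghcr.io/token?service=ghcr.io&scope=repository:" + repo + ":pull")
	if err != nil {
		panic(err)
	}
	defer tr.Body.Close()
	var tok struct {
		Token string `json:"token"`
	}
	if err := json.NewDecoder(tr.Body).Decode(&tok); err != nil {
		panic(err)
	}

	// HEAD the manifest; a 404 here is exactly what containerd surfaces as "not found".
	req, _ := http.NewRequest(http.MethodHead, "https://ghcr.io/v2/"+repo+"/manifests/"+tag, nil)
	req.Header.Set("Authorization", "Bearer "+tok.Token)
	req.Header.Set("Accept", "application/vnd.oci.image.index.v1+json")
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	resp.Body.Close()
	fmt.Println(resp.Status) // expect "404 Not Found" for a missing tag
	if resp.StatusCode == http.StatusNotFound {
		os.Exit(1)
	}
}

Any status other than 404 from this check would shift suspicion back to the node (containerd configuration, mirrors, proxies) rather than the registry.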
Dec 16 13:42:58.415117 kubelet[3080]: E1216 13:42:58.415075 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:42:58.695347 sshd[5675]: Connection closed by 147.75.109.163 port 42938 Dec 16 13:42:58.695398 sshd-session[5672]: pam_unix(sshd:session): session closed for user core Dec 16 13:42:58.698975 systemd[1]: sshd@12-10.0.21.93:22-147.75.109.163:42938.service: Deactivated successfully. Dec 16 13:42:58.700690 systemd[1]: session-13.scope: Deactivated successfully. Dec 16 13:42:58.701309 systemd-logind[1758]: Session 13 logged out. Waiting for processes to exit. Dec 16 13:42:58.702091 systemd-logind[1758]: Removed session 13. Dec 16 13:42:59.415182 kubelet[3080]: E1216 13:42:59.415127 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:43:00.415324 kubelet[3080]: E1216 13:43:00.415283 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:43:03.414831 kubelet[3080]: E1216 13:43:03.414604 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 
13:43:03.863959 systemd[1]: Started sshd@13-10.0.21.93:22-147.75.109.163:51210.service - OpenSSH per-connection server daemon (147.75.109.163:51210). Dec 16 13:43:04.415070 kubelet[3080]: E1216 13:43:04.414980 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:43:04.819338 sshd[5696]: Accepted publickey for core from 147.75.109.163 port 51210 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:43:04.820355 sshd-session[5696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:43:04.824628 systemd-logind[1758]: New session 14 of user core. Dec 16 13:43:04.839824 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 16 13:43:05.415245 kubelet[3080]: E1216 13:43:05.415201 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:43:05.541297 sshd[5699]: Connection closed by 147.75.109.163 port 51210 Dec 16 13:43:05.541684 sshd-session[5696]: pam_unix(sshd:session): session closed for user core Dec 16 13:43:05.545047 systemd[1]: sshd@13-10.0.21.93:22-147.75.109.163:51210.service: Deactivated successfully. Dec 16 13:43:05.546560 systemd[1]: session-14.scope: Deactivated successfully. Dec 16 13:43:05.547162 systemd-logind[1758]: Session 14 logged out. Waiting for processes to exit. Dec 16 13:43:05.547940 systemd-logind[1758]: Removed session 14. 
Dec 16 13:43:10.415568 kubelet[3080]: E1216 13:43:10.415504 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:43:10.416057 kubelet[3080]: E1216 13:43:10.415884 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:43:10.714745 systemd[1]: Started sshd@14-10.0.21.93:22-147.75.109.163:51212.service - OpenSSH per-connection server daemon (147.75.109.163:51212). Dec 16 13:43:11.414938 kubelet[3080]: E1216 13:43:11.414892 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:43:11.685040 sshd[5718]: Accepted publickey for core from 147.75.109.163 port 51212 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:43:11.686396 sshd-session[5718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:43:11.690564 systemd-logind[1758]: New session 15 of user core. Dec 16 13:43:11.699772 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 16 13:43:12.409661 sshd[5721]: Connection closed by 147.75.109.163 port 51212 Dec 16 13:43:12.410211 sshd-session[5718]: pam_unix(sshd:session): session closed for user core Dec 16 13:43:12.412942 systemd[1]: sshd@14-10.0.21.93:22-147.75.109.163:51212.service: Deactivated successfully. Dec 16 13:43:12.414617 systemd[1]: session-15.scope: Deactivated successfully. Dec 16 13:43:12.416580 systemd-logind[1758]: Session 15 logged out. Waiting for processes to exit. Dec 16 13:43:12.417269 systemd-logind[1758]: Removed session 15. 
Dec 16 13:43:15.415299 kubelet[3080]: E1216 13:43:15.415245 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:43:16.415993 kubelet[3080]: E1216 13:43:16.415899 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:43:17.601189 systemd[1]: Started sshd@15-10.0.21.93:22-147.75.109.163:43370.service - OpenSSH per-connection server daemon (147.75.109.163:43370). Dec 16 13:43:18.417730 kubelet[3080]: E1216 13:43:18.417642 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:43:18.635509 sshd[5770]: Accepted publickey for core from 147.75.109.163 port 43370 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:43:18.638869 sshd-session[5770]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:43:18.644978 systemd-logind[1758]: New session 16 of user core. Dec 16 13:43:18.660802 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 16 13:43:19.403807 sshd[5773]: Connection closed by 147.75.109.163 port 43370 Dec 16 13:43:19.404208 sshd-session[5770]: pam_unix(sshd:session): session closed for user core Dec 16 13:43:19.407655 systemd-logind[1758]: Session 16 logged out. Waiting for processes to exit. Dec 16 13:43:19.407907 systemd[1]: sshd@15-10.0.21.93:22-147.75.109.163:43370.service: Deactivated successfully. Dec 16 13:43:19.409431 systemd[1]: session-16.scope: Deactivated successfully. Dec 16 13:43:19.410787 systemd-logind[1758]: Removed session 16. 
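To triage which references are failing without reading the full journal, the distinct image names can be extracted from lines like the ones above. A hypothetical helper (not part of this system) that reads the journal on stdin, e.g. journalctl -u kubelet | go run failing_images.go:

// failing_images.go - prints each failing image reference once.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	// Quotes in the journal are backslash-escaped, sometimes doubly, hence \\* .
	re := regexp.MustCompile(`failed to resolve reference \\*"([^"\\]+)\\*"`)
	seen := map[string]bool{}
	sc := bufio.NewScanner(os.Stdin)
	// Journal lines here are very long (full Container specs), so raise the buffer.
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			if !seen[m[1]] {
				seen[m[1]] = true
				fmt.Println(m[1])
			}
		}
	}
}

Against this section it would print the six failing references (whisker, whisker-backend, csi, node-driver-registrar, kube-controllers, goldmane, apiserver) once each.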
Dec 16 13:43:19.573701 systemd[1]: Started sshd@16-10.0.21.93:22-147.75.109.163:43374.service - OpenSSH per-connection server daemon (147.75.109.163:43374). Dec 16 13:43:20.535088 sshd[5790]: Accepted publickey for core from 147.75.109.163 port 43374 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:43:20.536339 sshd-session[5790]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:43:20.540498 systemd-logind[1758]: New session 17 of user core. Dec 16 13:43:20.558769 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 16 13:43:21.292230 sshd[5794]: Connection closed by 147.75.109.163 port 43374 Dec 16 13:43:21.292953 sshd-session[5790]: pam_unix(sshd:session): session closed for user core Dec 16 13:43:21.297466 systemd[1]: sshd@16-10.0.21.93:22-147.75.109.163:43374.service: Deactivated successfully. Dec 16 13:43:21.299343 systemd[1]: session-17.scope: Deactivated successfully. Dec 16 13:43:21.300610 systemd-logind[1758]: Session 17 logged out. Waiting for processes to exit. Dec 16 13:43:21.301763 systemd-logind[1758]: Removed session 17. Dec 16 13:43:21.459003 systemd[1]: Started sshd@17-10.0.21.93:22-147.75.109.163:43382.service - OpenSSH per-connection server daemon (147.75.109.163:43382). Dec 16 13:43:22.414829 kubelet[3080]: E1216 13:43:22.414721 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:43:22.414829 kubelet[3080]: E1216 13:43:22.414766 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:43:22.415301 sshd[5809]: Accepted publickey for core from 147.75.109.163 port 43382 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:43:22.416761 sshd-session[5809]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:43:22.422225 systemd-logind[1758]: New session 18 of user core. Dec 16 13:43:22.427725 systemd[1]: Started session-18.scope - Session 18 of User core. Dec 16 13:43:23.816244 sshd[5812]: Connection closed by 147.75.109.163 port 43382 Dec 16 13:43:23.817759 sshd-session[5809]: pam_unix(sshd:session): session closed for user core Dec 16 13:43:23.821607 systemd[1]: sshd@17-10.0.21.93:22-147.75.109.163:43382.service: Deactivated successfully. Dec 16 13:43:23.823194 systemd[1]: session-18.scope: Deactivated successfully. Dec 16 13:43:23.823998 systemd-logind[1758]: Session 18 logged out. Waiting for processes to exit. Dec 16 13:43:23.824782 systemd-logind[1758]: Removed session 18. 
Dec 16 13:43:23.990174 systemd[1]: Started sshd@18-10.0.21.93:22-147.75.109.163:52434.service - OpenSSH per-connection server daemon (147.75.109.163:52434). Dec 16 13:43:24.977323 sshd[5835]: Accepted publickey for core from 147.75.109.163 port 52434 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:43:24.979103 sshd-session[5835]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:43:24.984304 systemd-logind[1758]: New session 19 of user core. Dec 16 13:43:24.994808 systemd[1]: Started session-19.scope - Session 19 of User core. Dec 16 13:43:26.013396 sshd[5838]: Connection closed by 147.75.109.163 port 52434 Dec 16 13:43:26.013764 sshd-session[5835]: pam_unix(sshd:session): session closed for user core Dec 16 13:43:26.018661 systemd[1]: sshd@18-10.0.21.93:22-147.75.109.163:52434.service: Deactivated successfully. Dec 16 13:43:26.020595 systemd[1]: session-19.scope: Deactivated successfully. Dec 16 13:43:26.021319 systemd-logind[1758]: Session 19 logged out. Waiting for processes to exit. Dec 16 13:43:26.022304 systemd-logind[1758]: Removed session 19. Dec 16 13:43:26.190218 systemd[1]: Started sshd@19-10.0.21.93:22-147.75.109.163:52438.service - OpenSSH per-connection server daemon (147.75.109.163:52438). Dec 16 13:43:26.415321 kubelet[3080]: E1216 13:43:26.415255 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:43:27.184328 sshd[5860]: Accepted publickey for core from 147.75.109.163 port 52438 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:43:27.185888 sshd-session[5860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:43:27.190788 systemd-logind[1758]: New session 20 of user core. Dec 16 13:43:27.204775 systemd[1]: Started session-20.scope - Session 20 of User core. 
Dec 16 13:43:27.415199 kubelet[3080]: E1216 13:43:27.415157 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:43:28.035188 sshd[5863]: Connection closed by 147.75.109.163 port 52438 Dec 16 13:43:28.035599 sshd-session[5860]: pam_unix(sshd:session): session closed for user core Dec 16 13:43:28.039931 systemd[1]: sshd@19-10.0.21.93:22-147.75.109.163:52438.service: Deactivated successfully. Dec 16 13:43:28.041775 systemd[1]: session-20.scope: Deactivated successfully. Dec 16 13:43:28.042544 systemd-logind[1758]: Session 20 logged out. Waiting for processes to exit. Dec 16 13:43:28.044524 systemd-logind[1758]: Removed session 20. Dec 16 13:43:29.415071 kubelet[3080]: E1216 13:43:29.415008 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:43:31.415143 kubelet[3080]: E1216 13:43:31.415078 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:43:33.200028 systemd[1]: Started sshd@20-10.0.21.93:22-147.75.109.163:43914.service - OpenSSH per-connection server daemon (147.75.109.163:43914). 
Dec 16 13:43:33.415205 kubelet[3080]: E1216 13:43:33.415146 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:43:34.152895 sshd[5884]: Accepted publickey for core from 147.75.109.163 port 43914 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:43:34.154736 sshd-session[5884]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:43:34.160922 systemd-logind[1758]: New session 21 of user core. Dec 16 13:43:34.176748 systemd[1]: Started session-21.scope - Session 21 of User core. Dec 16 13:43:34.865453 sshd[5887]: Connection closed by 147.75.109.163 port 43914 Dec 16 13:43:34.865929 sshd-session[5884]: pam_unix(sshd:session): session closed for user core Dec 16 13:43:34.869313 systemd[1]: sshd@20-10.0.21.93:22-147.75.109.163:43914.service: Deactivated successfully. Dec 16 13:43:34.870987 systemd[1]: session-21.scope: Deactivated successfully. Dec 16 13:43:34.871648 systemd-logind[1758]: Session 21 logged out. Waiting for processes to exit. Dec 16 13:43:34.872446 systemd-logind[1758]: Removed session 21. Dec 16 13:43:37.414871 kubelet[3080]: E1216 13:43:37.414814 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:43:40.041517 systemd[1]: Started sshd@21-10.0.21.93:22-147.75.109.163:43922.service - OpenSSH per-connection server daemon (147.75.109.163:43922). Dec 16 13:43:41.008017 sshd[5904]: Accepted publickey for core from 147.75.109.163 port 43922 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:43:41.009365 sshd-session[5904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:43:41.013874 systemd-logind[1758]: New session 22 of user core. Dec 16 13:43:41.025820 systemd[1]: Started session-22.scope - Session 22 of User core. Dec 16 13:43:41.415410 containerd[1779]: time="2025-12-16T13:43:41.415374834Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:43:41.735700 sshd[5907]: Connection closed by 147.75.109.163 port 43922 Dec 16 13:43:41.735044 sshd-session[5904]: pam_unix(sshd:session): session closed for user core Dec 16 13:43:41.738387 systemd[1]: sshd@21-10.0.21.93:22-147.75.109.163:43922.service: Deactivated successfully. Dec 16 13:43:41.740079 systemd[1]: session-22.scope: Deactivated successfully. Dec 16 13:43:41.740739 systemd-logind[1758]: Session 22 logged out. Waiting for processes to exit. Dec 16 13:43:41.741646 systemd-logind[1758]: Removed session 22. 
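Note: the csi image was last attempted at 13:42:16 and is retried here at 13:43:41, roughly 85 seconds later, which is consistent with the kubelet's doubling image-pull backoff. The schedule below assumes upstream kubelet defaults (10s base, doubling, 300s cap); the log itself does not print these constants. A minimal sketch of that cadence:

// backoff_schedule.go - sketch of a doubling backoff with a 300s cap.
package main

import (
	"fmt"
	"time"
)

func main() {
	d := 10 * time.Second // assumed base delay
	elapsed := time.Duration(0)
	for i := 1; i <= 8; i++ {
		elapsed += d
		fmt.Printf("retry %d after %v (t+%v)\n", i, d, elapsed)
		if d *= 2; d > 300*time.Second {
			d = 300 * time.Second // assumed cap
		}
	}
}

Once the cap is reached, the "Back-off pulling image" messages recur at a steady five-minute rhythm, which matches the spacing of the pod_workers.go entries through the rest of this section.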
Dec 16 13:43:41.747868 containerd[1779]: time="2025-12-16T13:43:41.747818354Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:43:41.749596 containerd[1779]: time="2025-12-16T13:43:41.749532387Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:43:41.749675 containerd[1779]: time="2025-12-16T13:43:41.749570771Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:43:41.749807 kubelet[3080]: E1216 13:43:41.749767 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:43:41.750089 kubelet[3080]: E1216 13:43:41.749819 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:43:41.750089 kubelet[3080]: E1216 13:43:41.749955 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2425,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-wzss6_calico-system(713c1552-24ab-4b60-9872-de4f52adb23b): ErrImagePull: rpc error: code = NotFound 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:43:41.751877 containerd[1779]: time="2025-12-16T13:43:41.751842100Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:43:42.102872 containerd[1779]: time="2025-12-16T13:43:42.102736631Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:43:42.104607 containerd[1779]: time="2025-12-16T13:43:42.104531771Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:43:42.104694 containerd[1779]: time="2025-12-16T13:43:42.104579291Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:43:42.104872 kubelet[3080]: E1216 13:43:42.104836 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:43:42.104982 kubelet[3080]: E1216 13:43:42.104969 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:43:42.105200 kubelet[3080]: E1216 13:43:42.105162 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2425,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-wzss6_calico-system(713c1552-24ab-4b60-9872-de4f52adb23b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:43:42.106529 kubelet[3080]: E1216 13:43:42.106483 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:43:42.414758 kubelet[3080]: E1216 13:43:42.414707 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:43:42.414970 containerd[1779]: time="2025-12-16T13:43:42.414943054Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:43:42.742715 containerd[1779]: time="2025-12-16T13:43:42.742538033Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:43:42.744562 containerd[1779]: time="2025-12-16T13:43:42.744485916Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:43:42.744651 containerd[1779]: time="2025-12-16T13:43:42.744531649Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:43:42.744837 kubelet[3080]: E1216 13:43:42.744790 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:43:42.744879 kubelet[3080]: E1216 13:43:42.744857 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:43:42.745031 kubelet[3080]: E1216 13:43:42.744999 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:f859dfb58e5e4f5cbc43012fb92832ac,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vn2wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7bdfb97df8-fdm7d_calico-system(938ae2d1-4432-47b7-a9d3-a5acd90ddc02): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:43:42.746926 containerd[1779]: time="2025-12-16T13:43:42.746895665Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:43:43.076883 containerd[1779]: time="2025-12-16T13:43:43.076715224Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:43:43.078629 containerd[1779]: time="2025-12-16T13:43:43.078571617Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:43:43.078782 containerd[1779]: time="2025-12-16T13:43:43.078643554Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:43:43.078864 kubelet[3080]: E1216 13:43:43.078821 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:43:43.079145 kubelet[3080]: E1216 13:43:43.078875 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:43:43.079145 kubelet[3080]: E1216 13:43:43.079016 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn2wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7bdfb97df8-fdm7d_calico-system(938ae2d1-4432-47b7-a9d3-a5acd90ddc02): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:43:43.080289 kubelet[3080]: E1216 13:43:43.080236 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:43:44.415659 containerd[1779]: time="2025-12-16T13:43:44.415605317Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:43:44.766034 containerd[1779]: time="2025-12-16T13:43:44.765879274Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:43:44.768091 containerd[1779]: time="2025-12-16T13:43:44.768045506Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:43:44.768161 containerd[1779]: time="2025-12-16T13:43:44.768130623Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:43:44.768348 kubelet[3080]: E1216 13:43:44.768311 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:43:44.768636 kubelet[3080]: E1216 13:43:44.768360 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:43:44.768875 kubelet[3080]: E1216 13:43:44.768834 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2grc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-574757c556-w6l7d_calico-apiserver(59d04f45-d51d-4c79-b4da-d0334682cd90): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:43:44.770158 kubelet[3080]: E1216 13:43:44.770117 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:43:46.907630 systemd[1]: Started sshd@22-10.0.21.93:22-147.75.109.163:35528.service - OpenSSH per-connection server daemon (147.75.109.163:35528). Dec 16 13:43:47.873570 sshd[5953]: Accepted publickey for core from 147.75.109.163 port 35528 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:43:47.875591 sshd-session[5953]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:43:47.887783 systemd-logind[1758]: New session 23 of user core. Dec 16 13:43:47.900833 systemd[1]: Started session-23.scope - Session 23 of User core. Dec 16 13:43:48.415229 containerd[1779]: time="2025-12-16T13:43:48.415157553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:43:48.597732 sshd[5956]: Connection closed by 147.75.109.163 port 35528 Dec 16 13:43:48.598304 sshd-session[5953]: pam_unix(sshd:session): session closed for user core Dec 16 13:43:48.602580 systemd[1]: sshd@22-10.0.21.93:22-147.75.109.163:35528.service: Deactivated successfully. Dec 16 13:43:48.604135 systemd[1]: session-23.scope: Deactivated successfully. Dec 16 13:43:48.604824 systemd-logind[1758]: Session 23 logged out. Waiting for processes to exit. Dec 16 13:43:48.605633 systemd-logind[1758]: Removed session 23. 
Dec 16 13:43:48.768450 containerd[1779]: time="2025-12-16T13:43:48.768314691Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:43:48.769975 containerd[1779]: time="2025-12-16T13:43:48.769927450Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:43:48.770096 containerd[1779]: time="2025-12-16T13:43:48.770023372Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:43:48.770272 kubelet[3080]: E1216 13:43:48.770197 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:43:48.770272 kubelet[3080]: E1216 13:43:48.770266 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:43:48.770645 kubelet[3080]: E1216 13:43:48.770528 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sdnqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d9cf7b5c4-4vfj5_calico-system(4477497f-a609-4791-b674-25df11e8ec73): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:43:48.770741 containerd[1779]: time="2025-12-16T13:43:48.770727369Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:43:48.771895 kubelet[3080]: E1216 13:43:48.771854 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:43:49.110245 containerd[1779]: time="2025-12-16T13:43:49.110088902Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:43:49.112402 containerd[1779]: time="2025-12-16T13:43:49.112362955Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:43:49.112491 containerd[1779]: time="2025-12-16T13:43:49.112448061Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:43:49.112696 kubelet[3080]: E1216 13:43:49.112657 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:43:49.112758 kubelet[3080]: E1216 13:43:49.112715 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:43:49.112910 kubelet[3080]: E1216 
13:43:49.112874 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cbfcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-zxbn9_calico-system(bc92ca59-7a99-4554-9e6f-b60493200978): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:43:49.114060 kubelet[3080]: E1216 13:43:49.114024 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" 
podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:43:53.415220 kubelet[3080]: E1216 13:43:53.415135 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:43:53.773413 systemd[1]: Started sshd@23-10.0.21.93:22-147.75.109.163:54778.service - OpenSSH per-connection server daemon (147.75.109.163:54778). Dec 16 13:43:54.745247 sshd[5992]: Accepted publickey for core from 147.75.109.163 port 54778 ssh2: RSA SHA256:ieG6luY3aPLDHUxSRue076s7/DaIRhKi0Ks9YLqRurI Dec 16 13:43:54.746512 sshd-session[5992]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:43:54.751155 systemd-logind[1758]: New session 24 of user core. Dec 16 13:43:54.770967 systemd[1]: Started session-24.scope - Session 24 of User core. Dec 16 13:43:55.415834 kubelet[3080]: E1216 13:43:55.415782 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:43:55.416334 containerd[1779]: time="2025-12-16T13:43:55.416033634Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:43:55.417331 kubelet[3080]: E1216 13:43:55.417294 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 
13:43:55.470491 sshd[5995]: Connection closed by 147.75.109.163 port 54778 Dec 16 13:43:55.470885 sshd-session[5992]: pam_unix(sshd:session): session closed for user core Dec 16 13:43:55.474231 systemd[1]: sshd@23-10.0.21.93:22-147.75.109.163:54778.service: Deactivated successfully. Dec 16 13:43:55.475756 systemd[1]: session-24.scope: Deactivated successfully. Dec 16 13:43:55.476356 systemd-logind[1758]: Session 24 logged out. Waiting for processes to exit. Dec 16 13:43:55.477110 systemd-logind[1758]: Removed session 24. Dec 16 13:43:55.769412 containerd[1779]: time="2025-12-16T13:43:55.769292307Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:43:55.770735 containerd[1779]: time="2025-12-16T13:43:55.770700098Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:43:55.770856 containerd[1779]: time="2025-12-16T13:43:55.770740752Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:43:55.770943 kubelet[3080]: E1216 13:43:55.770912 3080 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:43:55.771015 kubelet[3080]: E1216 13:43:55.770959 3080 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:43:55.771212 kubelet[3080]: E1216 13:43:55.771109 3080 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-88k89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-574757c556-b26rp_calico-apiserver(ae6d0914-be51-4e9e-9abc-d81494f00693): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:43:55.772294 kubelet[3080]: E1216 13:43:55.772259 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:44:00.414920 kubelet[3080]: E1216 13:44:00.414784 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:44:01.414840 kubelet[3080]: E1216 13:44:01.414792 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:44:06.415030 kubelet[3080]: E1216 13:44:06.414965 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:44:07.415086 kubelet[3080]: E1216 13:44:07.415033 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:44:10.415710 kubelet[3080]: E1216 13:44:10.415662 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:44:10.417703 kubelet[3080]: E1216 13:44:10.416880 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90" Dec 16 13:44:14.415976 kubelet[3080]: E1216 13:44:14.415921 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73" Dec 16 13:44:16.415838 kubelet[3080]: E1216 13:44:16.415634 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978" Dec 16 13:44:20.416480 kubelet[3080]: E1216 13:44:20.416430 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02" Dec 16 13:44:20.417044 kubelet[3080]: E1216 13:44:20.416441 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b" Dec 16 13:44:21.415233 kubelet[3080]: E1216 13:44:21.415174 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693" Dec 16 13:44:22.415738 kubelet[3080]: E1216 13:44:22.415695 3080 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90"
Dec 16 13:44:25.415378 kubelet[3080]: E1216 13:44:25.415251 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73"
Dec 16 13:44:30.414855 kubelet[3080]: E1216 13:44:30.414796 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978"
Dec 16 13:44:32.416519 kubelet[3080]: E1216 13:44:32.416458 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b"
Dec 16 13:44:34.417394 kubelet[3080]: E1216 13:44:34.417315 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90"
Dec 16 13:44:35.414752 kubelet[3080]: E1216 13:44:35.414692 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02"
Dec 16 13:44:36.416241 kubelet[3080]: E1216 13:44:36.416172 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73"
Dec 16 13:44:36.416744 kubelet[3080]: E1216 13:44:36.416331 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693"
Dec 16 13:44:44.415320 kubelet[3080]: E1216 13:44:44.415242 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978"
Dec 16 13:44:44.416035 kubelet[3080]: E1216 13:44:44.415491 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b"
Dec 16 13:44:46.415540 kubelet[3080]: E1216 13:44:46.415487 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90"
Dec 16 13:44:46.415997 kubelet[3080]: E1216 13:44:46.415843 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02"
Dec 16 13:44:49.562988 systemd[1]: cri-containerd-d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb.scope: Deactivated successfully.
Dec 16 13:44:49.563321 systemd[1]: cri-containerd-d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb.scope: Consumed 4.042s CPU time, 68.3M memory peak.
Dec 16 13:44:49.564495 containerd[1779]: time="2025-12-16T13:44:49.564451964Z" level=info msg="received container exit event container_id:\"d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb\" id:\"d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb\" pid:2910 exit_status:1 exited_at:{seconds:1765892689 nanos:564108480}"
Dec 16 13:44:49.587655 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb-rootfs.mount: Deactivated successfully.
Dec 16 13:44:49.618466 systemd[1]: cri-containerd-3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b.scope: Deactivated successfully.
Dec 16 13:44:49.618772 systemd[1]: cri-containerd-3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b.scope: Consumed 36.189s CPU time, 106.2M memory peak.
Dec 16 13:44:49.619415 containerd[1779]: time="2025-12-16T13:44:49.619378304Z" level=info msg="received container exit event container_id:\"3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b\" id:\"3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b\" pid:3516 exit_status:1 exited_at:{seconds:1765892689 nanos:619083134}"
Dec 16 13:44:49.639000 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b-rootfs.mount: Deactivated successfully.
Dec 16 13:44:49.861033 kubelet[3080]: E1216 13:44:49.860758 3080 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.21.93:48352->10.0.21.47:2379: read: connection timed out"
Dec 16 13:44:49.864394 systemd[1]: cri-containerd-bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05.scope: Deactivated successfully.
Dec 16 13:44:49.864704 systemd[1]: cri-containerd-bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05.scope: Consumed 2.287s CPU time, 25.7M memory peak.
Dec 16 13:44:49.865653 containerd[1779]: time="2025-12-16T13:44:49.865611922Z" level=info msg="received container exit event container_id:\"bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05\" id:\"bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05\" pid:2919 exit_status:1 exited_at:{seconds:1765892689 nanos:865306179}"
Dec 16 13:44:49.887723 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05-rootfs.mount: Deactivated successfully.
Dec 16 13:44:49.982125 kubelet[3080]: I1216 13:44:49.982034 3080 scope.go:117] "RemoveContainer" containerID="d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb"
Dec 16 13:44:49.983095 kubelet[3080]: I1216 13:44:49.983061 3080 scope.go:117] "RemoveContainer" containerID="bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05"
Dec 16 13:44:49.983659 containerd[1779]: time="2025-12-16T13:44:49.983626453Z" level=info msg="CreateContainer within sandbox \"99f6b13f3e6707e1e9d9ee2a7b4d2967131b90bef5366a7369a89022dc04ef34\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Dec 16 13:44:49.984644 kubelet[3080]: I1216 13:44:49.984570 3080 scope.go:117] "RemoveContainer" containerID="d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da"
Dec 16 13:44:49.984694 containerd[1779]: time="2025-12-16T13:44:49.984585882Z" level=info msg="CreateContainer within sandbox \"43fc1367cb51f2a35f04018d18cf8662b84552019a6d3e3586c0a53a110e5962\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Dec 16 13:44:49.984801 kubelet[3080]: I1216 13:44:49.984784 3080 scope.go:117] "RemoveContainer" containerID="3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b"
Dec 16 13:44:49.984913 kubelet[3080]: E1216 13:44:49.984895 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-52gxm_tigera-operator(f9282a71-1e62-4807-9b32-a4a8ec8b3bc3)\"" pod="tigera-operator/tigera-operator-7dcd859c48-52gxm" podUID="f9282a71-1e62-4807-9b32-a4a8ec8b3bc3"
Dec 16 13:44:49.985875 containerd[1779]: time="2025-12-16T13:44:49.985821008Z" level=info msg="RemoveContainer for \"d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da\""
Dec 16 13:44:49.997475 containerd[1779]: time="2025-12-16T13:44:49.997415136Z" level=info msg="RemoveContainer for \"d3848637bdef26fa564147cfa1b0c65bba01b9d7ee0d1116df461038da80c2da\" returns successfully"
Dec 16 13:44:49.998770 containerd[1779]: time="2025-12-16T13:44:49.998723830Z" level=info msg="Container 1ca60ecfe1f2fbeb0dfff6a78540143c0e5baf929b4c1b79d501f25c1c695557: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:44:50.000987 containerd[1779]: time="2025-12-16T13:44:50.000956260Z" level=info msg="Container 79e458f1d0a671a836c639947db1044db7e94fae0a01de684b0a5536a2ed3389: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:44:50.009114 containerd[1779]: time="2025-12-16T13:44:50.009019731Z" level=info msg="CreateContainer within sandbox \"43fc1367cb51f2a35f04018d18cf8662b84552019a6d3e3586c0a53a110e5962\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"1ca60ecfe1f2fbeb0dfff6a78540143c0e5baf929b4c1b79d501f25c1c695557\""
Dec 16 13:44:50.009661 containerd[1779]: time="2025-12-16T13:44:50.009594058Z" level=info msg="StartContainer for \"1ca60ecfe1f2fbeb0dfff6a78540143c0e5baf929b4c1b79d501f25c1c695557\""
Dec 16 13:44:50.010898 containerd[1779]: time="2025-12-16T13:44:50.010843980Z" level=info msg="connecting to shim 1ca60ecfe1f2fbeb0dfff6a78540143c0e5baf929b4c1b79d501f25c1c695557" address="unix:///run/containerd/s/ce506c5368c044f1040c711d785f84ddaf37c388bd90948411e203799d3e4610" protocol=ttrpc version=3
Dec 16 13:44:50.015982 containerd[1779]: time="2025-12-16T13:44:50.015943697Z" level=info msg="CreateContainer within sandbox \"99f6b13f3e6707e1e9d9ee2a7b4d2967131b90bef5366a7369a89022dc04ef34\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"79e458f1d0a671a836c639947db1044db7e94fae0a01de684b0a5536a2ed3389\""
Dec 16 13:44:50.016469 containerd[1779]: time="2025-12-16T13:44:50.016443145Z" level=info msg="StartContainer for \"79e458f1d0a671a836c639947db1044db7e94fae0a01de684b0a5536a2ed3389\""
Dec 16 13:44:50.018837 containerd[1779]: time="2025-12-16T13:44:50.018805468Z" level=info msg="connecting to shim 79e458f1d0a671a836c639947db1044db7e94fae0a01de684b0a5536a2ed3389" address="unix:///run/containerd/s/c6ad5dc5a43cbb07989c2700f1f8f2a25dac783f22e92e4e9551f9678c1b5712" protocol=ttrpc version=3
Dec 16 13:44:50.041825 systemd[1]: Started cri-containerd-1ca60ecfe1f2fbeb0dfff6a78540143c0e5baf929b4c1b79d501f25c1c695557.scope - libcontainer container 1ca60ecfe1f2fbeb0dfff6a78540143c0e5baf929b4c1b79d501f25c1c695557.
Dec 16 13:44:50.044421 systemd[1]: Started cri-containerd-79e458f1d0a671a836c639947db1044db7e94fae0a01de684b0a5536a2ed3389.scope - libcontainer container 79e458f1d0a671a836c639947db1044db7e94fae0a01de684b0a5536a2ed3389.
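The tigera-operator entry above shows kubelet's CrashLoopBackOff throttle at its initial 10s delay, while the kube-scheduler and kube-controller-manager containers (Attempt:1) are recreated immediately. Kubelet's documented crash-loop behavior doubles the restart delay after each failure up to a 5-minute cap; the Go sketch below only illustrates that schedule and is not kubelet's actual implementation.

// backoff-schedule: the restart delays behind "back-off 10s restarting
// failed container" above. The 10s base and 5m cap are kubelet's documented
// CrashLoopBackOff parameters; this is an illustration, not kubelet code.
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		base     = 10 * time.Second
		maxDelay = 5 * time.Minute
	)
	delay := base
	for restart := 1; restart <= 8; restart++ {
		fmt.Printf("restart %d: back-off %v\n", restart, delay)
		delay *= 2 // doubled after every failed restart...
		if delay > maxDelay {
			delay = maxDelay // ...until the 5m cap is reached
		}
	}
}

Running this prints 10s, 20s, 40s, 1m20s, 2m40s, then 5m thereafter, which is why the later RemoveContainer/CreateContainer cycle for tigera-operator (Attempt:2, at 13:45:01 below) follows roughly ten seconds after the back-off message.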
Dec 16 13:44:50.094882 containerd[1779]: time="2025-12-16T13:44:50.094829534Z" level=info msg="StartContainer for \"1ca60ecfe1f2fbeb0dfff6a78540143c0e5baf929b4c1b79d501f25c1c695557\" returns successfully"
Dec 16 13:44:50.095081 containerd[1779]: time="2025-12-16T13:44:50.095061876Z" level=info msg="StartContainer for \"79e458f1d0a671a836c639947db1044db7e94fae0a01de684b0a5536a2ed3389\" returns successfully"
Dec 16 13:44:50.415161 kubelet[3080]: E1216 13:44:50.415115 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693"
Dec 16 13:44:51.415128 kubelet[3080]: E1216 13:44:51.415082 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73"
Dec 16 13:44:54.725232 kubelet[3080]: E1216 13:44:54.724719 3080 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.21.93:48152->10.0.21.47:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-2-a-7f096d1947.1881b60a4f3d0509 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-2-a-7f096d1947,UID:b8327394c4bbf31f869ef4b2b5575c90,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-a-7f096d1947,},FirstTimestamp:2025-12-16 13:44:44.301272329 +0000 UTC m=+277.968519398,LastTimestamp:2025-12-16 13:44:44.301272329 +0000 UTC m=+277.968519398,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-a-7f096d1947,}"
Dec 16 13:44:56.414409 kubelet[3080]: E1216 13:44:56.414348 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978"
Dec 16 13:44:59.415022 kubelet[3080]: E1216 13:44:59.414981 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-w6l7d" podUID="59d04f45-d51d-4c79-b4da-d0334682cd90"
Dec 16 13:44:59.415448 kubelet[3080]: E1216 13:44:59.415282 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-wzss6" podUID="713c1552-24ab-4b60-9872-de4f52adb23b"
Dec 16 13:44:59.415448 kubelet[3080]: E1216 13:44:59.415293 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7bdfb97df8-fdm7d" podUID="938ae2d1-4432-47b7-a9d3-a5acd90ddc02"
Dec 16 13:44:59.861934 kubelet[3080]: E1216 13:44:59.861758 3080 controller.go:195] "Failed to update lease" err="Put \"https://10.0.21.93:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-a-7f096d1947?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 16 13:45:00.420690 kubelet[3080]: I1216 13:45:00.420638 3080 status_manager.go:895] "Failed to get status for pod" podUID="a5de365130e76bebbd64bed8193f592b" pod="kube-system/kube-controller-manager-ci-4459-2-2-a-7f096d1947" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.21.93:48268->10.0.21.47:2379: read: connection timed out"
Dec 16 13:45:01.413803 kubelet[3080]: I1216 13:45:01.413735 3080 scope.go:117] "RemoveContainer" containerID="3918d2dd5324cbf8b1979462384233bd28fb750339fa9a7d3bcd81ffc9c9cc3b"
Dec 16 13:45:01.414674 kubelet[3080]: E1216 13:45:01.414514 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-574757c556-b26rp" podUID="ae6d0914-be51-4e9e-9abc-d81494f00693"
Dec 16 13:45:01.416029 containerd[1779]: time="2025-12-16T13:45:01.415615921Z" level=info msg="CreateContainer within sandbox \"a1f592c89c6404b0e83a3a88cde728deaf28654ee76060a5f4613f98ed440a46\" for container &ContainerMetadata{Name:tigera-operator,Attempt:2,}"
Dec 16 13:45:01.426037 containerd[1779]: time="2025-12-16T13:45:01.425979690Z" level=info msg="Container 17d69fec6eae9e3384aef5c9f529e47b15a57969c43b7d0b34c6278d2f96d809: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:45:01.436083 containerd[1779]: time="2025-12-16T13:45:01.436027297Z" level=info msg="CreateContainer within sandbox \"a1f592c89c6404b0e83a3a88cde728deaf28654ee76060a5f4613f98ed440a46\" for &ContainerMetadata{Name:tigera-operator,Attempt:2,} returns container id \"17d69fec6eae9e3384aef5c9f529e47b15a57969c43b7d0b34c6278d2f96d809\""
Dec 16 13:45:01.436656 containerd[1779]: time="2025-12-16T13:45:01.436629632Z" level=info msg="StartContainer for \"17d69fec6eae9e3384aef5c9f529e47b15a57969c43b7d0b34c6278d2f96d809\""
Dec 16 13:45:01.437929 containerd[1779]: time="2025-12-16T13:45:01.437660811Z" level=info msg="connecting to shim 17d69fec6eae9e3384aef5c9f529e47b15a57969c43b7d0b34c6278d2f96d809" address="unix:///run/containerd/s/298593836c23173c01e2ec85ee67d4b4e73cf47a88e76a8d6fbdb722e5d8d4d0" protocol=ttrpc version=3
Dec 16 13:45:01.462760 systemd[1]: Started cri-containerd-17d69fec6eae9e3384aef5c9f529e47b15a57969c43b7d0b34c6278d2f96d809.scope - libcontainer container 17d69fec6eae9e3384aef5c9f529e47b15a57969c43b7d0b34c6278d2f96d809.
Dec 16 13:45:01.490313 containerd[1779]: time="2025-12-16T13:45:01.490249536Z" level=info msg="StartContainer for \"17d69fec6eae9e3384aef5c9f529e47b15a57969c43b7d0b34c6278d2f96d809\" returns successfully"
Dec 16 13:45:02.461027 containerd[1779]: time="2025-12-16T13:45:02.460924801Z" level=warning msg="container event discarded" container=6932abcc69ebb7e3dd31ad7720f9abb37eef316f7b252d725b98e66972a2555a type=CONTAINER_CREATED_EVENT
Dec 16 13:45:02.461027 containerd[1779]: time="2025-12-16T13:45:02.460993206Z" level=warning msg="container event discarded" container=6932abcc69ebb7e3dd31ad7720f9abb37eef316f7b252d725b98e66972a2555a type=CONTAINER_STARTED_EVENT
Dec 16 13:45:02.497290 containerd[1779]: time="2025-12-16T13:45:02.497218020Z" level=warning msg="container event discarded" container=43fc1367cb51f2a35f04018d18cf8662b84552019a6d3e3586c0a53a110e5962 type=CONTAINER_CREATED_EVENT
Dec 16 13:45:02.497290 containerd[1779]: time="2025-12-16T13:45:02.497274487Z" level=warning msg="container event discarded" container=43fc1367cb51f2a35f04018d18cf8662b84552019a6d3e3586c0a53a110e5962 type=CONTAINER_STARTED_EVENT
Dec 16 13:45:02.497290 containerd[1779]: time="2025-12-16T13:45:02.497282565Z" level=warning msg="container event discarded" container=99f6b13f3e6707e1e9d9ee2a7b4d2967131b90bef5366a7369a89022dc04ef34 type=CONTAINER_CREATED_EVENT
Dec 16 13:45:02.497290 containerd[1779]: time="2025-12-16T13:45:02.497290843Z" level=warning msg="container event discarded" container=99f6b13f3e6707e1e9d9ee2a7b4d2967131b90bef5366a7369a89022dc04ef34 type=CONTAINER_STARTED_EVENT
Dec 16 13:45:02.497290 containerd[1779]: time="2025-12-16T13:45:02.497296933Z" level=warning msg="container event discarded" container=9834abdb3cdfe3ce070baa5049c5ff831e9caec06fd6a60aad7b84dadaaa0783 type=CONTAINER_CREATED_EVENT
Dec 16 13:45:02.524904 containerd[1779]: time="2025-12-16T13:45:02.524756070Z" level=warning msg="container event discarded" container=bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05 type=CONTAINER_CREATED_EVENT
Dec 16 13:45:02.524904 containerd[1779]: time="2025-12-16T13:45:02.524886401Z" level=warning msg="container event discarded" container=d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb type=CONTAINER_CREATED_EVENT
Dec 16 13:45:02.602316 containerd[1779]: time="2025-12-16T13:45:02.602213117Z" level=warning msg="container event discarded" container=9834abdb3cdfe3ce070baa5049c5ff831e9caec06fd6a60aad7b84dadaaa0783 type=CONTAINER_STARTED_EVENT
Dec 16 13:45:02.614597 containerd[1779]: time="2025-12-16T13:45:02.614510113Z" level=warning msg="container event discarded" container=d2f01c2f8448f7d617d6df94713fbead1ba0e243119dda27fbdb747fc15c7dfb type=CONTAINER_STARTED_EVENT
Dec 16 13:45:02.614597 containerd[1779]: time="2025-12-16T13:45:02.614578871Z" level=warning msg="container event discarded" container=bf54ddf8924781bfc588f42ab8e89aa5cd1ec8b3685bdf4e5c8ef7a9a0196a05 type=CONTAINER_STARTED_EVENT
Dec 16 13:45:05.414811 kubelet[3080]: E1216 13:45:05.414637 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d9cf7b5c4-4vfj5" podUID="4477497f-a609-4791-b674-25df11e8ec73"
Dec 16 13:45:09.415496 kubelet[3080]: E1216 13:45:09.415438 3080 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-zxbn9" podUID="bc92ca59-7a99-4554-9e6f-b60493200978"
Dec 16 13:45:09.863414 kubelet[3080]: E1216 13:45:09.863263 3080 controller.go:195] "Failed to update lease" err="the server was unable to return a response in the time allotted, but may still be processing the request (put leases.coordination.k8s.io ci-4459-2-2-a-7f096d1947)"