Dec 12 17:24:02.764262 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Dec 12 17:24:02.764284 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Dec 12 15:20:48 -00 2025
Dec 12 17:24:02.764294 kernel: KASLR enabled
Dec 12 17:24:02.764300 kernel: efi: EFI v2.7 by EDK II
Dec 12 17:24:02.764305 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438351218
Dec 12 17:24:02.764310 kernel: random: crng init done
Dec 12 17:24:02.764317 kernel: secureboot: Secure boot disabled
Dec 12 17:24:02.764322 kernel: ACPI: Early table checksum verification disabled
Dec 12 17:24:02.764328 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Dec 12 17:24:02.764333 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Dec 12 17:24:02.764340 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:02.764346 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:02.764352 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:02.764358 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:02.764435 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:02.764445 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:02.764454 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:02.764460 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:02.764466 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:02.764472 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:02.764478 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Dec 12 17:24:02.764484 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Dec 12 17:24:02.764490 kernel: ACPI: Use ACPI SPCR as default console: Yes
Dec 12 17:24:02.764496 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Dec 12 17:24:02.764502 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Dec 12 17:24:02.764508 kernel: Zone ranges:
Dec 12 17:24:02.764515 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Dec 12 17:24:02.764521 kernel: DMA32 empty
Dec 12 17:24:02.764527 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Dec 12 17:24:02.764532 kernel: Device empty
Dec 12 17:24:02.764538 kernel: Movable zone start for each node
Dec 12 17:24:02.764544 kernel: Early memory node ranges
Dec 12 17:24:02.764550 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Dec 12 17:24:02.764556 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Dec 12 17:24:02.764562 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Dec 12 17:24:02.764568 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Dec 12 17:24:02.764574 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Dec 12 17:24:02.764580 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Dec 12 17:24:02.764587 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Dec 12 17:24:02.764593 kernel: psci: probing for conduit method from ACPI.
Dec 12 17:24:02.764602 kernel: psci: PSCIv1.3 detected in firmware.
Dec 12 17:24:02.764608 kernel: psci: Using standard PSCI v0.2 function IDs
Dec 12 17:24:02.764614 kernel: psci: Trusted OS migration not required
Dec 12 17:24:02.764622 kernel: psci: SMC Calling Convention v1.1
Dec 12 17:24:02.764628 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Dec 12 17:24:02.764634 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Dec 12 17:24:02.764640 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Dec 12 17:24:02.764647 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Dec 12 17:24:02.764653 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Dec 12 17:24:02.764659 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Dec 12 17:24:02.764666 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Dec 12 17:24:02.764672 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Dec 12 17:24:02.764679 kernel: Detected PIPT I-cache on CPU0
Dec 12 17:24:02.764685 kernel: CPU features: detected: GIC system register CPU interface
Dec 12 17:24:02.764691 kernel: CPU features: detected: Spectre-v4
Dec 12 17:24:02.764699 kernel: CPU features: detected: Spectre-BHB
Dec 12 17:24:02.764705 kernel: CPU features: kernel page table isolation forced ON by KASLR
Dec 12 17:24:02.764712 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Dec 12 17:24:02.764718 kernel: CPU features: detected: ARM erratum 1418040
Dec 12 17:24:02.764724 kernel: CPU features: detected: SSBS not fully self-synchronizing
Dec 12 17:24:02.764731 kernel: alternatives: applying boot alternatives
Dec 12 17:24:02.764738 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52
Dec 12 17:24:02.764745 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Dec 12 17:24:02.764751 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 12 17:24:02.764758 kernel: Fallback order for Node 0: 0
Dec 12 17:24:02.764765 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Dec 12 17:24:02.764772 kernel: Policy zone: Normal
Dec 12 17:24:02.764778 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 12 17:24:02.764784 kernel: software IO TLB: area num 4.
Dec 12 17:24:02.764791 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Dec 12 17:24:02.764797 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Dec 12 17:24:02.764803 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 12 17:24:02.764810 kernel: rcu: RCU event tracing is enabled.
Dec 12 17:24:02.764816 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Dec 12 17:24:02.764823 kernel: Trampoline variant of Tasks RCU enabled.
Dec 12 17:24:02.764829 kernel: Tracing variant of Tasks RCU enabled.
Dec 12 17:24:02.764835 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 12 17:24:02.764843 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Dec 12 17:24:02.764850 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 12 17:24:02.764856 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 12 17:24:02.764862 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Dec 12 17:24:02.764868 kernel: GICv3: 256 SPIs implemented
Dec 12 17:24:02.764875 kernel: GICv3: 0 Extended SPIs implemented
Dec 12 17:24:02.764881 kernel: Root IRQ handler: gic_handle_irq
Dec 12 17:24:02.764887 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Dec 12 17:24:02.764893 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Dec 12 17:24:02.764899 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Dec 12 17:24:02.764906 kernel: ITS [mem 0x08080000-0x0809ffff]
Dec 12 17:24:02.764913 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Dec 12 17:24:02.764920 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Dec 12 17:24:02.764927 kernel: GICv3: using LPI property table @0x0000000100130000
Dec 12 17:24:02.764933 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Dec 12 17:24:02.764939 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 12 17:24:02.764946 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 12 17:24:02.764952 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Dec 12 17:24:02.764959 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Dec 12 17:24:02.764965 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Dec 12 17:24:02.764971 kernel: arm-pv: using stolen time PV
Dec 12 17:24:02.764978 kernel: Console: colour dummy device 80x25
Dec 12 17:24:02.764986 kernel: ACPI: Core revision 20240827
Dec 12 17:24:02.764992 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Dec 12 17:24:02.764999 kernel: pid_max: default: 32768 minimum: 301
Dec 12 17:24:02.765005 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 12 17:24:02.765012 kernel: landlock: Up and running.
Dec 12 17:24:02.765018 kernel: SELinux: Initializing.
Dec 12 17:24:02.765025 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 12 17:24:02.765031 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 12 17:24:02.765038 kernel: rcu: Hierarchical SRCU implementation.
Dec 12 17:24:02.765044 kernel: rcu: Max phase no-delay instances is 400.
Dec 12 17:24:02.765052 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 12 17:24:02.765059 kernel: Remapping and enabling EFI services.
Dec 12 17:24:02.765065 kernel: smp: Bringing up secondary CPUs ...
Dec 12 17:24:02.765072 kernel: Detected PIPT I-cache on CPU1
Dec 12 17:24:02.765078 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Dec 12 17:24:02.765085 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Dec 12 17:24:02.765091 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 12 17:24:02.765098 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Dec 12 17:24:02.765104 kernel: Detected PIPT I-cache on CPU2
Dec 12 17:24:02.765116 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Dec 12 17:24:02.765123 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Dec 12 17:24:02.765130 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 12 17:24:02.765138 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Dec 12 17:24:02.765145 kernel: Detected PIPT I-cache on CPU3
Dec 12 17:24:02.765152 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Dec 12 17:24:02.765159 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Dec 12 17:24:02.765165 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 12 17:24:02.765173 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Dec 12 17:24:02.765180 kernel: smp: Brought up 1 node, 4 CPUs
Dec 12 17:24:02.765187 kernel: SMP: Total of 4 processors activated.
Dec 12 17:24:02.765194 kernel: CPU: All CPU(s) started at EL1
Dec 12 17:24:02.765200 kernel: CPU features: detected: 32-bit EL0 Support
Dec 12 17:24:02.765207 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Dec 12 17:24:02.765214 kernel: CPU features: detected: Common not Private translations
Dec 12 17:24:02.765221 kernel: CPU features: detected: CRC32 instructions
Dec 12 17:24:02.765227 kernel: CPU features: detected: Enhanced Virtualization Traps
Dec 12 17:24:02.765235 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Dec 12 17:24:02.765242 kernel: CPU features: detected: LSE atomic instructions
Dec 12 17:24:02.765249 kernel: CPU features: detected: Privileged Access Never
Dec 12 17:24:02.765256 kernel: CPU features: detected: RAS Extension Support
Dec 12 17:24:02.765263 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Dec 12 17:24:02.765270 kernel: alternatives: applying system-wide alternatives
Dec 12 17:24:02.765276 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Dec 12 17:24:02.765284 kernel: Memory: 16297360K/16777216K available (11200K kernel code, 2456K rwdata, 9084K rodata, 39552K init, 1038K bss, 457072K reserved, 16384K cma-reserved)
Dec 12 17:24:02.765291 kernel: devtmpfs: initialized
Dec 12 17:24:02.765299 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 12 17:24:02.765306 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Dec 12 17:24:02.765313 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Dec 12 17:24:02.765320 kernel: 0 pages in range for non-PLT usage
Dec 12 17:24:02.765327 kernel: 508400 pages in range for PLT usage
Dec 12 17:24:02.765333 kernel: pinctrl core: initialized pinctrl subsystem
Dec 12 17:24:02.765340 kernel: SMBIOS 3.0.0 present.
Dec 12 17:24:02.765347 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Dec 12 17:24:02.765354 kernel: DMI: Memory slots populated: 1/1
Dec 12 17:24:02.765362 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 12 17:24:02.765376 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Dec 12 17:24:02.765383 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 12 17:24:02.765390 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 12 17:24:02.765397 kernel: audit: initializing netlink subsys (disabled)
Dec 12 17:24:02.765404 kernel: audit: type=2000 audit(0.040:1): state=initialized audit_enabled=0 res=1
Dec 12 17:24:02.765411 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 12 17:24:02.765418 kernel: cpuidle: using governor menu
Dec 12 17:24:02.765425 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Dec 12 17:24:02.765434 kernel: ASID allocator initialised with 32768 entries
Dec 12 17:24:02.765441 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 12 17:24:02.765448 kernel: Serial: AMBA PL011 UART driver
Dec 12 17:24:02.765455 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 12 17:24:02.765462 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Dec 12 17:24:02.765469 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Dec 12 17:24:02.765476 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Dec 12 17:24:02.765483 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 12 17:24:02.765490 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Dec 12 17:24:02.765499 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Dec 12 17:24:02.765505 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Dec 12 17:24:02.765512 kernel: ACPI: Added _OSI(Module Device)
Dec 12 17:24:02.765519 kernel: ACPI: Added _OSI(Processor Device)
Dec 12 17:24:02.765526 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 12 17:24:02.765533 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 12 17:24:02.765540 kernel: ACPI: Interpreter enabled
Dec 12 17:24:02.765547 kernel: ACPI: Using GIC for interrupt routing
Dec 12 17:24:02.765554 kernel: ACPI: MCFG table detected, 1 entries
Dec 12 17:24:02.765562 kernel: ACPI: CPU0 has been hot-added
Dec 12 17:24:02.765569 kernel: ACPI: CPU1 has been hot-added
Dec 12 17:24:02.765575 kernel: ACPI: CPU2 has been hot-added
Dec 12 17:24:02.765582 kernel: ACPI: CPU3 has been hot-added
Dec 12 17:24:02.765589 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Dec 12 17:24:02.765596 kernel: printk: legacy console [ttyAMA0] enabled
Dec 12 17:24:02.765603 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 12 17:24:02.765751 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 12 17:24:02.765818 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Dec 12 17:24:02.765877 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Dec 12 17:24:02.765933 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Dec 12 17:24:02.765990 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Dec 12 17:24:02.765999 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Dec 12 17:24:02.766006 kernel: PCI host bridge to bus 0000:00
Dec 12 17:24:02.766073 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Dec 12 17:24:02.766126 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Dec 12 17:24:02.766179 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Dec 12 17:24:02.766230 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 12 17:24:02.766303 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Dec 12 17:24:02.766402 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.766473 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Dec 12 17:24:02.766532 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Dec 12 17:24:02.766595 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Dec 12 17:24:02.766652 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Dec 12 17:24:02.766718 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.766776 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Dec 12 17:24:02.766835 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Dec 12 17:24:02.766892 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Dec 12 17:24:02.766956 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.767015 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Dec 12 17:24:02.767073 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Dec 12 17:24:02.767130 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Dec 12 17:24:02.767187 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Dec 12 17:24:02.767253 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.767311 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Dec 12 17:24:02.767377 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Dec 12 17:24:02.767443 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Dec 12 17:24:02.767512 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.767570 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Dec 12 17:24:02.767627 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Dec 12 17:24:02.767684 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Dec 12 17:24:02.767740 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Dec 12 17:24:02.767805 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.767865 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Dec 12 17:24:02.767922 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Dec 12 17:24:02.767978 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Dec 12 17:24:02.768041 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Dec 12 17:24:02.768111 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.768169 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Dec 12 17:24:02.768245 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Dec 12 17:24:02.768318 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.768428 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Dec 12 17:24:02.768491 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Dec 12 17:24:02.768556 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.768615 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Dec 12 17:24:02.768675 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Dec 12 17:24:02.768742 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.768803 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Dec 12 17:24:02.768861 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Dec 12 17:24:02.768925 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.768983 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Dec 12 17:24:02.769041 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Dec 12 17:24:02.769104 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.769164 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Dec 12 17:24:02.769221 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Dec 12 17:24:02.769284 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.769342 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Dec 12 17:24:02.769410 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Dec 12 17:24:02.769496 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.769560 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Dec 12 17:24:02.769618 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Dec 12 17:24:02.769684 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.769743 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Dec 12 17:24:02.769800 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Dec 12 17:24:02.769863 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.769921 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Dec 12 17:24:02.769980 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Dec 12 17:24:02.770044 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.770103 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Dec 12 17:24:02.770160 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Dec 12 17:24:02.770222 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.770281 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Dec 12 17:24:02.770338 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Dec 12 17:24:02.770407 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Dec 12 17:24:02.770466 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Dec 12 17:24:02.770531 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.770591 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Dec 12 17:24:02.770648 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Dec 12 17:24:02.770707 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Dec 12 17:24:02.770775 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Dec 12 17:24:02.770845 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.770905 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Dec 12 17:24:02.770964 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Dec 12 17:24:02.771021 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Dec 12 17:24:02.771079 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Dec 12 17:24:02.771146 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.771206 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Dec 12 17:24:02.771271 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Dec 12 17:24:02.771329 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Dec 12 17:24:02.771406 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Dec 12 17:24:02.771476 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.771541 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Dec 12 17:24:02.771604 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Dec 12 17:24:02.771662 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Dec 12 17:24:02.771721 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Dec 12 17:24:02.771800 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.771865 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Dec 12 17:24:02.771933 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Dec 12 17:24:02.771999 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Dec 12 17:24:02.772067 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Dec 12 17:24:02.772132 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.772236 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Dec 12 17:24:02.772307 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Dec 12 17:24:02.772378 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Dec 12 17:24:02.772444 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Dec 12 17:24:02.772527 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.772590 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Dec 12 17:24:02.772652 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Dec 12 17:24:02.772711 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Dec 12 17:24:02.772787 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Dec 12 17:24:02.772851 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.772911 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Dec 12 17:24:02.772968 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Dec 12 17:24:02.773025 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Dec 12 17:24:02.773082 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Dec 12 17:24:02.773145 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.773205 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Dec 12 17:24:02.773263 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Dec 12 17:24:02.773321 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Dec 12 17:24:02.773386 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Dec 12 17:24:02.773459 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.773518 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Dec 12 17:24:02.773577 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Dec 12 17:24:02.773634 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Dec 12 17:24:02.773694 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Dec 12 17:24:02.773760 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.773819 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Dec 12 17:24:02.773878 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Dec 12 17:24:02.773936 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Dec 12 17:24:02.774003 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Dec 12 17:24:02.774069 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.774128 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Dec 12 17:24:02.774185 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Dec 12 17:24:02.774242 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Dec 12 17:24:02.774299 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Dec 12 17:24:02.774385 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.774447 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Dec 12 17:24:02.774506 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Dec 12 17:24:02.774564 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Dec 12 17:24:02.774622 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Dec 12 17:24:02.774686 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.774746 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Dec 12 17:24:02.774807 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Dec 12 17:24:02.774864 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Dec 12 17:24:02.774923 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Dec 12 17:24:02.774987 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:02.775045 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Dec 12 17:24:02.775103 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Dec 12 17:24:02.775159 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Dec 12 17:24:02.775219 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Dec 12 17:24:02.775288 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 12 17:24:02.775354 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Dec 12 17:24:02.775430 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Dec 12 17:24:02.775491 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Dec 12 17:24:02.775558 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Dec 12 17:24:02.775618 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Dec 12 17:24:02.775687 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Dec 12 17:24:02.775753 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Dec 12 17:24:02.775816 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Dec 12 17:24:02.775886 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 12 17:24:02.775949 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Dec 12 17:24:02.776018 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 12 17:24:02.776086 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Dec 12 17:24:02.776155 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Dec 12 17:24:02.776250 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Dec 12 17:24:02.776352 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Dec 12 17:24:02.776427 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Dec 12 17:24:02.776507 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Dec 12 17:24:02.776571 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Dec 12 17:24:02.776631 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Dec 12 17:24:02.776693 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Dec 12 17:24:02.776752 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Dec 12 17:24:02.776809 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Dec 12 17:24:02.776872 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 12 17:24:02.776933 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Dec 12 17:24:02.776998 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Dec 12 17:24:02.777062 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 12 17:24:02.777120 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Dec 12 17:24:02.777178 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Dec 12 17:24:02.777238 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 12 17:24:02.777298 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Dec 12 17:24:02.777356 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Dec 12 17:24:02.777429 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 12 17:24:02.777490 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Dec 12 17:24:02.777548 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Dec 12 17:24:02.777610 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 12 17:24:02.777668 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Dec 12 17:24:02.777727 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Dec 12 17:24:02.777788 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 12 17:24:02.777846 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Dec 12 17:24:02.777903 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Dec 12 17:24:02.777965 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 12 17:24:02.778023 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Dec 12 17:24:02.778080 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Dec 12 17:24:02.778153 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Dec 12 17:24:02.778213 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Dec 12 17:24:02.778270 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Dec 12 17:24:02.778333 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Dec 12 17:24:02.778413 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Dec 12 17:24:02.778474 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Dec 12 17:24:02.778540 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Dec 12 17:24:02.778599 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Dec 12 17:24:02.778656 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Dec 12 17:24:02.778717 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Dec 12 17:24:02.778776 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Dec 12 17:24:02.778833 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Dec 12
17:24:02.778894 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Dec 12 17:24:02.778954 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Dec 12 17:24:02.779011 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Dec 12 17:24:02.779076 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Dec 12 17:24:02.779134 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Dec 12 17:24:02.779192 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Dec 12 17:24:02.779255 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Dec 12 17:24:02.779313 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Dec 12 17:24:02.779380 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Dec 12 17:24:02.779445 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Dec 12 17:24:02.779503 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Dec 12 17:24:02.779560 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Dec 12 17:24:02.779622 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Dec 12 17:24:02.779680 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Dec 12 17:24:02.779739 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Dec 12 17:24:02.779802 kernel: pci 0000:00:03.2: 
bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Dec 12 17:24:02.779860 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Dec 12 17:24:02.779917 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Dec 12 17:24:02.779979 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Dec 12 17:24:02.780038 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Dec 12 17:24:02.780095 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Dec 12 17:24:02.780159 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Dec 12 17:24:02.780232 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Dec 12 17:24:02.780293 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Dec 12 17:24:02.780356 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Dec 12 17:24:02.780434 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Dec 12 17:24:02.780498 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Dec 12 17:24:02.780559 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Dec 12 17:24:02.780622 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Dec 12 17:24:02.780679 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Dec 12 17:24:02.780741 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] 
add_size 1000 Dec 12 17:24:02.780799 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Dec 12 17:24:02.780856 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Dec 12 17:24:02.780917 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Dec 12 17:24:02.780975 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Dec 12 17:24:02.781034 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Dec 12 17:24:02.781096 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Dec 12 17:24:02.781154 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Dec 12 17:24:02.781211 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Dec 12 17:24:02.781272 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Dec 12 17:24:02.781330 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Dec 12 17:24:02.781400 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Dec 12 17:24:02.781463 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Dec 12 17:24:02.781521 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Dec 12 17:24:02.781578 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Dec 12 17:24:02.781641 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Dec 12 17:24:02.781702 kernel: 
pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Dec 12 17:24:02.781759 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Dec 12 17:24:02.781822 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Dec 12 17:24:02.781881 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Dec 12 17:24:02.781938 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Dec 12 17:24:02.781999 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Dec 12 17:24:02.782057 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Dec 12 17:24:02.782121 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Dec 12 17:24:02.782185 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Dec 12 17:24:02.782245 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Dec 12 17:24:02.782303 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Dec 12 17:24:02.782363 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Dec 12 17:24:02.782436 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Dec 12 17:24:02.782496 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Dec 12 17:24:02.782556 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Dec 12 17:24:02.782614 kernel: pci 0000:00:01.0: bridge window [mem 
0x8000000000-0x80001fffff 64bit pref]: assigned Dec 12 17:24:02.782676 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Dec 12 17:24:02.782735 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Dec 12 17:24:02.782795 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Dec 12 17:24:02.782853 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Dec 12 17:24:02.782913 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Dec 12 17:24:02.782971 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Dec 12 17:24:02.783030 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Dec 12 17:24:02.783088 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Dec 12 17:24:02.783151 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Dec 12 17:24:02.783209 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Dec 12 17:24:02.783268 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Dec 12 17:24:02.783326 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Dec 12 17:24:02.783394 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Dec 12 17:24:02.783454 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Dec 12 17:24:02.783513 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Dec 12 17:24:02.783573 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Dec 12 17:24:02.783633 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Dec 12 17:24:02.783690 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: 
assigned Dec 12 17:24:02.783750 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Dec 12 17:24:02.783807 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Dec 12 17:24:02.783866 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Dec 12 17:24:02.783924 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Dec 12 17:24:02.783984 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Dec 12 17:24:02.784044 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Dec 12 17:24:02.784104 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Dec 12 17:24:02.784164 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Dec 12 17:24:02.784243 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Dec 12 17:24:02.784306 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Dec 12 17:24:02.784378 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Dec 12 17:24:02.784439 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Dec 12 17:24:02.784499 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Dec 12 17:24:02.784560 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Dec 12 17:24:02.784620 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Dec 12 17:24:02.784678 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Dec 12 17:24:02.784739 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Dec 12 17:24:02.784797 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Dec 12 17:24:02.784857 kernel: pci 
0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Dec 12 17:24:02.784915 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Dec 12 17:24:02.784975 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Dec 12 17:24:02.785036 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Dec 12 17:24:02.785097 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Dec 12 17:24:02.785154 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Dec 12 17:24:02.785216 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Dec 12 17:24:02.785275 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Dec 12 17:24:02.785339 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Dec 12 17:24:02.785406 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Dec 12 17:24:02.785472 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Dec 12 17:24:02.785533 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Dec 12 17:24:02.785593 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Dec 12 17:24:02.785653 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Dec 12 17:24:02.785714 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Dec 12 17:24:02.785772 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Dec 12 17:24:02.785833 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Dec 12 17:24:02.785892 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Dec 12 17:24:02.785954 kernel: pci 0000:00:04.4: bridge window [mem 
0x13800000-0x139fffff]: assigned Dec 12 17:24:02.786012 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Dec 12 17:24:02.786072 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Dec 12 17:24:02.786130 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Dec 12 17:24:02.786190 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Dec 12 17:24:02.786249 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Dec 12 17:24:02.786309 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Dec 12 17:24:02.786377 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Dec 12 17:24:02.786444 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Dec 12 17:24:02.786504 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Dec 12 17:24:02.786566 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Dec 12 17:24:02.786626 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Dec 12 17:24:02.786689 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Dec 12 17:24:02.786772 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Dec 12 17:24:02.786833 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Dec 12 17:24:02.786894 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Dec 12 17:24:02.786957 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Dec 12 17:24:02.787015 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Dec 12 17:24:02.787081 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Dec 12 17:24:02.787140 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Dec 12 17:24:02.787294 kernel: pci 0000:00:01.5: BAR 0 
[mem 0x14205000-0x14205fff]: assigned Dec 12 17:24:02.787360 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Dec 12 17:24:02.787448 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Dec 12 17:24:02.787509 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Dec 12 17:24:02.787579 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Dec 12 17:24:02.787637 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Dec 12 17:24:02.787787 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Dec 12 17:24:02.787856 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Dec 12 17:24:02.787920 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Dec 12 17:24:02.787980 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Dec 12 17:24:02.788042 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Dec 12 17:24:02.788157 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Dec 12 17:24:02.788250 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Dec 12 17:24:02.788363 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Dec 12 17:24:02.788469 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Dec 12 17:24:02.788576 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Dec 12 17:24:02.788652 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Dec 12 17:24:02.788714 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Dec 12 17:24:02.788780 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Dec 12 17:24:02.788846 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Dec 12 17:24:02.788909 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Dec 12 17:24:02.788967 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 
17:24:02.789024 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.789085 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Dec 12 17:24:02.789144 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.789206 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.789283 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Dec 12 17:24:02.789344 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.789438 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.789521 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Dec 12 17:24:02.789582 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.789642 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.789705 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Dec 12 17:24:02.789766 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.789903 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.789977 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Dec 12 17:24:02.790037 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.790095 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.790157 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Dec 12 17:24:02.790215 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.790277 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.790347 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Dec 12 17:24:02.790421 kernel: pci 
0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.790482 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.790545 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Dec 12 17:24:02.790603 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.790662 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.790727 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Dec 12 17:24:02.790791 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.790852 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.790913 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Dec 12 17:24:02.790972 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.791030 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.791092 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Dec 12 17:24:02.791151 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.791208 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.791272 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Dec 12 17:24:02.791332 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.791409 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.791473 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Dec 12 17:24:02.791532 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.791589 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.791664 kernel: pci 0000:00:04.5: BAR 0 [mem 
0x1421d000-0x1421dfff]: assigned Dec 12 17:24:02.791722 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.791782 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.791844 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Dec 12 17:24:02.791903 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.791960 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.792021 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Dec 12 17:24:02.792080 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.792137 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.792214 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Dec 12 17:24:02.792287 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.792348 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.792418 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Dec 12 17:24:02.792480 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Dec 12 17:24:02.792540 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Dec 12 17:24:02.792601 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Dec 12 17:24:02.792663 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Dec 12 17:24:02.792723 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Dec 12 17:24:02.792781 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Dec 12 17:24:02.792841 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Dec 12 17:24:02.792902 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Dec 12 17:24:02.792961 kernel: pci 0000:00:03.7: 
bridge window [io 0xa000-0xafff]: assigned Dec 12 17:24:02.793020 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Dec 12 17:24:02.793079 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Dec 12 17:24:02.793139 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Dec 12 17:24:02.793198 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Dec 12 17:24:02.793259 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Dec 12 17:24:02.793319 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.793393 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.793457 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.793516 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.793575 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.793633 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.793695 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.793755 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.793816 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.793877 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.793939 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.793997 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.794057 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.794115 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.794175 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: 
can't assign; no space Dec 12 17:24:02.794235 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.794297 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.794358 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.794429 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.794490 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.794551 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.794609 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.794672 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.794732 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.794795 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.794856 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.794917 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.794976 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.795038 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.795097 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.795159 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.795221 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.795284 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.795343 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.795449 kernel: pci 0000:00:01.0: bridge 
window [io size 0x1000]: can't assign; no space Dec 12 17:24:02.795514 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Dec 12 17:24:02.795581 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Dec 12 17:24:02.795646 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Dec 12 17:24:02.795710 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Dec 12 17:24:02.795768 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Dec 12 17:24:02.795828 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Dec 12 17:24:02.795886 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Dec 12 17:24:02.796022 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Dec 12 17:24:02.796094 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Dec 12 17:24:02.796159 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Dec 12 17:24:02.796241 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Dec 12 17:24:02.796321 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Dec 12 17:24:02.796401 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Dec 12 17:24:02.796466 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Dec 12 17:24:02.796541 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Dec 12 17:24:02.796601 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Dec 12 17:24:02.796667 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Dec 12 17:24:02.796731 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Dec 12 17:24:02.796790 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Dec 12 17:24:02.796850 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Dec 12 17:24:02.796917 kernel: pci 0000:05:00.0: BAR 4 [mem 
0x8000800000-0x8000803fff 64bit pref]: assigned
Dec 12 17:24:02.796978 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned
Dec 12 17:24:02.797051 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Dec 12 17:24:02.797115 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]
Dec 12 17:24:02.797176 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Dec 12 17:24:02.797242 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Dec 12 17:24:02.797303 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Dec 12 17:24:02.797361 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Dec 12 17:24:02.797433 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]
Dec 12 17:24:02.797492 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Dec 12 17:24:02.797553 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Dec 12 17:24:02.797612 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]
Dec 12 17:24:02.797674 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Dec 12 17:24:02.797734 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Dec 12 17:24:02.797794 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]
Dec 12 17:24:02.797852 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Dec 12 17:24:02.797913 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Dec 12 17:24:02.797972 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]
Dec 12 17:24:02.798030 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Dec 12 17:24:02.798094 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Dec 12 17:24:02.798153 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]
Dec 12 17:24:02.798211 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]
Dec 12 17:24:02.798273 kernel: pci
0000:00:02.2: PCI bridge to [bus 0b]
Dec 12 17:24:02.798334 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]
Dec 12 17:24:02.798402 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]
Dec 12 17:24:02.798471 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Dec 12 17:24:02.798535 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]
Dec 12 17:24:02.798598 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]
Dec 12 17:24:02.798678 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Dec 12 17:24:02.798755 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]
Dec 12 17:24:02.798817 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]
Dec 12 17:24:02.798885 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Dec 12 17:24:02.798944 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]
Dec 12 17:24:02.799002 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]
Dec 12 17:24:02.799062 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Dec 12 17:24:02.799121 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]
Dec 12 17:24:02.799180 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]
Dec 12 17:24:02.799243 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Dec 12 17:24:02.799303 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]
Dec 12 17:24:02.799361 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]
Dec 12 17:24:02.799432 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Dec 12 17:24:02.799492 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]
Dec 12 17:24:02.799549 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]
Dec 12 17:24:02.799608 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Dec 12 17:24:02.799675 kernel: pci 0000:00:03.1: bridge window [mem
0x12200000-0x123fffff]
Dec 12 17:24:02.799733 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]
Dec 12 17:24:02.799795 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Dec 12 17:24:02.799854 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]
Dec 12 17:24:02.799911 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]
Dec 12 17:24:02.799969 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]
Dec 12 17:24:02.800031 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Dec 12 17:24:02.800090 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]
Dec 12 17:24:02.800150 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]
Dec 12 17:24:02.800219 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]
Dec 12 17:24:02.800288 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Dec 12 17:24:02.800347 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]
Dec 12 17:24:02.800419 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]
Dec 12 17:24:02.800479 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]
Dec 12 17:24:02.800541 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Dec 12 17:24:02.800603 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]
Dec 12 17:24:02.800663 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]
Dec 12 17:24:02.800726 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]
Dec 12 17:24:02.800789 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Dec 12 17:24:02.800850 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]
Dec 12 17:24:02.800908 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]
Dec 12 17:24:02.800966 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]
Dec 12 17:24:02.801029 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Dec 12 17:24:02.801088 kernel: pci
0000:00:03.7: bridge window [io 0xa000-0xafff]
Dec 12 17:24:02.801147 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]
Dec 12 17:24:02.801213 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]
Dec 12 17:24:02.801274 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Dec 12 17:24:02.801336 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]
Dec 12 17:24:02.801405 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]
Dec 12 17:24:02.801467 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]
Dec 12 17:24:02.801532 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Dec 12 17:24:02.801591 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]
Dec 12 17:24:02.801650 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]
Dec 12 17:24:02.801711 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]
Dec 12 17:24:02.801774 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Dec 12 17:24:02.801834 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]
Dec 12 17:24:02.801891 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]
Dec 12 17:24:02.801948 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]
Dec 12 17:24:02.802009 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Dec 12 17:24:02.802067 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]
Dec 12 17:24:02.802125 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]
Dec 12 17:24:02.802185 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]
Dec 12 17:24:02.802247 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Dec 12 17:24:02.802307 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]
Dec 12 17:24:02.802371 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff]
Dec 12 17:24:02.802437 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]
Dec 12 17:24:02.802499 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Dec 12 17:24:02.802558 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]
Dec 12 17:24:02.802616 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]
Dec 12 17:24:02.802673 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]
Dec 12 17:24:02.802737 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Dec 12 17:24:02.802795 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]
Dec 12 17:24:02.802853 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]
Dec 12 17:24:02.802910 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]
Dec 12 17:24:02.802971 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Dec 12 17:24:02.803030 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]
Dec 12 17:24:02.803088 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]
Dec 12 17:24:02.803145 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]
Dec 12 17:24:02.803208 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Dec 12 17:24:02.803267 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]
Dec 12 17:24:02.803324 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]
Dec 12 17:24:02.803396 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]
Dec 12 17:24:02.803461 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Dec 12 17:24:02.803513 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Dec 12 17:24:02.803565 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Dec 12 17:24:02.803638 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Dec 12 17:24:02.803697 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Dec 12 17:24:02.803758 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Dec 12 17:24:02.803812 kernel: pci_bus
0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Dec 12 17:24:02.803873 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Dec 12 17:24:02.803932 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Dec 12 17:24:02.804003 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Dec 12 17:24:02.804062 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Dec 12 17:24:02.804122 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Dec 12 17:24:02.804177 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Dec 12 17:24:02.804255 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Dec 12 17:24:02.804313 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Dec 12 17:24:02.804386 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Dec 12 17:24:02.804447 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Dec 12 17:24:02.804510 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
Dec 12 17:24:02.804567 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
Dec 12 17:24:02.804633 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
Dec 12 17:24:02.804689 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
Dec 12 17:24:02.804753 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff]
Dec 12 17:24:02.804811 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref]
Dec 12 17:24:02.804877 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff]
Dec 12 17:24:02.804931 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref]
Dec 12 17:24:02.804991 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff]
Dec 12 17:24:02.805045 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref]
Dec 12 17:24:02.805106 kernel: pci_bus
0000:0d: resource 1 [mem 0x11800000-0x119fffff]
Dec 12 17:24:02.805162 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref]
Dec 12 17:24:02.805222 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff]
Dec 12 17:24:02.805276 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref]
Dec 12 17:24:02.805337 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff]
Dec 12 17:24:02.805411 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref]
Dec 12 17:24:02.805474 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff]
Dec 12 17:24:02.805532 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref]
Dec 12 17:24:02.805592 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff]
Dec 12 17:24:02.805647 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref]
Dec 12 17:24:02.805712 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff]
Dec 12 17:24:02.805766 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref]
Dec 12 17:24:02.805830 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff]
Dec 12 17:24:02.806040 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff]
Dec 12 17:24:02.806110 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref]
Dec 12 17:24:02.806173 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff]
Dec 12 17:24:02.806228 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff]
Dec 12 17:24:02.806282 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref]
Dec 12 17:24:02.806342 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff]
Dec 12 17:24:02.806425 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff]
Dec 12 17:24:02.806482 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref]
Dec 12 17:24:02.806543 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff]
Dec 12 17:24:02.806596
kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff]
Dec 12 17:24:02.806650 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref]
Dec 12 17:24:02.806710 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff]
Dec 12 17:24:02.806764 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff]
Dec 12 17:24:02.806820 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref]
Dec 12 17:24:02.806881 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff]
Dec 12 17:24:02.806935 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff]
Dec 12 17:24:02.806988 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref]
Dec 12 17:24:02.807048 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff]
Dec 12 17:24:02.807102 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff]
Dec 12 17:24:02.807154 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref]
Dec 12 17:24:02.807217 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff]
Dec 12 17:24:02.807271 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff]
Dec 12 17:24:02.807324 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref]
Dec 12 17:24:02.807404 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff]
Dec 12 17:24:02.807462 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff]
Dec 12 17:24:02.807515 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref]
Dec 12 17:24:02.807577 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff]
Dec 12 17:24:02.807634 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff]
Dec 12 17:24:02.807689 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref]
Dec 12 17:24:02.807751 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff]
Dec 12 17:24:02.807805 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff]
Dec 12 17:24:02.807859 kernel: pci_bus 0000:1d: resource 2 [mem
0x8003800000-0x80039fffff 64bit pref]
Dec 12 17:24:02.807919 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff]
Dec 12 17:24:02.807975 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff]
Dec 12 17:24:02.808028 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref]
Dec 12 17:24:02.808088 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff]
Dec 12 17:24:02.808142 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff]
Dec 12 17:24:02.808208 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref]
Dec 12 17:24:02.808275 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff]
Dec 12 17:24:02.808330 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff]
Dec 12 17:24:02.808398 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref]
Dec 12 17:24:02.808463 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff]
Dec 12 17:24:02.808517 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff]
Dec 12 17:24:02.808572 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref]
Dec 12 17:24:02.808581 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Dec 12 17:24:02.808589 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Dec 12 17:24:02.808596 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Dec 12 17:24:02.808604 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Dec 12 17:24:02.808613 kernel: iommu: Default domain type: Translated
Dec 12 17:24:02.808621 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Dec 12 17:24:02.808628 kernel: efivars: Registered efivars operations
Dec 12 17:24:02.808635 kernel: vgaarb: loaded
Dec 12 17:24:02.808642 kernel: clocksource: Switched to clocksource arch_sys_counter
Dec 12 17:24:02.808650 kernel: VFS: Disk quotas dquot_6.6.0
Dec 12 17:24:02.808657 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 12 17:24:02.808664 kernel: pnp: PnP ACPI
init
Dec 12 17:24:02.808735 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Dec 12 17:24:02.808748 kernel: pnp: PnP ACPI: found 1 devices
Dec 12 17:24:02.808755 kernel: NET: Registered PF_INET protocol family
Dec 12 17:24:02.808762 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 12 17:24:02.808770 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Dec 12 17:24:02.808777 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 12 17:24:02.808785 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 12 17:24:02.808792 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
Dec 12 17:24:02.808799 kernel: TCP: Hash tables configured (established 131072 bind 65536)
Dec 12 17:24:02.808808 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 12 17:24:02.808816 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 12 17:24:02.808823 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 12 17:24:02.808890 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002)
Dec 12 17:24:02.808901 kernel: PCI: CLS 0 bytes, default 64
Dec 12 17:24:02.808908 kernel: kvm [1]: HYP mode not available
Dec 12 17:24:02.808916 kernel: Initialise system trusted keyrings
Dec 12 17:24:02.808923 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0
Dec 12 17:24:02.808930 kernel: Key type asymmetric registered
Dec 12 17:24:02.808939 kernel: Asymmetric key parser 'x509' registered
Dec 12 17:24:02.808946 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Dec 12 17:24:02.808954 kernel: io scheduler mq-deadline registered
Dec 12 17:24:02.808961 kernel: io scheduler kyber registered
Dec 12 17:24:02.808968 kernel: io scheduler bfq registered
Dec 12 17:24:02.808976 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Dec 12
17:24:02.809037 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50
Dec 12 17:24:02.809096 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50
Dec 12 17:24:02.809154 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.809217 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51
Dec 12 17:24:02.809276 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51
Dec 12 17:24:02.809333 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.809419 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52
Dec 12 17:24:02.809481 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52
Dec 12 17:24:02.809539 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.809601 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53
Dec 12 17:24:02.809663 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53
Dec 12 17:24:02.809722 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.809783 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54
Dec 12 17:24:02.809841 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54
Dec 12 17:24:02.809899 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.809960 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55
Dec 12 17:24:02.810019 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55
Dec 12 17:24:02.810076 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.810139
kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56
Dec 12 17:24:02.810198 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56
Dec 12 17:24:02.810256 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.810318 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57
Dec 12 17:24:02.810394 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57
Dec 12 17:24:02.810457 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.810467 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37
Dec 12 17:24:02.810526 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58
Dec 12 17:24:02.810588 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58
Dec 12 17:24:02.810646 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.810707 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59
Dec 12 17:24:02.810765 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59
Dec 12 17:24:02.810823 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.810883 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60
Dec 12 17:24:02.810942 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60
Dec 12 17:24:02.811000 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.811062 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61
Dec 12 17:24:02.811121 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61
Dec 12 17:24:02.811178 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+
NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.811239 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62
Dec 12 17:24:02.811296 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62
Dec 12 17:24:02.811354 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.811429 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63
Dec 12 17:24:02.811492 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63
Dec 12 17:24:02.811550 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.811612 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64
Dec 12 17:24:02.811671 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64
Dec 12 17:24:02.811729 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.811791 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65
Dec 12 17:24:02.811850 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65
Dec 12 17:24:02.811909 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.811922 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38
Dec 12 17:24:02.811984 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66
Dec 12 17:24:02.812044 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66
Dec 12 17:24:02.812102 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.812163 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67
Dec 12 17:24:02.812239 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67
Dec 12 17:24:02.812301 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.812378 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68
Dec 12 17:24:02.812450 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68
Dec 12 17:24:02.812510 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.812572 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69
Dec 12 17:24:02.812631 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69
Dec 12 17:24:02.812689 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.812749 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70
Dec 12 17:24:02.812808 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70
Dec 12 17:24:02.812866 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.812930 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71
Dec 12 17:24:02.812989 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71
Dec 12 17:24:02.813048 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.813109 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72
Dec 12 17:24:02.813169 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72
Dec 12 17:24:02.813229 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.813294 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73
Dec 12 17:24:02.813356 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73
Dec 12 17:24:02.813425 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.813449 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35
Dec 12 17:24:02.813510 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74
Dec 12 17:24:02.813569 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74
Dec 12 17:24:02.813630 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.813691 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75
Dec 12 17:24:02.813749 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75
Dec 12 17:24:02.813809 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.813870 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76
Dec 12 17:24:02.813929 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76
Dec 12 17:24:02.813986 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.814047 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77
Dec 12 17:24:02.814112 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77
Dec 12 17:24:02.814174 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.814236 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78
Dec 12 17:24:02.814298 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78
Dec 12 17:24:02.814356 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.814431 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79
Dec 12 17:24:02.814498 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79
Dec 12 17:24:02.814558 kernel: pcieport
0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.814622 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80
Dec 12 17:24:02.814682 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80
Dec 12 17:24:02.814740 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.814804 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81
Dec 12 17:24:02.814863 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81
Dec 12 17:24:02.814920 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.814981 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82
Dec 12 17:24:02.815039 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82
Dec 12 17:24:02.815104 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 17:24:02.815115 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Dec 12 17:24:02.815124 kernel: ACPI: button: Power Button [PWRB]
Dec 12 17:24:02.815189 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002)
Dec 12 17:24:02.815255 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002)
Dec 12 17:24:02.815265 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 12 17:24:02.815273 kernel: thunder_xcv, ver 1.0
Dec 12 17:24:02.815280 kernel: thunder_bgx, ver 1.0
Dec 12 17:24:02.815287 kernel: nicpf, ver 1.0
Dec 12 17:24:02.815294 kernel: nicvf, ver 1.0
Dec 12 17:24:02.815373 kernel: rtc-efi rtc-efi.0: registered as rtc0
Dec 12 17:24:02.815436 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-12T17:24:02 UTC (1765560242)
Dec 12 17:24:02.815446 kernel: hid: raw HID events driver (C) Jiri
Kosina
Dec 12 17:24:02.815454 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Dec 12 17:24:02.815461 kernel: watchdog: NMI not fully supported
Dec 12 17:24:02.815469 kernel: watchdog: Hard watchdog permanently disabled
Dec 12 17:24:02.815476 kernel: NET: Registered PF_INET6 protocol family
Dec 12 17:24:02.815483 kernel: Segment Routing with IPv6
Dec 12 17:24:02.815491 kernel: In-situ OAM (IOAM) with IPv6
Dec 12 17:24:02.815498 kernel: NET: Registered PF_PACKET protocol family
Dec 12 17:24:02.815507 kernel: Key type dns_resolver registered
Dec 12 17:24:02.815514 kernel: registered taskstats version 1
Dec 12 17:24:02.815522 kernel: Loading compiled-in X.509 certificates
Dec 12 17:24:02.815529 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 92f3a94fb747a7ba7cbcfde1535be91b86f9429a'
Dec 12 17:24:02.815536 kernel: Demotion targets for Node 0: null
Dec 12 17:24:02.815543 kernel: Key type .fscrypt registered
Dec 12 17:24:02.815550 kernel: Key type fscrypt-provisioning registered
Dec 12 17:24:02.815558 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 12 17:24:02.815565 kernel: ima: Allocated hash algorithm: sha1
Dec 12 17:24:02.815573 kernel: ima: No architecture policies found
Dec 12 17:24:02.815581 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Dec 12 17:24:02.815588 kernel: clk: Disabling unused clocks
Dec 12 17:24:02.815595 kernel: PM: genpd: Disabling unused power domains
Dec 12 17:24:02.815602 kernel: Warning: unable to open an initial console.
Dec 12 17:24:02.815610 kernel: Freeing unused kernel memory: 39552K
Dec 12 17:24:02.815617 kernel: Run /init as init process
Dec 12 17:24:02.815624 kernel: with arguments:
Dec 12 17:24:02.815631 kernel: /init
Dec 12 17:24:02.815640 kernel: with environment:
Dec 12 17:24:02.815647 kernel: HOME=/
Dec 12 17:24:02.815654 kernel: TERM=linux
Dec 12 17:24:02.815662 systemd[1]: Successfully made /usr/ read-only.
Dec 12 17:24:02.815672 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 12 17:24:02.815681 systemd[1]: Detected virtualization kvm.
Dec 12 17:24:02.815688 systemd[1]: Detected architecture arm64.
Dec 12 17:24:02.815697 systemd[1]: Running in initrd.
Dec 12 17:24:02.815704 systemd[1]: No hostname configured, using default hostname.
Dec 12 17:24:02.815712 systemd[1]: Hostname set to .
Dec 12 17:24:02.815720 systemd[1]: Initializing machine ID from VM UUID.
Dec 12 17:24:02.815728 systemd[1]: Queued start job for default target initrd.target.
Dec 12 17:24:02.815735 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 12 17:24:02.815751 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 12 17:24:02.815761 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Dec 12 17:24:02.815769 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 12 17:24:02.815777 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Dec 12 17:24:02.815787 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Dec 12 17:24:02.815796 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Dec 12 17:24:02.815804 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Dec 12 17:24:02.815812 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 12 17:24:02.815820 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 12 17:24:02.815827 systemd[1]: Reached target paths.target - Path Units.
Dec 12 17:24:02.815835 systemd[1]: Reached target slices.target - Slice Units.
Dec 12 17:24:02.815843 systemd[1]: Reached target swap.target - Swaps.
Dec 12 17:24:02.815852 systemd[1]: Reached target timers.target - Timer Units.
Dec 12 17:24:02.815860 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Dec 12 17:24:02.815868 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 12 17:24:02.815876 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Dec 12 17:24:02.815884 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Dec 12 17:24:02.815892 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 12 17:24:02.815900 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 12 17:24:02.815908 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 12 17:24:02.815917 systemd[1]: Reached target sockets.target - Socket Units.
Dec 12 17:24:02.815925 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Dec 12 17:24:02.815935 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 12 17:24:02.815943 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Dec 12 17:24:02.815952 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Dec 12 17:24:02.815960 systemd[1]: Starting systemd-fsck-usr.service...
Dec 12 17:24:02.815968 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 12 17:24:02.815976 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 12 17:24:02.815985 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:24:02.815993 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Dec 12 17:24:02.816002 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 12 17:24:02.816010 systemd[1]: Finished systemd-fsck-usr.service.
Dec 12 17:24:02.816042 systemd-journald[313]: Collecting audit messages is disabled.
Dec 12 17:24:02.816062 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 12 17:24:02.816071 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:24:02.816079 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 12 17:24:02.816089 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Dec 12 17:24:02.816097 kernel: Bridge firewalling registered
Dec 12 17:24:02.816104 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 12 17:24:02.816113 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 12 17:24:02.816121 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 12 17:24:02.816129 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 12 17:24:02.816137 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 12 17:24:02.816145 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 12 17:24:02.816155 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 12 17:24:02.816163 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Dec 12 17:24:02.816172 systemd-journald[313]: Journal started
Dec 12 17:24:02.816202 systemd-journald[313]: Runtime Journal (/run/log/journal/c358f6af69a14dcba7040c439575f1a2) is 8M, max 319.5M, 311.5M free.
Dec 12 17:24:02.759631 systemd-modules-load[315]: Inserted module 'overlay'
Dec 12 17:24:02.824493 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 12 17:24:02.776922 systemd-modules-load[315]: Inserted module 'br_netfilter'
Dec 12 17:24:02.826500 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 12 17:24:02.835771 dracut-cmdline[345]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52
Dec 12 17:24:02.839018 systemd-tmpfiles[350]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Dec 12 17:24:02.842861 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 17:24:02.846490 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 12 17:24:02.881272 systemd-resolved[385]: Positive Trust Anchors:
Dec 12 17:24:02.881285 systemd-resolved[385]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 12 17:24:02.881315 systemd-resolved[385]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 12 17:24:02.886871 systemd-resolved[385]: Defaulting to hostname 'linux'.
Dec 12 17:24:02.887864 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 12 17:24:02.890957 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:24:02.910397 kernel: SCSI subsystem initialized
Dec 12 17:24:02.915384 kernel: Loading iSCSI transport class v2.0-870.
Dec 12 17:24:02.923397 kernel: iscsi: registered transport (tcp)
Dec 12 17:24:02.936405 kernel: iscsi: registered transport (qla4xxx)
Dec 12 17:24:02.936452 kernel: QLogic iSCSI HBA Driver
Dec 12 17:24:02.953287 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 12 17:24:02.970310 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 12 17:24:02.971943 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 12 17:24:03.018843 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Dec 12 17:24:03.021208 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Dec 12 17:24:03.085405 kernel: raid6: neonx8 gen() 15814 MB/s
Dec 12 17:24:03.102392 kernel: raid6: neonx4 gen() 15799 MB/s
Dec 12 17:24:03.119390 kernel: raid6: neonx2 gen() 13185 MB/s
Dec 12 17:24:03.136396 kernel: raid6: neonx1 gen() 10356 MB/s
Dec 12 17:24:03.153388 kernel: raid6: int64x8 gen() 6900 MB/s
Dec 12 17:24:03.170390 kernel: raid6: int64x4 gen() 7337 MB/s
Dec 12 17:24:03.187412 kernel: raid6: int64x2 gen() 6071 MB/s
Dec 12 17:24:03.204545 kernel: raid6: int64x1 gen() 5036 MB/s
Dec 12 17:24:03.204599 kernel: raid6: using algorithm neonx8 gen() 15814 MB/s
Dec 12 17:24:03.222497 kernel: raid6: .... xor() 11997 MB/s, rmw enabled
Dec 12 17:24:03.222554 kernel: raid6: using neon recovery algorithm
Dec 12 17:24:03.227412 kernel: xor: measuring software checksum speed
Dec 12 17:24:03.228733 kernel: 8regs : 18624 MB/sec
Dec 12 17:24:03.228786 kernel: 32regs : 21699 MB/sec
Dec 12 17:24:03.229996 kernel: arm64_neon : 28022 MB/sec
Dec 12 17:24:03.230045 kernel: xor: using function: arm64_neon (28022 MB/sec)
Dec 12 17:24:03.282399 kernel: Btrfs loaded, zoned=no, fsverity=no
Dec 12 17:24:03.288296 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Dec 12 17:24:03.290940 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 17:24:03.319485 systemd-udevd[566]: Using default interface naming scheme 'v255'.
Dec 12 17:24:03.323639 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 17:24:03.325872 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Dec 12 17:24:03.346681 dracut-pre-trigger[573]: rd.md=0: removing MD RAID activation
Dec 12 17:24:03.367962 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 12 17:24:03.370270 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 12 17:24:03.446760 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 17:24:03.449501 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Dec 12 17:24:03.492594 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Dec 12 17:24:03.495000 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB)
Dec 12 17:24:03.500469 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 12 17:24:03.500502 kernel: GPT:17805311 != 104857599
Dec 12 17:24:03.500513 kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 12 17:24:03.501553 kernel: GPT:17805311 != 104857599
Dec 12 17:24:03.502965 kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 12 17:24:03.502994 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Dec 12 17:24:03.513395 kernel: ACPI: bus type USB registered
Dec 12 17:24:03.514914 kernel: usbcore: registered new interface driver usbfs
Dec 12 17:24:03.514952 kernel: usbcore: registered new interface driver hub
Dec 12 17:24:03.515773 kernel: usbcore: registered new device driver usb
Dec 12 17:24:03.539381 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 12 17:24:03.539673 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Dec 12 17:24:03.539603 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 12 17:24:03.543482 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Dec 12 17:24:03.539667 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:24:03.543482 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:24:03.547692 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:24:03.550952 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 12 17:24:03.551121 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Dec 12 17:24:03.551202 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Dec 12 17:24:03.556806 kernel: hub 1-0:1.0: USB hub found
Dec 12 17:24:03.557001 kernel: hub 1-0:1.0: 4 ports detected
Dec 12 17:24:03.557078 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Dec 12 17:24:03.560138 kernel: hub 2-0:1.0: USB hub found
Dec 12 17:24:03.560319 kernel: hub 2-0:1.0: 4 ports detected
Dec 12 17:24:03.572437 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:24:03.594642 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Dec 12 17:24:03.597253 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Dec 12 17:24:03.605120 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Dec 12 17:24:03.613701 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Dec 12 17:24:03.620479 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Dec 12 17:24:03.621634 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Dec 12 17:24:03.623893 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 12 17:24:03.626704 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 12 17:24:03.628714 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 12 17:24:03.631260 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Dec 12 17:24:03.633037 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Dec 12 17:24:03.648683 disk-uuid[662]: Primary Header is updated.
Dec 12 17:24:03.648683 disk-uuid[662]: Secondary Entries is updated.
Dec 12 17:24:03.648683 disk-uuid[662]: Secondary Header is updated.
Dec 12 17:24:03.654425 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Dec 12 17:24:03.659386 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Dec 12 17:24:03.801418 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Dec 12 17:24:03.932000 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Dec 12 17:24:03.932073 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Dec 12 17:24:03.932262 kernel: usbcore: registered new interface driver usbhid
Dec 12 17:24:03.932715 kernel: usbhid: USB HID core driver
Dec 12 17:24:04.038409 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Dec 12 17:24:04.163397 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Dec 12 17:24:04.217412 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Dec 12 17:24:04.670325 disk-uuid[664]: The operation has completed successfully.
Dec 12 17:24:04.671360 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Dec 12 17:24:04.712176 systemd[1]: disk-uuid.service: Deactivated successfully.
Dec 12 17:24:04.713297 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Dec 12 17:24:04.737963 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Dec 12 17:24:04.763900 sh[683]: Success
Dec 12 17:24:04.776392 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 12 17:24:04.776432 kernel: device-mapper: uevent: version 1.0.3
Dec 12 17:24:04.778061 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Dec 12 17:24:04.786404 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Dec 12 17:24:04.872408 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Dec 12 17:24:04.875280 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Dec 12 17:24:04.896158 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Dec 12 17:24:04.913393 kernel: BTRFS: device fsid 6d6d314d-b8a1-4727-8a34-8525e276a248 devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (695)
Dec 12 17:24:04.916093 kernel: BTRFS info (device dm-0): first mount of filesystem 6d6d314d-b8a1-4727-8a34-8525e276a248
Dec 12 17:24:04.916128 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:24:04.936796 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Dec 12 17:24:04.936844 kernel: BTRFS info (device dm-0): enabling free space tree
Dec 12 17:24:04.939847 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Dec 12 17:24:04.941199 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Dec 12 17:24:04.942467 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Dec 12 17:24:04.943244 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Dec 12 17:24:04.944958 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Dec 12 17:24:04.980405 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (724)
Dec 12 17:24:04.983391 kernel: BTRFS info (device vda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:24:04.983449 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:24:04.995054 kernel: BTRFS info (device vda6): turning on async discard
Dec 12 17:24:04.995164 kernel: BTRFS info (device vda6): enabling free space tree
Dec 12 17:24:04.999397 kernel: BTRFS info (device vda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:24:05.000465 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Dec 12 17:24:05.002847 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Dec 12 17:24:05.043403 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 12 17:24:05.047843 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 12 17:24:05.084300 systemd-networkd[865]: lo: Link UP
Dec 12 17:24:05.084313 systemd-networkd[865]: lo: Gained carrier
Dec 12 17:24:05.085247 systemd-networkd[865]: Enumeration completed
Dec 12 17:24:05.085331 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 12 17:24:05.085695 systemd-networkd[865]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:24:05.085699 systemd-networkd[865]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 12 17:24:05.086519 systemd-networkd[865]: eth0: Link UP
Dec 12 17:24:05.086599 systemd-networkd[865]: eth0: Gained carrier
Dec 12 17:24:05.086608 systemd-networkd[865]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:24:05.087324 systemd[1]: Reached target network.target - Network.
Dec 12 17:24:05.115017 systemd-networkd[865]: eth0: DHCPv4 address 10.0.10.18/25, gateway 10.0.10.1 acquired from 10.0.10.1
Dec 12 17:24:05.185220 ignition[806]: Ignition 2.22.0
Dec 12 17:24:05.185233 ignition[806]: Stage: fetch-offline
Dec 12 17:24:05.185266 ignition[806]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:05.185274 ignition[806]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:24:05.188941 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 12 17:24:05.185350 ignition[806]: parsed url from cmdline: ""
Dec 12 17:24:05.190958 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Dec 12 17:24:05.185353 ignition[806]: no config URL provided
Dec 12 17:24:05.185357 ignition[806]: reading system config file "/usr/lib/ignition/user.ign"
Dec 12 17:24:05.185363 ignition[806]: no config at "/usr/lib/ignition/user.ign"
Dec 12 17:24:05.186171 ignition[806]: failed to fetch config: resource requires networking
Dec 12 17:24:05.186346 ignition[806]: Ignition finished successfully
Dec 12 17:24:05.219936 ignition[884]: Ignition 2.22.0
Dec 12 17:24:05.219954 ignition[884]: Stage: fetch
Dec 12 17:24:05.220084 ignition[884]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:05.220093 ignition[884]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:24:05.220173 ignition[884]: parsed url from cmdline: ""
Dec 12 17:24:05.220176 ignition[884]: no config URL provided
Dec 12 17:24:05.220196 ignition[884]: reading system config file "/usr/lib/ignition/user.ign"
Dec 12 17:24:05.220205 ignition[884]: no config at "/usr/lib/ignition/user.ign"
Dec 12 17:24:05.220462 ignition[884]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Dec 12 17:24:05.220659 ignition[884]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Dec 12 17:24:05.220888 ignition[884]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Dec 12 17:24:05.496168 ignition[884]: GET result: OK
Dec 12 17:24:05.496491 ignition[884]: parsing config with SHA512: 96273b9ccf5e1187446ebd5aae2e0a6095cfe1e24a3c013daa3badd4b4558ec380185c9a7b1ff188f61f25cdc73b8355bebbe84446b1f0f09868889e61a805e2
Dec 12 17:24:05.501446 unknown[884]: fetched base config from "system"
Dec 12 17:24:05.501458 unknown[884]: fetched base config from "system"
Dec 12 17:24:05.501780 ignition[884]: fetch: fetch complete
Dec 12 17:24:05.501463 unknown[884]: fetched user config from "openstack"
Dec 12 17:24:05.501784 ignition[884]: fetch: fetch passed
Dec 12 17:24:05.504265 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Dec 12 17:24:05.501821 ignition[884]: Ignition finished successfully
Dec 12 17:24:05.506284 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Dec 12 17:24:05.538146 ignition[893]: Ignition 2.22.0
Dec 12 17:24:05.538168 ignition[893]: Stage: kargs
Dec 12 17:24:05.538303 ignition[893]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:05.538312 ignition[893]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:24:05.539033 ignition[893]: kargs: kargs passed
Dec 12 17:24:05.539081 ignition[893]: Ignition finished successfully
Dec 12 17:24:05.544017 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Dec 12 17:24:05.545952 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Dec 12 17:24:05.578635 ignition[902]: Ignition 2.22.0
Dec 12 17:24:05.578647 ignition[902]: Stage: disks
Dec 12 17:24:05.578784 ignition[902]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:05.578792 ignition[902]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:24:05.581082 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Dec 12 17:24:05.579493 ignition[902]: disks: disks passed
Dec 12 17:24:05.583045 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Dec 12 17:24:05.579538 ignition[902]: Ignition finished successfully
Dec 12 17:24:05.585711 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Dec 12 17:24:05.587358 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 12 17:24:05.589254 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 12 17:24:05.590800 systemd[1]: Reached target basic.target - Basic System.
Dec 12 17:24:05.593528 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Dec 12 17:24:05.635438 systemd-fsck[912]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Dec 12 17:24:05.639207 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Dec 12 17:24:05.641934 systemd[1]: Mounting sysroot.mount - /sysroot...
Dec 12 17:24:05.754384 kernel: EXT4-fs (vda9): mounted filesystem 895d7845-d0e8-43ae-a778-7804b473b868 r/w with ordered data mode. Quota mode: none.
Dec 12 17:24:05.754920 systemd[1]: Mounted sysroot.mount - /sysroot.
Dec 12 17:24:05.756292 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Dec 12 17:24:05.759571 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 12 17:24:05.761395 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Dec 12 17:24:05.762386 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Dec 12 17:24:05.763125 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Dec 12 17:24:05.765725 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Dec 12 17:24:05.765756 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 12 17:24:05.775748 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Dec 12 17:24:05.778022 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Dec 12 17:24:05.791473 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (920)
Dec 12 17:24:05.794407 kernel: BTRFS info (device vda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:24:05.794446 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:24:05.800668 kernel: BTRFS info (device vda6): turning on async discard
Dec 12 17:24:05.800727 kernel: BTRFS info (device vda6): enabling free space tree
Dec 12 17:24:05.802630 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 12 17:24:05.824395 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:24:05.834794 initrd-setup-root[948]: cut: /sysroot/etc/passwd: No such file or directory
Dec 12 17:24:05.838860 initrd-setup-root[955]: cut: /sysroot/etc/group: No such file or directory
Dec 12 17:24:05.843920 initrd-setup-root[962]: cut: /sysroot/etc/shadow: No such file or directory
Dec 12 17:24:05.847242 initrd-setup-root[969]: cut: /sysroot/etc/gshadow: No such file or directory
Dec 12 17:24:05.936084 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Dec 12 17:24:05.938616 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Dec 12 17:24:05.940266 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Dec 12 17:24:05.961202 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Dec 12 17:24:05.963281 kernel: BTRFS info (device vda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:24:05.983678 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Dec 12 17:24:05.994626 ignition[1036]: INFO : Ignition 2.22.0
Dec 12 17:24:05.994626 ignition[1036]: INFO : Stage: mount
Dec 12 17:24:05.997266 ignition[1036]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:05.997266 ignition[1036]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:24:05.997266 ignition[1036]: INFO : mount: mount passed
Dec 12 17:24:05.997266 ignition[1036]: INFO : Ignition finished successfully
Dec 12 17:24:05.998210 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Dec 12 17:24:06.798603 systemd-networkd[865]: eth0: Gained IPv6LL
Dec 12 17:24:06.862397 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:24:08.870443 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:24:12.880445 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:24:12.882563 coreos-metadata[922]: Dec 12 17:24:12.882 WARN failed to locate config-drive, using the metadata service API instead
Dec 12 17:24:12.898979 coreos-metadata[922]: Dec 12 17:24:12.898 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Dec 12 17:24:13.208322 coreos-metadata[922]: Dec 12 17:24:13.208 INFO Fetch successful
Dec 12 17:24:13.209572 coreos-metadata[922]: Dec 12 17:24:13.209 INFO wrote hostname ci-4459-2-2-d-e796afb129 to /sysroot/etc/hostname
Dec 12 17:24:13.211096 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Dec 12 17:24:13.211200 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Dec 12 17:24:13.213469 systemd[1]: Starting ignition-files.service - Ignition (files)...
Dec 12 17:24:13.232673 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 12 17:24:13.247388 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1056)
Dec 12 17:24:13.250201 kernel: BTRFS info (device vda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:24:13.250242 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:24:13.254644 kernel: BTRFS info (device vda6): turning on async discard
Dec 12 17:24:13.254682 kernel: BTRFS info (device vda6): enabling free space tree
Dec 12 17:24:13.255867 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 12 17:24:13.286234 ignition[1074]: INFO : Ignition 2.22.0
Dec 12 17:24:13.286234 ignition[1074]: INFO : Stage: files
Dec 12 17:24:13.287945 ignition[1074]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:13.287945 ignition[1074]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:24:13.287945 ignition[1074]: DEBUG : files: compiled without relabeling support, skipping
Dec 12 17:24:13.291340 ignition[1074]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Dec 12 17:24:13.291340 ignition[1074]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Dec 12 17:24:13.291340 ignition[1074]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Dec 12 17:24:13.291340 ignition[1074]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Dec 12 17:24:13.291340 ignition[1074]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Dec 12 17:24:13.291113 unknown[1074]: wrote ssh authorized keys file for user: core
Dec 12 17:24:13.298555 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Dec 12 17:24:13.298555 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Dec 12 17:24:13.351022 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Dec 12 17:24:13.461268 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Dec 12 17:24:13.461268 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Dec 12 17:24:13.465050 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Dec 12 17:24:13.465050 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Dec 12 17:24:13.465050 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Dec 12 17:24:13.465050 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 12 17:24:13.465050 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 12 17:24:13.465050 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 12 17:24:13.465050 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 12 17:24:13.643520 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Dec 12 17:24:13.645358 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Dec 12 17:24:13.645358 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw"
Dec 12 17:24:13.652764 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw"
Dec 12 17:24:13.652764 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw"
Dec 12 17:24:13.657000 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-arm64.raw: attempt #1
Dec 12 17:24:13.996460 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Dec 12 17:24:15.949426 ignition[1074]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw"
Dec 12 17:24:15.949426 ignition[1074]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Dec 12 17:24:15.955346 ignition[1074]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 12 17:24:15.955346 ignition[1074]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 12 17:24:15.955346 ignition[1074]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Dec 12 17:24:15.955346 ignition[1074]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Dec 12 17:24:15.955346 ignition[1074]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Dec 12 17:24:15.955346 ignition[1074]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Dec 12 17:24:15.955346 ignition[1074]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Dec 12 17:24:15.955346 ignition[1074]: INFO : files: files passed
Dec 12 17:24:15.955346 ignition[1074]: INFO : Ignition finished successfully
Dec 12 17:24:15.955539 systemd[1]: Finished ignition-files.service - Ignition (files).
Dec 12 17:24:15.960152 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Dec 12 17:24:15.963334 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Dec 12 17:24:15.971563 systemd[1]: ignition-quench.service: Deactivated successfully.
Dec 12 17:24:15.980411 initrd-setup-root-after-ignition[1105]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 12 17:24:15.980411 initrd-setup-root-after-ignition[1105]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Dec 12 17:24:15.971652 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Dec 12 17:24:15.987453 initrd-setup-root-after-ignition[1109]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 12 17:24:15.981079 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 12 17:24:15.983587 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Dec 12 17:24:15.987050 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Dec 12 17:24:16.036336 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 12 17:24:16.037527 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Dec 12 17:24:16.040514 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Dec 12 17:24:16.041611 systemd[1]: Reached target initrd.target - Initrd Default Target.
Dec 12 17:24:16.043356 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Dec 12 17:24:16.044210 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Dec 12 17:24:16.068131 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 12 17:24:16.070732 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Dec 12 17:24:16.089456 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:24:16.090718 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 12 17:24:16.092701 systemd[1]: Stopped target timers.target - Timer Units.
Dec 12 17:24:16.094349 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 12 17:24:16.094503 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 12 17:24:16.096950 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Dec 12 17:24:16.098852 systemd[1]: Stopped target basic.target - Basic System.
Dec 12 17:24:16.100395 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Dec 12 17:24:16.102084 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 12 17:24:16.103961 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Dec 12 17:24:16.105855 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Dec 12 17:24:16.107662 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Dec 12 17:24:16.109381 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 12 17:24:16.111324 systemd[1]: Stopped target sysinit.target - System Initialization.
Dec 12 17:24:16.113322 systemd[1]: Stopped target local-fs.target - Local File Systems.
Dec 12 17:24:16.115071 systemd[1]: Stopped target swap.target - Swaps.
Dec 12 17:24:16.116546 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 12 17:24:16.116680 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Dec 12 17:24:16.118916 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Dec 12 17:24:16.120809 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 12 17:24:16.122622 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Dec 12 17:24:16.123422 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 12 17:24:16.124605 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 12 17:24:16.124729 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Dec 12 17:24:16.127459 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Dec 12 17:24:16.127587 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 12 17:24:16.129598 systemd[1]: ignition-files.service: Deactivated successfully.
Dec 12 17:24:16.129705 systemd[1]: Stopped ignition-files.service - Ignition (files).
Dec 12 17:24:16.132148 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Dec 12 17:24:16.133119 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 12 17:24:16.133261 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 12 17:24:16.144238 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Dec 12 17:24:16.145161 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 12 17:24:16.145304 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 17:24:16.147243 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 12 17:24:16.147350 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 12 17:24:16.152807 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 12 17:24:16.154395 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Dec 12 17:24:16.161062 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Dec 12 17:24:16.162352 ignition[1129]: INFO : Ignition 2.22.0
Dec 12 17:24:16.162352 ignition[1129]: INFO : Stage: umount
Dec 12 17:24:16.162352 ignition[1129]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 12 17:24:16.162352 ignition[1129]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 12 17:24:16.167960 ignition[1129]: INFO : umount: umount passed
Dec 12 17:24:16.167960 ignition[1129]: INFO : Ignition finished successfully
Dec 12 17:24:16.165343 systemd[1]: ignition-mount.service: Deactivated successfully.
Dec 12 17:24:16.165475 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Dec 12 17:24:16.167119 systemd[1]: sysroot-boot.service: Deactivated successfully.
Dec 12 17:24:16.167229 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Dec 12 17:24:16.169268 systemd[1]: ignition-disks.service: Deactivated successfully.
Dec 12 17:24:16.169351 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Dec 12 17:24:16.170399 systemd[1]: ignition-kargs.service: Deactivated successfully.
Dec 12 17:24:16.170447 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Dec 12 17:24:16.172104 systemd[1]: ignition-fetch.service: Deactivated successfully.
Dec 12 17:24:16.172144 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Dec 12 17:24:16.173760 systemd[1]: Stopped target network.target - Network.
Dec 12 17:24:16.175203 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Dec 12 17:24:16.175257 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 12 17:24:16.177085 systemd[1]: Stopped target paths.target - Path Units.
Dec 12 17:24:16.178527 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 12 17:24:16.182457 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 12 17:24:16.184050 systemd[1]: Stopped target slices.target - Slice Units.
Dec 12 17:24:16.185619 systemd[1]: Stopped target sockets.target - Socket Units.
Dec 12 17:24:16.187141 systemd[1]: iscsid.socket: Deactivated successfully.
Dec 12 17:24:16.187184 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Dec 12 17:24:16.188919 systemd[1]: iscsiuio.socket: Deactivated successfully.
Dec 12 17:24:16.188951 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 12 17:24:16.190711 systemd[1]: ignition-setup.service: Deactivated successfully.
Dec 12 17:24:16.190768 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Dec 12 17:24:16.192325 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Dec 12 17:24:16.192379 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Dec 12 17:24:16.194146 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Dec 12 17:24:16.194194 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Dec 12 17:24:16.196568 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Dec 12 17:24:16.198017 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Dec 12 17:24:16.206839 systemd[1]: systemd-resolved.service: Deactivated successfully.
Dec 12 17:24:16.208425 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Dec 12 17:24:16.212000 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Dec 12 17:24:16.212288 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 12 17:24:16.212328 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 17:24:16.218296 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 12 17:24:16.218567 systemd[1]: systemd-networkd.service: Deactivated successfully.
Dec 12 17:24:16.218685 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Dec 12 17:24:16.222726 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Dec 12 17:24:16.223163 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Dec 12 17:24:16.225031 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Dec 12 17:24:16.225088 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Dec 12 17:24:16.227970 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Dec 12 17:24:16.228879 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Dec 12 17:24:16.228942 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 12 17:24:16.230912 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 12 17:24:16.230959 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Dec 12 17:24:16.233628 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 12 17:24:16.233671 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Dec 12 17:24:16.235575 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 17:24:16.238841 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 12 17:24:16.264396 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 12 17:24:16.264562 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 17:24:16.266849 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 12 17:24:16.266885 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Dec 12 17:24:16.267945 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 12 17:24:16.267978 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 12 17:24:16.270228 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 12 17:24:16.270279 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Dec 12 17:24:16.273027 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 12 17:24:16.273077 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Dec 12 17:24:16.275787 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Dec 12 17:24:16.275839 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 12 17:24:16.279309 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Dec 12 17:24:16.280427 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Dec 12 17:24:16.280486 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Dec 12 17:24:16.283247 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 12 17:24:16.283288 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 12 17:24:16.286599 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 12 17:24:16.286639 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:24:16.290410 systemd[1]: network-cleanup.service: Deactivated successfully.
Dec 12 17:24:16.292502 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Dec 12 17:24:16.298166 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 12 17:24:16.298256 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Dec 12 17:24:16.300725 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Dec 12 17:24:16.303182 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Dec 12 17:24:16.333651 systemd[1]: Switching root.
Dec 12 17:24:16.368773 systemd-journald[313]: Journal stopped
Dec 12 17:24:17.202611 systemd-journald[313]: Received SIGTERM from PID 1 (systemd).
Dec 12 17:24:17.202686 kernel: SELinux: policy capability network_peer_controls=1
Dec 12 17:24:17.202705 kernel: SELinux: policy capability open_perms=1
Dec 12 17:24:17.202719 kernel: SELinux: policy capability extended_socket_class=1
Dec 12 17:24:17.202736 kernel: SELinux: policy capability always_check_network=0
Dec 12 17:24:17.202748 kernel: SELinux: policy capability cgroup_seclabel=1
Dec 12 17:24:17.202761 kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 12 17:24:17.202773 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Dec 12 17:24:17.202782 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Dec 12 17:24:17.202792 kernel: SELinux: policy capability userspace_initial_context=0
Dec 12 17:24:17.202801 kernel: audit: type=1403 audit(1765560256.514:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 12 17:24:17.202812 systemd[1]: Successfully loaded SELinux policy in 64.852ms.
Dec 12 17:24:17.202828 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.670ms.
Dec 12 17:24:17.202839 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 12 17:24:17.202858 systemd[1]: Detected virtualization kvm.
Dec 12 17:24:17.202926 systemd[1]: Detected architecture arm64.
Dec 12 17:24:17.202939 systemd[1]: Detected first boot.
Dec 12 17:24:17.202949 systemd[1]: Hostname set to .
Dec 12 17:24:17.202961 systemd[1]: Initializing machine ID from VM UUID.
Dec 12 17:24:17.202971 zram_generator::config[1176]: No configuration found.
Dec 12 17:24:17.202981 kernel: NET: Registered PF_VSOCK protocol family
Dec 12 17:24:17.202991 systemd[1]: Populated /etc with preset unit settings.
Dec 12 17:24:17.203002 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Dec 12 17:24:17.203012 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 12 17:24:17.203023 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Dec 12 17:24:17.203032 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 12 17:24:17.203043 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Dec 12 17:24:17.203053 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Dec 12 17:24:17.203063 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Dec 12 17:24:17.203073 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Dec 12 17:24:17.203083 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Dec 12 17:24:17.203093 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Dec 12 17:24:17.203104 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Dec 12 17:24:17.203114 systemd[1]: Created slice user.slice - User and Session Slice.
Dec 12 17:24:17.203124 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 12 17:24:17.203142 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 12 17:24:17.203159 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Dec 12 17:24:17.203171 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Dec 12 17:24:17.203181 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Dec 12 17:24:17.203191 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 12 17:24:17.203204 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Dec 12 17:24:17.203216 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 12 17:24:17.203228 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 12 17:24:17.203238 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Dec 12 17:24:17.203248 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Dec 12 17:24:17.203258 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Dec 12 17:24:17.203267 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Dec 12 17:24:17.203279 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 12 17:24:17.203292 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 12 17:24:17.203302 systemd[1]: Reached target slices.target - Slice Units.
Dec 12 17:24:17.203312 systemd[1]: Reached target swap.target - Swaps.
Dec 12 17:24:17.203322 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Dec 12 17:24:17.203332 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Dec 12 17:24:17.203350 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Dec 12 17:24:17.203361 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 12 17:24:17.203434 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 12 17:24:17.203448 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 12 17:24:17.203461 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Dec 12 17:24:17.203472 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Dec 12 17:24:17.203482 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Dec 12 17:24:17.203492 systemd[1]: Mounting media.mount - External Media Directory...
Dec 12 17:24:17.203502 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Dec 12 17:24:17.203512 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Dec 12 17:24:17.203522 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Dec 12 17:24:17.203533 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 12 17:24:17.203545 systemd[1]: Reached target machines.target - Containers.
Dec 12 17:24:17.203555 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Dec 12 17:24:17.203566 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 17:24:17.203576 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 12 17:24:17.203587 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Dec 12 17:24:17.203600 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 12 17:24:17.203610 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 12 17:24:17.203620 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 12 17:24:17.203630 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Dec 12 17:24:17.203642 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 12 17:24:17.203652 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Dec 12 17:24:17.203662 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 12 17:24:17.203673 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Dec 12 17:24:17.203684 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Dec 12 17:24:17.203694 systemd[1]: Stopped systemd-fsck-usr.service.
Dec 12 17:24:17.203705 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 17:24:17.203715 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 12 17:24:17.203726 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 12 17:24:17.203738 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 12 17:24:17.203748 kernel: ACPI: bus type drm_connector registered
Dec 12 17:24:17.203758 kernel: fuse: init (API version 7.41)
Dec 12 17:24:17.203768 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Dec 12 17:24:17.203779 kernel: loop: module loaded
Dec 12 17:24:17.203789 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Dec 12 17:24:17.203799 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 12 17:24:17.203836 systemd-journald[1247]: Collecting audit messages is disabled.
Dec 12 17:24:17.203861 systemd[1]: verity-setup.service: Deactivated successfully.
Dec 12 17:24:17.203872 systemd[1]: Stopped verity-setup.service.
Dec 12 17:24:17.203882 systemd-journald[1247]: Journal started
Dec 12 17:24:17.203904 systemd-journald[1247]: Runtime Journal (/run/log/journal/c358f6af69a14dcba7040c439575f1a2) is 8M, max 319.5M, 311.5M free.
Dec 12 17:24:16.982616 systemd[1]: Queued start job for default target multi-user.target.
Dec 12 17:24:17.005635 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Dec 12 17:24:17.006035 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 12 17:24:17.208160 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 12 17:24:17.208853 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Dec 12 17:24:17.210103 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Dec 12 17:24:17.211480 systemd[1]: Mounted media.mount - External Media Directory.
Dec 12 17:24:17.212549 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Dec 12 17:24:17.213713 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Dec 12 17:24:17.214942 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Dec 12 17:24:17.217398 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Dec 12 17:24:17.218791 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 12 17:24:17.220405 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 12 17:24:17.220580 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Dec 12 17:24:17.221963 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 12 17:24:17.222127 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 12 17:24:17.225052 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 12 17:24:17.225237 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 12 17:24:17.226642 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 12 17:24:17.226796 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 12 17:24:17.228790 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 12 17:24:17.228959 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Dec 12 17:24:17.230407 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 12 17:24:17.230560 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 12 17:24:17.231966 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 12 17:24:17.233433 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 12 17:24:17.235016 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Dec 12 17:24:17.237829 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Dec 12 17:24:17.249554 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 12 17:24:17.251913 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Dec 12 17:24:17.254038 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Dec 12 17:24:17.255231 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Dec 12 17:24:17.255271 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 12 17:24:17.257182 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Dec 12 17:24:17.268531 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Dec 12 17:24:17.269647 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 17:24:17.270814 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Dec 12 17:24:17.275515 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Dec 12 17:24:17.276760 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 12 17:24:17.278662 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Dec 12 17:24:17.280090 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 12 17:24:17.281785 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 12 17:24:17.284620 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Dec 12 17:24:17.292320 systemd-journald[1247]: Time spent on flushing to /var/log/journal/c358f6af69a14dcba7040c439575f1a2 is 19.358ms for 1680 entries.
Dec 12 17:24:17.292320 systemd-journald[1247]: System Journal (/var/log/journal/c358f6af69a14dcba7040c439575f1a2) is 8M, max 584.8M, 576.8M free.
Dec 12 17:24:17.330352 systemd-journald[1247]: Received client request to flush runtime journal.
Dec 12 17:24:17.330404 kernel: loop0: detected capacity change from 0 to 119840
Dec 12 17:24:17.295637 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Dec 12 17:24:17.300410 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 17:24:17.301832 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Dec 12 17:24:17.303242 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Dec 12 17:24:17.316400 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Dec 12 17:24:17.317810 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Dec 12 17:24:17.328771 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Dec 12 17:24:17.342907 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Dec 12 17:24:17.345153 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 12 17:24:17.349436 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Dec 12 17:24:17.349137 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Dec 12 17:24:17.355899 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 12 17:24:17.366391 kernel: loop1: detected capacity change from 0 to 100632
Dec 12 17:24:17.368138 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Dec 12 17:24:17.389679 systemd-tmpfiles[1312]: ACLs are not supported, ignoring.
Dec 12 17:24:17.390000 systemd-tmpfiles[1312]: ACLs are not supported, ignoring.
Dec 12 17:24:17.395430 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 12 17:24:17.419405 kernel: loop2: detected capacity change from 0 to 1632
Dec 12 17:24:17.460417 kernel: loop3: detected capacity change from 0 to 200800
Dec 12 17:24:17.513409 kernel: loop4: detected capacity change from 0 to 119840
Dec 12 17:24:17.526406 kernel: loop5: detected capacity change from 0 to 100632
Dec 12 17:24:17.537409 kernel: loop6: detected capacity change from 0 to 1632
Dec 12 17:24:17.543413 kernel: loop7: detected capacity change from 0 to 200800
Dec 12 17:24:17.563859 (sd-merge)[1319]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-stackit'.
Dec 12 17:24:17.564314 (sd-merge)[1319]: Merged extensions into '/usr'.
Dec 12 17:24:17.568468 systemd[1]: Reload requested from client PID 1296 ('systemd-sysext') (unit systemd-sysext.service)...
Dec 12 17:24:17.568588 systemd[1]: Reloading...
Dec 12 17:24:17.627408 zram_generator::config[1345]: No configuration found.
Dec 12 17:24:17.776048 ldconfig[1291]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Dec 12 17:24:17.805929 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Dec 12 17:24:17.806036 systemd[1]: Reloading finished in 237 ms.
Dec 12 17:24:17.842375 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Dec 12 17:24:17.843786 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Dec 12 17:24:17.845841 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Dec 12 17:24:17.860096 systemd[1]: Starting ensure-sysext.service...
Dec 12 17:24:17.862041 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 12 17:24:17.864665 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 17:24:17.871922 systemd[1]: Reload requested from client PID 1384 ('systemctl') (unit ensure-sysext.service)...
Dec 12 17:24:17.871940 systemd[1]: Reloading...
Dec 12 17:24:17.877667 systemd-tmpfiles[1385]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Dec 12 17:24:17.877703 systemd-tmpfiles[1385]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Dec 12 17:24:17.877936 systemd-tmpfiles[1385]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Dec 12 17:24:17.878123 systemd-tmpfiles[1385]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Dec 12 17:24:17.878754 systemd-tmpfiles[1385]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Dec 12 17:24:17.878959 systemd-tmpfiles[1385]: ACLs are not supported, ignoring.
Dec 12 17:24:17.879005 systemd-tmpfiles[1385]: ACLs are not supported, ignoring.
Dec 12 17:24:17.882587 systemd-tmpfiles[1385]: Detected autofs mount point /boot during canonicalization of boot.
Dec 12 17:24:17.882598 systemd-tmpfiles[1385]: Skipping /boot
Dec 12 17:24:17.888192 systemd-tmpfiles[1385]: Detected autofs mount point /boot during canonicalization of boot.
Dec 12 17:24:17.888206 systemd-tmpfiles[1385]: Skipping /boot
Dec 12 17:24:17.889831 systemd-udevd[1386]: Using default interface naming scheme 'v255'.
Dec 12 17:24:17.922396 zram_generator::config[1413]: No configuration found.
Dec 12 17:24:18.058406 kernel: mousedev: PS/2 mouse device common for all mice
Dec 12 17:24:18.107095 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0
Dec 12 17:24:18.107182 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 12 17:24:18.107200 kernel: [drm] features: -context_init
Dec 12 17:24:18.123957 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Dec 12 17:24:18.129484 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Dec 12 17:24:18.133609 systemd[1]: Reloading finished in 261 ms.
Dec 12 17:24:18.148792 kernel: [drm] number of scanouts: 1
Dec 12 17:24:18.148874 kernel: [drm] number of cap sets: 0
Dec 12 17:24:18.152400 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0
Dec 12 17:24:18.155173 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 17:24:18.162535 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 17:24:18.171392 kernel: Console: switching to colour frame buffer device 160x50
Dec 12 17:24:18.193452 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 12 17:24:18.205621 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Dec 12 17:24:18.209209 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Dec 12 17:24:18.211552 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Dec 12 17:24:18.214170 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Dec 12 17:24:18.217751 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 12 17:24:18.221665 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 12 17:24:18.224125 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Dec 12 17:24:18.228067 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:24:18.238047 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Dec 12 17:24:18.248597 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 17:24:18.250351 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 12 17:24:18.253235 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 12 17:24:18.255684 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 12 17:24:18.259642 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 17:24:18.259832 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 17:24:18.264657 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Dec 12 17:24:18.269619 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Dec 12 17:24:18.271401 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 12 17:24:18.271594 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 12 17:24:18.273439 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 12 17:24:18.273598 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 12 17:24:18.275591 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 12 17:24:18.276702 augenrules[1543]: No rules
Dec 12 17:24:18.281549 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 12 17:24:18.283262 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 12 17:24:18.283464 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 12 17:24:18.288026 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Dec 12 17:24:18.294110 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 17:24:18.295353 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 12 17:24:18.299648 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 12 17:24:18.311902 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 12 17:24:18.313341 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 17:24:18.313508 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 17:24:18.314998 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Dec 12 17:24:18.317461 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Dec 12 17:24:18.319048 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 12 17:24:18.319236 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 12 17:24:18.321971 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 12 17:24:18.322146 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 12 17:24:18.324424 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 12 17:24:18.324595 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 12 17:24:18.333726 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:24:18.338733 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Dec 12 17:24:18.344195 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Dec 12 17:24:18.345595 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 17:24:18.347506 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 12 17:24:18.354527 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 12 17:24:18.356515 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 12 17:24:18.360045 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 12 17:24:18.362114 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm...
Dec 12 17:24:18.364593 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 17:24:18.364649 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 17:24:18.364703 systemd[1]: Reached target time-set.target - System Time Set.
Dec 12 17:24:18.366384 systemd[1]: Finished ensure-sysext.service.
Dec 12 17:24:18.367537 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Dec 12 17:24:18.369155 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 12 17:24:18.369315 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 12 17:24:18.373078 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 12 17:24:18.373259 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 12 17:24:18.375834 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 12 17:24:18.375994 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 12 17:24:18.379306 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 12 17:24:18.379386 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Dec 12 17:24:18.385488 kernel: PTP clock support registered
Dec 12 17:24:18.384945 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 12 17:24:18.385049 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 12 17:24:18.385683 augenrules[1572]: /sbin/augenrules: No change
Dec 12 17:24:18.385089 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Dec 12 17:24:18.385712 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 12 17:24:18.385893 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 12 17:24:18.390304 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully.
Dec 12 17:24:18.390551 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm.
Dec 12 17:24:18.393097 systemd-networkd[1513]: lo: Link UP
Dec 12 17:24:18.393117 systemd-networkd[1513]: lo: Gained carrier
Dec 12 17:24:18.394298 systemd-networkd[1513]: Enumeration completed
Dec 12 17:24:18.394437 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 12 17:24:18.394809 systemd-networkd[1513]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:24:18.394821 systemd-networkd[1513]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 12 17:24:18.395494 systemd-networkd[1513]: eth0: Link UP
Dec 12 17:24:18.395627 systemd-networkd[1513]: eth0: Gained carrier
Dec 12 17:24:18.395648 systemd-networkd[1513]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:24:18.396051 augenrules[1602]: No rules
Dec 12 17:24:18.397080 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Dec 12 17:24:18.397845 systemd-resolved[1514]: Positive Trust Anchors:
Dec 12 17:24:18.397857 systemd-resolved[1514]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 12 17:24:18.397888 systemd-resolved[1514]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 12 17:24:18.399480 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Dec 12 17:24:18.400929 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 12 17:24:18.401148 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 12 17:24:18.401750 systemd-resolved[1514]: Using system hostname 'ci-4459-2-2-d-e796afb129'.
Dec 12 17:24:18.403298 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 12 17:24:18.404849 systemd[1]: Reached target network.target - Network.
Dec 12 17:24:18.405888 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:24:18.407075 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 12 17:24:18.408289 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Dec 12 17:24:18.409642 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Dec 12 17:24:18.411014 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Dec 12 17:24:18.412203 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Dec 12 17:24:18.412873 systemd-networkd[1513]: eth0: DHCPv4 address 10.0.10.18/25, gateway 10.0.10.1 acquired from 10.0.10.1
Dec 12 17:24:18.413675 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Dec 12 17:24:18.415015 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Dec 12 17:24:18.415050 systemd[1]: Reached target paths.target - Path Units.
Dec 12 17:24:18.416061 systemd[1]: Reached target timers.target - Timer Units.
Dec 12 17:24:18.417919 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Dec 12 17:24:18.420397 systemd[1]: Starting docker.socket - Docker Socket for the API...
Dec 12 17:24:18.422967 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Dec 12 17:24:18.424501 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Dec 12 17:24:18.425726 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Dec 12 17:24:18.428944 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Dec 12 17:24:18.430347 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Dec 12 17:24:18.432453 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Dec 12 17:24:18.433869 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Dec 12 17:24:18.435542 systemd[1]: Reached target sockets.target - Socket Units.
Dec 12 17:24:18.436541 systemd[1]: Reached target basic.target - Basic System.
Dec 12 17:24:18.437597 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Dec 12 17:24:18.437635 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Dec 12 17:24:18.440103 systemd[1]: Starting chronyd.service - NTP client/server...
Dec 12 17:24:18.441970 systemd[1]: Starting containerd.service - containerd container runtime...
Dec 12 17:24:18.444104 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Dec 12 17:24:18.448549 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Dec 12 17:24:18.450489 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Dec 12 17:24:18.451409 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:24:18.453880 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Dec 12 17:24:18.456600 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Dec 12 17:24:18.457955 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Dec 12 17:24:18.459211 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Dec 12 17:24:18.465537 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Dec 12 17:24:18.471558 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Dec 12 17:24:18.474275 jq[1621]: false
Dec 12 17:24:18.474586 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Dec 12 17:24:18.479172 systemd[1]: Starting systemd-logind.service - User Login Management...
Dec 12 17:24:18.481153 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Dec 12 17:24:18.481704 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Dec 12 17:24:18.482957 systemd[1]: Starting update-engine.service - Update Engine...
Dec 12 17:24:18.485397 extend-filesystems[1622]: Found /dev/vda6
Dec 12 17:24:18.487909 chronyd[1614]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Dec 12 17:24:18.490037 chronyd[1614]: Loaded seccomp filter (level 2)
Dec 12 17:24:18.490173 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Dec 12 17:24:18.491627 extend-filesystems[1622]: Found /dev/vda9
Dec 12 17:24:18.493072 systemd[1]: Started chronyd.service - NTP client/server.
Dec 12 17:24:18.493458 extend-filesystems[1622]: Checking size of /dev/vda9
Dec 12 17:24:18.502903 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Dec 12 17:24:18.505284 jq[1641]: true
Dec 12 17:24:18.505557 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Dec 12 17:24:18.505753 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Dec 12 17:24:18.506083 extend-filesystems[1622]: Resized partition /dev/vda9
Dec 12 17:24:18.506133 systemd[1]: motdgen.service: Deactivated successfully.
Dec 12 17:24:18.506333 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Dec 12 17:24:18.512161 extend-filesystems[1649]: resize2fs 1.47.3 (8-Jul-2025)
Dec 12 17:24:18.518196 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Dec 12 17:24:18.518397 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Dec 12 17:24:18.521409 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 12499963 blocks
Dec 12 17:24:18.540798 update_engine[1634]: I20251212 17:24:18.540251 1634 main.cc:92] Flatcar Update Engine starting
Dec 12 17:24:18.543102 (ntainerd)[1652]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Dec 12 17:24:18.555121 jq[1651]: true
Dec 12 17:24:18.567422 tar[1650]: linux-arm64/LICENSE
Dec 12 17:24:18.567422 tar[1650]: linux-arm64/helm
Dec 12 17:24:18.584217 systemd-logind[1632]: New seat seat0.
Dec 12 17:24:18.589487 systemd-logind[1632]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 12 17:24:18.589931 systemd-logind[1632]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Dec 12 17:24:18.590177 systemd[1]: Started systemd-logind.service - User Login Management.
Dec 12 17:24:18.593944 dbus-daemon[1617]: [system] SELinux support is enabled
Dec 12 17:24:18.594129 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Dec 12 17:24:18.597342 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Dec 12 17:24:18.598400 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Dec 12 17:24:18.600060 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Dec 12 17:24:18.600086 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Dec 12 17:24:18.603758 dbus-daemon[1617]: [system] Successfully activated service 'org.freedesktop.systemd1'
Dec 12 17:24:18.606570 update_engine[1634]: I20251212 17:24:18.604517 1634 update_check_scheduler.cc:74] Next update check in 7m58s
Dec 12 17:24:18.606642 systemd[1]: Started update-engine.service - Update Engine.
Dec 12 17:24:18.609915 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Dec 12 17:24:18.670750 locksmithd[1679]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Dec 12 17:24:18.691607 bash[1680]: Updated "/home/core/.ssh/authorized_keys"
Dec 12 17:24:18.694597 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Dec 12 17:24:18.698537 systemd[1]: Starting sshkeys.service...
Dec 12 17:24:18.725655 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Dec 12 17:24:18.729807 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Dec 12 17:24:18.753476 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:24:18.760759 containerd[1652]: time="2025-12-12T17:24:18Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Dec 12 17:24:18.761777 containerd[1652]: time="2025-12-12T17:24:18.761732680Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7
Dec 12 17:24:18.781398 containerd[1652]: time="2025-12-12T17:24:18.781292240Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="12.28µs"
Dec 12 17:24:18.781398 containerd[1652]: time="2025-12-12T17:24:18.781388200Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Dec 12 17:24:18.781398 containerd[1652]: time="2025-12-12T17:24:18.781409040Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Dec 12 17:24:18.781617 containerd[1652]: time="2025-12-12T17:24:18.781557840Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Dec 12 17:24:18.781617 containerd[1652]: time="2025-12-12T17:24:18.781582960Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Dec 12 17:24:18.781782 containerd[1652]: time="2025-12-12T17:24:18.781749520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 12 17:24:18.781996 containerd[1652]: time="2025-12-12T17:24:18.781952480Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 12 17:24:18.781996 containerd[1652]: time="2025-12-12T17:24:18.781982040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 12 17:24:18.782579 containerd[1652]: time="2025-12-12T17:24:18.782526280Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 12 17:24:18.782654 containerd[1652]: time="2025-12-12T17:24:18.782623520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 12 17:24:18.782704 containerd[1652]: time="2025-12-12T17:24:18.782682280Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 12 17:24:18.782733 containerd[1652]: time="2025-12-12T17:24:18.782703040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Dec 12 17:24:18.783045 containerd[1652]: time="2025-12-12T17:24:18.783005640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Dec 12 17:24:18.783462 containerd[1652]: time="2025-12-12T17:24:18.783417760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 12 17:24:18.783527 containerd[1652]: time="2025-12-12T17:24:18.783505240Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 12 17:24:18.783527 containerd[1652]: time="2025-12-12T17:24:18.783525080Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Dec 12 17:24:18.783697 containerd[1652]: time="2025-12-12T17:24:18.783598720Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Dec 12 17:24:18.784142 containerd[1652]: time="2025-12-12T17:24:18.784114120Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Dec 12 17:24:18.784224 containerd[1652]: time="2025-12-12T17:24:18.784206560Z" level=info msg="metadata content store policy set" policy=shared
Dec 12 17:24:18.807955 containerd[1652]: time="2025-12-12T17:24:18.807861760Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Dec 12 17:24:18.807955 containerd[1652]: time="2025-12-12T17:24:18.807931920Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Dec 12 17:24:18.808167 containerd[1652]: time="2025-12-12T17:24:18.807979120Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Dec 12 17:24:18.808167 containerd[1652]: time="2025-12-12T17:24:18.807991760Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Dec 12 17:24:18.808167 containerd[1652]: time="2025-12-12T17:24:18.808003960Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Dec 12 17:24:18.808167 containerd[1652]: time="2025-12-12T17:24:18.808018320Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Dec 12 17:24:18.808167 containerd[1652]: time="2025-12-12T17:24:18.808044280Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Dec 12 17:24:18.808167 containerd[1652]: time="2025-12-12T17:24:18.808059400Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Dec 12 17:24:18.808167 containerd[1652]: time="2025-12-12T17:24:18.808071840Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Dec 12 17:24:18.808167 containerd[1652]: time="2025-12-12T17:24:18.808081880Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Dec 12 17:24:18.808167 containerd[1652]: time="2025-12-12T17:24:18.808091760Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Dec 12 17:24:18.808167 containerd[1652]: time="2025-12-12T17:24:18.808104800Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Dec 12 17:24:18.808327 containerd[1652]: time="2025-12-12T17:24:18.808262560Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Dec 12 17:24:18.808327 containerd[1652]: time="2025-12-12T17:24:18.808283320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Dec 12 17:24:18.808327 containerd[1652]: time="2025-12-12T17:24:18.808297120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Dec 12 17:24:18.808327 containerd[1652]: time="2025-12-12T17:24:18.808309200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Dec 12 17:24:18.808327 containerd[1652]: time="2025-12-12T17:24:18.808320920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Dec 12 17:24:18.808422 containerd[1652]: time="2025-12-12T17:24:18.808331520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Dec 12 17:24:18.808422 containerd[1652]: time="2025-12-12T17:24:18.808343080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Dec 12 17:24:18.808422 containerd[1652]: time="2025-12-12T17:24:18.808353640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Dec 12 17:24:18.808422 containerd[1652]: time="2025-12-12T17:24:18.808382320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Dec 12 17:24:18.808422 containerd[1652]: time="2025-12-12T17:24:18.808398120Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Dec 12 17:24:18.808422 containerd[1652]: time="2025-12-12T17:24:18.808409400Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Dec 12 17:24:18.808742 containerd[1652]: time="2025-12-12T17:24:18.808617600Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Dec 12 17:24:18.808742 containerd[1652]: time="2025-12-12T17:24:18.808644560Z" level=info msg="Start snapshots syncer"
Dec 12 17:24:18.808742 containerd[1652]: time="2025-12-12T17:24:18.808671960Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Dec 12 17:24:18.808974 containerd[1652]: time="2025-12-12T17:24:18.808934720Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Dec 12 17:24:18.809078 containerd[1652]: time="2025-12-12T17:24:18.808992520Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Dec 12 17:24:18.809078 containerd[1652]: time="2025-12-12T17:24:18.809042600Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Dec 12 17:24:18.809182 containerd[1652]: time="2025-12-12T17:24:18.809143320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Dec 12 17:24:18.809182 containerd[1652]: time="2025-12-12T17:24:18.809163480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Dec 12 17:24:18.809182 containerd[1652]: time="2025-12-12T17:24:18.809173520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Dec 12 17:24:18.809238 containerd[1652]: time="2025-12-12T17:24:18.809195200Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Dec 12 17:24:18.809238 containerd[1652]: time="2025-12-12T17:24:18.809207200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Dec 12 17:24:18.809238 containerd[1652]: time="2025-12-12T17:24:18.809218520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Dec 12 17:24:18.809238 containerd[1652]: time="2025-12-12T17:24:18.809228440Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Dec 12 17:24:18.809349 containerd[1652]: time="2025-12-12T17:24:18.809250320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Dec 12 17:24:18.809349 containerd[1652]: time="2025-12-12T17:24:18.809263000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Dec 12 17:24:18.809349 containerd[1652]: time="2025-12-12T17:24:18.809273840Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Dec 12 17:24:18.809349 containerd[1652]: time="2025-12-12T17:24:18.809314720Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Dec 12 17:24:18.809349 containerd[1652]: time="2025-12-12T17:24:18.809330200Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Dec 12 17:24:18.809349 containerd[1652]: time="2025-12-12T17:24:18.809341320Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Dec 12 17:24:18.809548 containerd[1652]: time="2025-12-12T17:24:18.809350840Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Dec 12 17:24:18.809548 containerd[1652]: time="2025-12-12T17:24:18.809359000Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Dec 12 17:24:18.809548 containerd[1652]: time="2025-12-12T17:24:18.809386600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Dec 12 17:24:18.809548 containerd[1652]: time="2025-12-12T17:24:18.809398880Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Dec 12 17:24:18.809548 containerd[1652]: time="2025-12-12T17:24:18.809493240Z" level=info msg="runtime interface created"
Dec 12 17:24:18.809548 containerd[1652]: time="2025-12-12T17:24:18.809498400Z" level=info msg="created NRI interface"
Dec 12 17:24:18.809548 containerd[1652]: time="2025-12-12T17:24:18.809506360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Dec 12 17:24:18.809548 containerd[1652]: time="2025-12-12T17:24:18.809518520Z" level=info msg="Connect containerd service"
Dec 12 17:24:18.809548 containerd[1652]: time="2025-12-12T17:24:18.809540760Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Dec 12 17:24:18.810246
containerd[1652]: time="2025-12-12T17:24:18.810218160Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 12 17:24:18.853396 kernel: EXT4-fs (vda9): resized filesystem to 12499963 Dec 12 17:24:18.886306 extend-filesystems[1649]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Dec 12 17:24:18.886306 extend-filesystems[1649]: old_desc_blocks = 1, new_desc_blocks = 6 Dec 12 17:24:18.886306 extend-filesystems[1649]: The filesystem on /dev/vda9 is now 12499963 (4k) blocks long. Dec 12 17:24:18.894611 extend-filesystems[1622]: Resized filesystem in /dev/vda9 Dec 12 17:24:18.889984 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 12 17:24:18.890199 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 12 17:24:18.900392 containerd[1652]: time="2025-12-12T17:24:18.898710840Z" level=info msg="Start subscribing containerd event" Dec 12 17:24:18.900392 containerd[1652]: time="2025-12-12T17:24:18.898791080Z" level=info msg="Start recovering state" Dec 12 17:24:18.900392 containerd[1652]: time="2025-12-12T17:24:18.898741840Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 12 17:24:18.900392 containerd[1652]: time="2025-12-12T17:24:18.898888120Z" level=info msg="Start event monitor" Dec 12 17:24:18.900392 containerd[1652]: time="2025-12-12T17:24:18.898901840Z" level=info msg="Start cni network conf syncer for default" Dec 12 17:24:18.900392 containerd[1652]: time="2025-12-12T17:24:18.898910600Z" level=info msg="Start streaming server" Dec 12 17:24:18.900392 containerd[1652]: time="2025-12-12T17:24:18.898919440Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 12 17:24:18.900392 containerd[1652]: time="2025-12-12T17:24:18.898924600Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Dec 12 17:24:18.900392 containerd[1652]: time="2025-12-12T17:24:18.898926480Z" level=info msg="runtime interface starting up..." Dec 12 17:24:18.900392 containerd[1652]: time="2025-12-12T17:24:18.898977600Z" level=info msg="starting plugins..." Dec 12 17:24:18.900392 containerd[1652]: time="2025-12-12T17:24:18.899009320Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 12 17:24:18.900392 containerd[1652]: time="2025-12-12T17:24:18.899129360Z" level=info msg="containerd successfully booted in 0.138868s" Dec 12 17:24:18.899237 systemd[1]: Started containerd.service - containerd container runtime. Dec 12 17:24:18.988954 tar[1650]: linux-arm64/README.md Dec 12 17:24:19.005836 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 12 17:24:19.464408 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:24:19.770419 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 17:24:19.839862 sshd_keygen[1642]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 12 17:24:19.859676 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 12 17:24:19.864825 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 12 17:24:19.886813 systemd[1]: issuegen.service: Deactivated successfully. Dec 12 17:24:19.887039 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 12 17:24:19.889922 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 12 17:24:19.917394 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 12 17:24:19.920549 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 12 17:24:19.922854 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Dec 12 17:24:19.924294 systemd[1]: Reached target getty.target - Login Prompts. 
Dec 12 17:24:20.430504 systemd-networkd[1513]: eth0: Gained IPv6LL
Dec 12 17:24:20.433463 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Dec 12 17:24:20.435513 systemd[1]: Reached target network-online.target - Network is Online.
Dec 12 17:24:20.438015 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:24:20.440385 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Dec 12 17:24:20.477719 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Dec 12 17:24:21.264658 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:24:21.270602 (kubelet)[1753]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:24:21.475428 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:24:21.768256 kubelet[1753]: E1212 17:24:21.768180 1753 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:24:21.770501 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:24:21.770630 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:24:21.772454 systemd[1]: kubelet.service: Consumed 709ms CPU time, 249M memory peak.
Dec 12 17:24:21.780411 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:24:25.483405 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:24:25.489114 coreos-metadata[1616]: Dec 12 17:24:25.489 WARN failed to locate config-drive, using the metadata service API instead
Dec 12 17:24:25.504316 coreos-metadata[1616]: Dec 12 17:24:25.504 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1
Dec 12 17:24:25.791390 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 17:24:25.798007 coreos-metadata[1692]: Dec 12 17:24:25.797 WARN failed to locate config-drive, using the metadata service API instead
Dec 12 17:24:25.810049 coreos-metadata[1692]: Dec 12 17:24:25.810 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1
Dec 12 17:24:25.907643 coreos-metadata[1616]: Dec 12 17:24:25.907 INFO Fetch successful
Dec 12 17:24:25.908007 coreos-metadata[1616]: Dec 12 17:24:25.907 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Dec 12 17:24:26.063586 coreos-metadata[1692]: Dec 12 17:24:26.063 INFO Fetch successful
Dec 12 17:24:26.063586 coreos-metadata[1692]: Dec 12 17:24:26.063 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1
Dec 12 17:24:27.699328 coreos-metadata[1616]: Dec 12 17:24:27.699 INFO Fetch successful
Dec 12 17:24:27.699328 coreos-metadata[1616]: Dec 12 17:24:27.699 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1
Dec 12 17:24:27.860984 coreos-metadata[1692]: Dec 12 17:24:27.860 INFO Fetch successful
Dec 12 17:24:27.862856 coreos-metadata[1616]: Dec 12 17:24:27.862 INFO Fetch successful
Dec 12 17:24:27.862856 coreos-metadata[1616]: Dec 12 17:24:27.862 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1
Dec 12 17:24:27.863604 unknown[1692]: wrote ssh authorized keys file for user: core
Dec 12 17:24:27.890252 update-ssh-keys[1773]: Updated "/home/core/.ssh/authorized_keys"
Dec 12 17:24:27.891412 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Dec 12 17:24:27.894521 systemd[1]: Finished sshkeys.service.
Dec 12 17:24:27.998070 coreos-metadata[1616]: Dec 12 17:24:27.997 INFO Fetch successful
Dec 12 17:24:27.998205 coreos-metadata[1616]: Dec 12 17:24:27.998 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1
Dec 12 17:24:28.132328 coreos-metadata[1616]: Dec 12 17:24:28.132 INFO Fetch successful
Dec 12 17:24:28.132328 coreos-metadata[1616]: Dec 12 17:24:28.132 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1
Dec 12 17:24:28.264285 coreos-metadata[1616]: Dec 12 17:24:28.264 INFO Fetch successful
Dec 12 17:24:28.288462 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Dec 12 17:24:28.289043 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Dec 12 17:24:28.289723 systemd[1]: Reached target multi-user.target - Multi-User System.
Dec 12 17:24:28.291499 systemd[1]: Startup finished in 2.953s (kernel) + 13.896s (initrd) + 11.842s (userspace) = 28.692s.
Dec 12 17:24:32.021350 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Dec 12 17:24:32.023356 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:24:32.168516 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:24:32.172154 (kubelet)[1789]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:24:32.205851 kubelet[1789]: E1212 17:24:32.205797 1789 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:24:32.208864 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:24:32.208989 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:24:32.209508 systemd[1]: kubelet.service: Consumed 142ms CPU time, 108.2M memory peak.
Dec 12 17:24:42.228205 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Dec 12 17:24:42.229517 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:24:42.273645 chronyd[1614]: Selected source PHC0
Dec 12 17:24:42.374137 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:24:42.378035 (kubelet)[1806]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:24:42.411534 kubelet[1806]: E1212 17:24:42.411470 1806 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:24:42.413887 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:24:42.414045 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:24:42.414399 systemd[1]: kubelet.service: Consumed 133ms CPU time, 107.2M memory peak.
Dec 12 17:24:52.478280 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Dec 12 17:24:52.479708 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:24:52.629797 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:24:52.633315 (kubelet)[1822]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:24:52.666600 kubelet[1822]: E1212 17:24:52.666536 1822 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:24:52.668896 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:24:52.669133 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:24:52.669501 systemd[1]: kubelet.service: Consumed 137ms CPU time, 109.6M memory peak.
Dec 12 17:25:02.728386 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Dec 12 17:25:02.729775 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:25:02.874143 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:25:02.877854 (kubelet)[1838]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:25:02.912122 kubelet[1838]: E1212 17:25:02.912050 1838 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:25:02.914358 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:25:02.914631 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:25:02.915036 systemd[1]: kubelet.service: Consumed 140ms CPU time, 107.5M memory peak.
Dec 12 17:25:03.612287 update_engine[1634]: I20251212 17:25:03.611456 1634 update_attempter.cc:509] Updating boot flags...
Dec 12 17:25:12.978327 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Dec 12 17:25:12.979752 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:25:13.179038 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:25:13.183141 (kubelet)[1868]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:25:13.214666 kubelet[1868]: E1212 17:25:13.214619 1868 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:25:13.217177 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:25:13.217301 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:25:13.218496 systemd[1]: kubelet.service: Consumed 141ms CPU time, 106.8M memory peak.
Dec 12 17:25:23.228386 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Dec 12 17:25:23.229823 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:25:23.379173 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:25:23.382725 (kubelet)[1884]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:25:23.414974 kubelet[1884]: E1212 17:25:23.414900 1884 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:25:23.417030 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:25:23.417158 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:25:23.418919 systemd[1]: kubelet.service: Consumed 135ms CPU time, 106.1M memory peak.
Dec 12 17:25:33.478263 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.
Dec 12 17:25:33.479547 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:25:33.656777 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:25:33.660295 (kubelet)[1900]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:25:33.690575 kubelet[1900]: E1212 17:25:33.690527 1900 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:25:33.692959 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:25:33.693196 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:25:33.693789 systemd[1]: kubelet.service: Consumed 135ms CPU time, 107.7M memory peak.
Dec 12 17:25:43.728405 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8.
Dec 12 17:25:43.729837 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:25:43.884996 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:25:43.888874 (kubelet)[1916]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:25:43.928703 kubelet[1916]: E1212 17:25:43.928641 1916 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:25:43.931530 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:25:43.931652 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:25:43.933565 systemd[1]: kubelet.service: Consumed 141ms CPU time, 105.4M memory peak.
Dec 12 17:25:53.978179 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9.
Dec 12 17:25:53.979802 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:25:54.138448 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:25:54.141915 (kubelet)[1932]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:25:54.172980 kubelet[1932]: E1212 17:25:54.172919 1932 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:25:54.175522 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:25:54.175775 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:25:54.176444 systemd[1]: kubelet.service: Consumed 136ms CPU time, 105.5M memory peak.
Dec 12 17:26:04.228152 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10.
Dec 12 17:26:04.229813 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:26:04.379017 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:26:04.382677 (kubelet)[1949]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:26:04.414124 kubelet[1949]: E1212 17:26:04.414067 1949 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:26:04.416597 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:26:04.416721 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:26:04.418567 systemd[1]: kubelet.service: Consumed 138ms CPU time, 106.9M memory peak.
Dec 12 17:26:14.478055 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11.
Dec 12 17:26:14.479320 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:26:14.626025 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:26:14.629469 (kubelet)[1965]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:26:14.661339 kubelet[1965]: E1212 17:26:14.661278 1965 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:26:14.663388 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:26:14.663514 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:26:14.664495 systemd[1]: kubelet.service: Consumed 138ms CPU time, 107.1M memory peak.
Dec 12 17:26:24.728427 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12.
Dec 12 17:26:24.730391 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:26:24.888508 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:26:24.892610 (kubelet)[1981]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:26:24.925249 kubelet[1981]: E1212 17:26:24.925172 1981 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:26:24.927517 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:26:24.927650 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:26:24.928189 systemd[1]: kubelet.service: Consumed 141ms CPU time, 105.5M memory peak.
Dec 12 17:26:34.978126 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 13.
Dec 12 17:26:34.979467 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:26:35.149263 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:26:35.153248 (kubelet)[1997]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:26:35.187602 kubelet[1997]: E1212 17:26:35.187544 1997 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:26:35.190311 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:26:35.190460 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:26:35.191048 systemd[1]: kubelet.service: Consumed 144ms CPU time, 110.9M memory peak.
Dec 12 17:26:45.228294 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 14.
Dec 12 17:26:45.229788 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:26:45.387106 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:26:45.390828 (kubelet)[2013]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:26:45.421436 kubelet[2013]: E1212 17:26:45.421352 2013 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:26:45.423571 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:26:45.423695 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:26:45.425454 systemd[1]: kubelet.service: Consumed 140ms CPU time, 106.2M memory peak.
Dec 12 17:26:55.478304 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 15.
Dec 12 17:26:55.479910 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:26:55.634100 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:26:55.638204 (kubelet)[2029]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:26:55.668778 kubelet[2029]: E1212 17:26:55.668702 2029 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:26:55.670836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:26:55.670955 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:26:55.672473 systemd[1]: kubelet.service: Consumed 138ms CPU time, 107.1M memory peak.
Dec 12 17:27:05.728410 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 16.
Dec 12 17:27:05.730082 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:27:05.884517 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:27:05.888658 (kubelet)[2046]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:27:05.920806 kubelet[2046]: E1212 17:27:05.920725 2046 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:27:05.922822 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:27:05.922943 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:27:05.923461 systemd[1]: kubelet.service: Consumed 140ms CPU time, 107.1M memory peak.
Dec 12 17:27:15.978129 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 17.
Dec 12 17:27:15.980129 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:27:16.136238 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:27:16.139953 (kubelet)[2062]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:27:16.171095 kubelet[2062]: E1212 17:27:16.171036 2062 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:27:16.173510 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:27:16.173630 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:27:16.175697 systemd[1]: kubelet.service: Consumed 141ms CPU time, 107.2M memory peak.
Dec 12 17:27:26.228146 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 18.
Dec 12 17:27:26.229468 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:27:26.372671 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:27:26.375954 (kubelet)[2079]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:27:26.405822 kubelet[2079]: E1212 17:27:26.405774 2079 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:27:26.408026 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:27:26.408151 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:27:26.409500 systemd[1]: kubelet.service: Consumed 136ms CPU time, 107.6M memory peak.
Dec 12 17:27:36.478349 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 19. Dec 12 17:27:36.479913 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:27:36.631730 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:27:36.635857 (kubelet)[2095]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:27:36.669768 kubelet[2095]: E1212 17:27:36.669689 2095 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:27:36.671953 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:27:36.672202 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:27:36.673531 systemd[1]: kubelet.service: Consumed 146ms CPU time, 107.1M memory peak. Dec 12 17:27:46.728235 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 20. Dec 12 17:27:46.730206 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:27:46.885020 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:27:46.889054 (kubelet)[2112]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:27:46.919651 kubelet[2112]: E1212 17:27:46.919598 2112 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:27:46.922030 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:27:46.922283 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:27:46.922735 systemd[1]: kubelet.service: Consumed 139ms CPU time, 107.2M memory peak. Dec 12 17:27:56.978256 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 21. Dec 12 17:27:56.979686 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:27:57.154232 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:27:57.158290 (kubelet)[2128]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:27:57.189238 kubelet[2128]: E1212 17:27:57.189169 2128 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:27:57.191715 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:27:57.191855 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:27:57.193195 systemd[1]: kubelet.service: Consumed 138ms CPU time, 107.1M memory peak. 
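The repeated failure in these entries is the standard pre-bootstrap kubelet crash loop: the process exits with status 1 because /var/lib/kubelet/config.yaml does not exist yet (that file is normally written by `kubeadm init` or `kubeadm join`), and systemd's restart policy starts the unit again roughly every ten seconds, incrementing the restart counter each time. A minimal check for this condition, sketched in Python with the path taken from the error message above:

```python
from pathlib import Path

def kubelet_config_state(path="/var/lib/kubelet/config.yaml"):
    """Report whether the kubelet's config file exists.

    Until kubeadm init/join writes this file, the kubelet exits with
    status 1 and systemd restarts it on a timer -- exactly the loop
    recorded in the journal above.
    """
    return "present" if Path(path).is_file() else "missing"

if __name__ == "__main__":
    print(kubelet_config_state())
```

On a node in this state the function returns "missing"; once bootstrap has run it returns "present" and the crash loop stops on the next restart.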
Dec 12 17:28:07.228222 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 22. Dec 12 17:28:07.229530 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:28:07.395925 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:28:07.399956 (kubelet)[2144]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:28:07.430926 kubelet[2144]: E1212 17:28:07.430853 2144 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:28:07.432846 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:28:07.432966 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:28:07.434460 systemd[1]: kubelet.service: Consumed 139ms CPU time, 106M memory peak. Dec 12 17:28:17.478398 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 23. Dec 12 17:28:17.479797 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:28:17.634610 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:28:17.638700 (kubelet)[2160]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:28:17.673851 kubelet[2160]: E1212 17:28:17.673805 2160 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:28:17.676106 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:28:17.676237 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:28:17.678482 systemd[1]: kubelet.service: Consumed 144ms CPU time, 107.8M memory peak. Dec 12 17:28:26.786797 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 12 17:28:26.788022 systemd[1]: Started sshd@0-10.0.10.18:22-147.75.109.163:48594.service - OpenSSH per-connection server daemon (147.75.109.163:48594). Dec 12 17:28:27.728331 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 24. Dec 12 17:28:27.730184 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:28:27.775933 sshd[2170]: Accepted publickey for core from 147.75.109.163 port 48594 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:28:27.779413 sshd-session[2170]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:27.786788 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 12 17:28:27.787996 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 12 17:28:27.794858 systemd-logind[1632]: New session 1 of user core. Dec 12 17:28:27.814566 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. 
Dec 12 17:28:27.816886 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 12 17:28:27.840006 (systemd)[2178]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 12 17:28:27.842318 systemd-logind[1632]: New session c1 of user core. Dec 12 17:28:27.942237 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:28:27.955020 (kubelet)[2189]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:28:27.975040 systemd[2178]: Queued start job for default target default.target. Dec 12 17:28:27.983310 systemd[2178]: Created slice app.slice - User Application Slice. Dec 12 17:28:27.983342 systemd[2178]: Reached target paths.target - Paths. Dec 12 17:28:27.983397 systemd[2178]: Reached target timers.target - Timers. Dec 12 17:28:27.984559 systemd[2178]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 12 17:28:27.989333 kubelet[2189]: E1212 17:28:27.989270 2189 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:28:27.992418 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:28:27.992555 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:28:27.993067 systemd[1]: kubelet.service: Consumed 141ms CPU time, 107.4M memory peak. Dec 12 17:28:27.996702 systemd[2178]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 12 17:28:27.996816 systemd[2178]: Reached target sockets.target - Sockets. Dec 12 17:28:27.996853 systemd[2178]: Reached target basic.target - Basic System. Dec 12 17:28:27.996884 systemd[2178]: Reached target default.target - Main User Target. 
Dec 12 17:28:27.996910 systemd[2178]: Startup finished in 148ms. Dec 12 17:28:27.997000 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 12 17:28:28.006560 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 12 17:28:28.682881 systemd[1]: Started sshd@1-10.0.10.18:22-147.75.109.163:48602.service - OpenSSH per-connection server daemon (147.75.109.163:48602). Dec 12 17:28:29.643134 sshd[2203]: Accepted publickey for core from 147.75.109.163 port 48602 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:28:29.644506 sshd-session[2203]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:29.648210 systemd-logind[1632]: New session 2 of user core. Dec 12 17:28:29.661582 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 12 17:28:30.306792 sshd[2206]: Connection closed by 147.75.109.163 port 48602 Dec 12 17:28:30.307256 sshd-session[2203]: pam_unix(sshd:session): session closed for user core Dec 12 17:28:30.310506 systemd[1]: sshd@1-10.0.10.18:22-147.75.109.163:48602.service: Deactivated successfully. Dec 12 17:28:30.313754 systemd[1]: session-2.scope: Deactivated successfully. Dec 12 17:28:30.314432 systemd-logind[1632]: Session 2 logged out. Waiting for processes to exit. Dec 12 17:28:30.315502 systemd-logind[1632]: Removed session 2. Dec 12 17:28:30.499762 systemd[1]: Started sshd@2-10.0.10.18:22-147.75.109.163:48606.service - OpenSSH per-connection server daemon (147.75.109.163:48606). Dec 12 17:28:31.558360 sshd[2212]: Accepted publickey for core from 147.75.109.163 port 48606 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:28:31.559567 sshd-session[2212]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:31.563192 systemd-logind[1632]: New session 3 of user core. Dec 12 17:28:31.574771 systemd[1]: Started session-3.scope - Session 3 of User core. 
Dec 12 17:28:32.275314 sshd[2215]: Connection closed by 147.75.109.163 port 48606 Dec 12 17:28:32.275921 sshd-session[2212]: pam_unix(sshd:session): session closed for user core Dec 12 17:28:32.279505 systemd[1]: sshd@2-10.0.10.18:22-147.75.109.163:48606.service: Deactivated successfully. Dec 12 17:28:32.281946 systemd[1]: session-3.scope: Deactivated successfully. Dec 12 17:28:32.282661 systemd-logind[1632]: Session 3 logged out. Waiting for processes to exit. Dec 12 17:28:32.283990 systemd-logind[1632]: Removed session 3. Dec 12 17:28:32.429883 systemd[1]: Started sshd@3-10.0.10.18:22-147.75.109.163:58848.service - OpenSSH per-connection server daemon (147.75.109.163:58848). Dec 12 17:28:33.384583 sshd[2221]: Accepted publickey for core from 147.75.109.163 port 58848 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:28:33.386066 sshd-session[2221]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:33.390442 systemd-logind[1632]: New session 4 of user core. Dec 12 17:28:33.396541 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 12 17:28:34.045736 sshd[2224]: Connection closed by 147.75.109.163 port 58848 Dec 12 17:28:34.046247 sshd-session[2221]: pam_unix(sshd:session): session closed for user core Dec 12 17:28:34.049491 systemd[1]: sshd@3-10.0.10.18:22-147.75.109.163:58848.service: Deactivated successfully. Dec 12 17:28:34.050923 systemd[1]: session-4.scope: Deactivated successfully. Dec 12 17:28:34.051557 systemd-logind[1632]: Session 4 logged out. Waiting for processes to exit. Dec 12 17:28:34.052808 systemd-logind[1632]: Removed session 4. Dec 12 17:28:34.215543 systemd[1]: Started sshd@4-10.0.10.18:22-147.75.109.163:58852.service - OpenSSH per-connection server daemon (147.75.109.163:58852). 
Dec 12 17:28:35.175287 sshd[2230]: Accepted publickey for core from 147.75.109.163 port 58852 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:28:35.176589 sshd-session[2230]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:35.180425 systemd-logind[1632]: New session 5 of user core. Dec 12 17:28:35.189657 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 12 17:28:35.696743 sudo[2234]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 12 17:28:35.697010 sudo[2234]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:28:35.717560 sudo[2234]: pam_unix(sudo:session): session closed for user root Dec 12 17:28:35.873657 sshd[2233]: Connection closed by 147.75.109.163 port 58852 Dec 12 17:28:35.874113 sshd-session[2230]: pam_unix(sshd:session): session closed for user core Dec 12 17:28:35.877996 systemd[1]: sshd@4-10.0.10.18:22-147.75.109.163:58852.service: Deactivated successfully. Dec 12 17:28:35.880299 systemd[1]: session-5.scope: Deactivated successfully. Dec 12 17:28:35.881061 systemd-logind[1632]: Session 5 logged out. Waiting for processes to exit. Dec 12 17:28:35.882255 systemd-logind[1632]: Removed session 5. Dec 12 17:28:36.041980 systemd[1]: Started sshd@5-10.0.10.18:22-147.75.109.163:58868.service - OpenSSH per-connection server daemon (147.75.109.163:58868). Dec 12 17:28:37.007878 sshd[2240]: Accepted publickey for core from 147.75.109.163 port 58868 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:28:37.008511 sshd-session[2240]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:37.015187 systemd-logind[1632]: New session 6 of user core. Dec 12 17:28:37.024572 systemd[1]: Started session-6.scope - Session 6 of User core. 
Dec 12 17:28:37.515275 sudo[2245]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 12 17:28:37.515551 sudo[2245]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:28:37.520040 sudo[2245]: pam_unix(sudo:session): session closed for user root Dec 12 17:28:37.524800 sudo[2244]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 12 17:28:37.525064 sudo[2244]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:28:37.533515 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 17:28:37.576639 augenrules[2267]: No rules Dec 12 17:28:37.577281 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 17:28:37.577501 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 17:28:37.578470 sudo[2244]: pam_unix(sudo:session): session closed for user root Dec 12 17:28:37.732950 sshd[2243]: Connection closed by 147.75.109.163 port 58868 Dec 12 17:28:37.733257 sshd-session[2240]: pam_unix(sshd:session): session closed for user core Dec 12 17:28:37.737666 systemd[1]: sshd@5-10.0.10.18:22-147.75.109.163:58868.service: Deactivated successfully. Dec 12 17:28:37.739148 systemd[1]: session-6.scope: Deactivated successfully. Dec 12 17:28:37.740942 systemd-logind[1632]: Session 6 logged out. Waiting for processes to exit. Dec 12 17:28:37.741870 systemd-logind[1632]: Removed session 6. Dec 12 17:28:37.913666 systemd[1]: Started sshd@6-10.0.10.18:22-147.75.109.163:58876.service - OpenSSH per-connection server daemon (147.75.109.163:58876). Dec 12 17:28:38.228345 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 25. Dec 12 17:28:38.232407 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:28:38.389098 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:28:38.392867 (kubelet)[2287]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:28:38.422479 kubelet[2287]: E1212 17:28:38.422423 2287 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:28:38.424763 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:28:38.424890 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:28:38.425404 systemd[1]: kubelet.service: Consumed 140ms CPU time, 107.4M memory peak. Dec 12 17:28:38.970781 sshd[2276]: Accepted publickey for core from 147.75.109.163 port 58876 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:28:38.972128 sshd-session[2276]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:38.976189 systemd-logind[1632]: New session 7 of user core. Dec 12 17:28:38.985549 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 12 17:28:39.515513 sudo[2296]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 12 17:28:39.515779 sudo[2296]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:28:39.827761 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Dec 12 17:28:39.842039 (dockerd)[2316]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 12 17:28:40.068160 dockerd[2316]: time="2025-12-12T17:28:40.067803202Z" level=info msg="Starting up" Dec 12 17:28:40.068709 dockerd[2316]: time="2025-12-12T17:28:40.068672884Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 12 17:28:40.079500 dockerd[2316]: time="2025-12-12T17:28:40.079358112Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 12 17:28:40.134275 dockerd[2316]: time="2025-12-12T17:28:40.134109014Z" level=info msg="Loading containers: start." Dec 12 17:28:40.143415 kernel: Initializing XFRM netlink socket Dec 12 17:28:40.368117 systemd-networkd[1513]: docker0: Link UP Dec 12 17:28:40.372311 dockerd[2316]: time="2025-12-12T17:28:40.372246273Z" level=info msg="Loading containers: done." 
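The "Referenced but unset environment variable" notices logged for both kubelet.service and docker.service are informational: the unit files expand variables such as $KUBELET_EXTRA_ARGS or $DOCKER_OPTS that no environment file currently sets, so they evaluate to empty strings. If a value is actually wanted, a systemd drop-in can supply it; the file path and flag below are illustrative, not taken from this host:

```ini
# /etc/systemd/system/kubelet.service.d/20-extra-args.conf (hypothetical)
[Service]
Environment="KUBELET_EXTRA_ARGS=--node-ip=10.0.10.18"
```

After adding a drop-in, `systemctl daemon-reload` is needed before the next unit restart picks it up.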
Dec 12 17:28:40.388569 dockerd[2316]: time="2025-12-12T17:28:40.388511435Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 12 17:28:40.388711 dockerd[2316]: time="2025-12-12T17:28:40.388599555Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 12 17:28:40.388711 dockerd[2316]: time="2025-12-12T17:28:40.388676355Z" level=info msg="Initializing buildkit" Dec 12 17:28:40.409401 dockerd[2316]: time="2025-12-12T17:28:40.409336969Z" level=info msg="Completed buildkit initialization" Dec 12 17:28:40.416095 dockerd[2316]: time="2025-12-12T17:28:40.416056227Z" level=info msg="Daemon has completed initialization" Dec 12 17:28:40.416252 dockerd[2316]: time="2025-12-12T17:28:40.416117987Z" level=info msg="API listen on /run/docker.sock" Dec 12 17:28:40.416307 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 12 17:28:41.207427 containerd[1652]: time="2025-12-12T17:28:41.207213522Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Dec 12 17:28:41.777401 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2435210250.mount: Deactivated successfully. 
Dec 12 17:28:42.447646 containerd[1652]: time="2025-12-12T17:28:42.447597104Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:42.448726 containerd[1652]: time="2025-12-12T17:28:42.448688987Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=24571138" Dec 12 17:28:42.449574 containerd[1652]: time="2025-12-12T17:28:42.449534789Z" level=info msg="ImageCreate event name:\"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:42.453388 containerd[1652]: time="2025-12-12T17:28:42.453294839Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:42.456052 containerd[1652]: time="2025-12-12T17:28:42.455801485Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"24567639\" in 1.248545243s" Dec 12 17:28:42.456052 containerd[1652]: time="2025-12-12T17:28:42.455872445Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\"" Dec 12 17:28:42.456517 containerd[1652]: time="2025-12-12T17:28:42.456434647Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Dec 12 17:28:43.360358 containerd[1652]: time="2025-12-12T17:28:43.360287475Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:43.362044 containerd[1652]: time="2025-12-12T17:28:43.361991319Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=19135497" Dec 12 17:28:43.362997 containerd[1652]: time="2025-12-12T17:28:43.362954802Z" level=info msg="ImageCreate event name:\"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:43.366703 containerd[1652]: time="2025-12-12T17:28:43.366658611Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:43.368137 containerd[1652]: time="2025-12-12T17:28:43.368096135Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"20719958\" in 911.630208ms" Dec 12 17:28:43.368137 containerd[1652]: time="2025-12-12T17:28:43.368135295Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\"" Dec 12 17:28:43.369127 containerd[1652]: time="2025-12-12T17:28:43.369090778Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Dec 12 17:28:44.158961 containerd[1652]: time="2025-12-12T17:28:44.158912429Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:44.160392 containerd[1652]: time="2025-12-12T17:28:44.160341113Z" 
level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=14191736" Dec 12 17:28:44.161482 containerd[1652]: time="2025-12-12T17:28:44.161430276Z" level=info msg="ImageCreate event name:\"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:44.164562 containerd[1652]: time="2025-12-12T17:28:44.164535564Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:44.165553 containerd[1652]: time="2025-12-12T17:28:44.165522406Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"15776215\" in 796.394508ms" Dec 12 17:28:44.165681 containerd[1652]: time="2025-12-12T17:28:44.165643847Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\"" Dec 12 17:28:44.166157 containerd[1652]: time="2025-12-12T17:28:44.166123448Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Dec 12 17:28:45.052750 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount273442765.mount: Deactivated successfully. 
Dec 12 17:28:45.233249 containerd[1652]: time="2025-12-12T17:28:45.233185820Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:45.234409 containerd[1652]: time="2025-12-12T17:28:45.234380383Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=22805279" Dec 12 17:28:45.235949 containerd[1652]: time="2025-12-12T17:28:45.235920827Z" level=info msg="ImageCreate event name:\"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:45.238395 containerd[1652]: time="2025-12-12T17:28:45.238335553Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:45.238995 containerd[1652]: time="2025-12-12T17:28:45.238967235Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"22804272\" in 1.072798427s" Dec 12 17:28:45.239097 containerd[1652]: time="2025-12-12T17:28:45.239081515Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\"" Dec 12 17:28:45.239767 containerd[1652]: time="2025-12-12T17:28:45.239741357Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Dec 12 17:28:45.831575 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2728870380.mount: Deactivated successfully. 
Dec 12 17:28:46.425845 containerd[1652]: time="2025-12-12T17:28:46.425800878Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:46.427103 containerd[1652]: time="2025-12-12T17:28:46.426563560Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=20395498" Dec 12 17:28:46.430263 containerd[1652]: time="2025-12-12T17:28:46.427899283Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:46.433706 containerd[1652]: time="2025-12-12T17:28:46.433669658Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:46.435181 containerd[1652]: time="2025-12-12T17:28:46.435147742Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.195372785s" Dec 12 17:28:46.435660 containerd[1652]: time="2025-12-12T17:28:46.435639063Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\"" Dec 12 17:28:46.436275 containerd[1652]: time="2025-12-12T17:28:46.436240065Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Dec 12 17:28:46.995877 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2637996244.mount: Deactivated successfully. 
Dec 12 17:28:47.002186 containerd[1652]: time="2025-12-12T17:28:47.001567133Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:47.002420 containerd[1652]: time="2025-12-12T17:28:47.002398296Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268729" Dec 12 17:28:47.003221 containerd[1652]: time="2025-12-12T17:28:47.003196098Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:47.006178 containerd[1652]: time="2025-12-12T17:28:47.006123225Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:47.006992 containerd[1652]: time="2025-12-12T17:28:47.006954147Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 570.596282ms" Dec 12 17:28:47.007107 containerd[1652]: time="2025-12-12T17:28:47.007091268Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Dec 12 17:28:47.007720 containerd[1652]: time="2025-12-12T17:28:47.007701109Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Dec 12 17:28:47.544753 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount765889278.mount: Deactivated successfully. Dec 12 17:28:48.477877 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 26. 
Dec 12 17:28:48.479340 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:28:48.624572 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:28:48.636883 (kubelet)[2722]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:28:48.677729 kubelet[2722]: E1212 17:28:48.677632 2722 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:28:48.680093 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:28:48.680223 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:28:48.680533 systemd[1]: kubelet.service: Consumed 147ms CPU time, 107.5M memory peak. 
Dec 12 17:28:49.250068 containerd[1652]: time="2025-12-12T17:28:49.249984934Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:49.250495 containerd[1652]: time="2025-12-12T17:28:49.250459335Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=98063043" Dec 12 17:28:49.252936 containerd[1652]: time="2025-12-12T17:28:49.252880301Z" level=info msg="ImageCreate event name:\"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:49.255899 containerd[1652]: time="2025-12-12T17:28:49.255853109Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:28:49.257031 containerd[1652]: time="2025-12-12T17:28:49.256993672Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"98207481\" in 2.249191162s" Dec 12 17:28:49.257225 containerd[1652]: time="2025-12-12T17:28:49.257126272Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\"" Dec 12 17:28:56.104491 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:28:56.104633 systemd[1]: kubelet.service: Consumed 147ms CPU time, 107.5M memory peak. Dec 12 17:28:56.106562 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:28:56.133657 systemd[1]: Reload requested from client PID 2763 ('systemctl') (unit session-7.scope)... 
Dec 12 17:28:56.133674 systemd[1]: Reloading... Dec 12 17:28:56.212489 zram_generator::config[2805]: No configuration found. Dec 12 17:28:56.385138 systemd[1]: Reloading finished in 251 ms. Dec 12 17:28:56.438894 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 12 17:28:56.438979 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 12 17:28:56.439310 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:28:56.439358 systemd[1]: kubelet.service: Consumed 96ms CPU time, 95M memory peak. Dec 12 17:28:56.441156 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:28:56.575860 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:28:56.587897 (kubelet)[2853]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 17:28:56.625073 kubelet[2853]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 17:28:56.625073 kubelet[2853]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 12 17:28:56.626546 kubelet[2853]: I1212 17:28:56.626479 2853 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 17:28:57.490658 kubelet[2853]: I1212 17:28:57.490609 2853 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 12 17:28:57.490658 kubelet[2853]: I1212 17:28:57.490642 2853 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 17:28:57.493076 kubelet[2853]: I1212 17:28:57.493021 2853 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 12 17:28:57.493076 kubelet[2853]: I1212 17:28:57.493044 2853 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 12 17:28:57.493314 kubelet[2853]: I1212 17:28:57.493286 2853 server.go:956] "Client rotation is on, will bootstrap in background" Dec 12 17:28:57.501402 kubelet[2853]: E1212 17:28:57.501353 2853 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.10.18:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.10.18:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Dec 12 17:28:57.503638 kubelet[2853]: I1212 17:28:57.502946 2853 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 17:28:57.506817 kubelet[2853]: I1212 17:28:57.506799 2853 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 17:28:57.509475 kubelet[2853]: I1212 17:28:57.509445 2853 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Dec 12 17:28:57.509691 kubelet[2853]: I1212 17:28:57.509666 2853 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 17:28:57.509844 kubelet[2853]: I1212 17:28:57.509691 2853 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-d-e796afb129","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 17:28:57.509920 kubelet[2853]: I1212 17:28:57.509846 2853 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 
17:28:57.509920 kubelet[2853]: I1212 17:28:57.509856 2853 container_manager_linux.go:306] "Creating device plugin manager" Dec 12 17:28:57.509961 kubelet[2853]: I1212 17:28:57.509948 2853 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 12 17:28:57.512526 kubelet[2853]: I1212 17:28:57.512504 2853 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:28:57.514259 kubelet[2853]: I1212 17:28:57.514220 2853 kubelet.go:475] "Attempting to sync node with API server" Dec 12 17:28:57.514259 kubelet[2853]: I1212 17:28:57.514252 2853 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 17:28:57.514905 kubelet[2853]: E1212 17:28:57.514863 2853 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.10.18:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-d-e796afb129&limit=500&resourceVersion=0\": dial tcp 10.0.10.18:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 12 17:28:57.515507 kubelet[2853]: I1212 17:28:57.515488 2853 kubelet.go:387] "Adding apiserver pod source" Dec 12 17:28:57.515539 kubelet[2853]: I1212 17:28:57.515523 2853 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 17:28:57.516810 kubelet[2853]: E1212 17:28:57.516775 2853 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.10.18:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.10.18:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 12 17:28:57.517995 kubelet[2853]: I1212 17:28:57.517975 2853 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 17:28:57.518654 kubelet[2853]: I1212 17:28:57.518637 2853 kubelet.go:940] "Not starting 
ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 12 17:28:57.518688 kubelet[2853]: I1212 17:28:57.518669 2853 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 12 17:28:57.518710 kubelet[2853]: W1212 17:28:57.518702 2853 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Dec 12 17:28:57.521044 kubelet[2853]: I1212 17:28:57.521024 2853 server.go:1262] "Started kubelet" Dec 12 17:28:57.521402 kubelet[2853]: I1212 17:28:57.521356 2853 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 12 17:28:57.521625 kubelet[2853]: I1212 17:28:57.521577 2853 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 17:28:57.521654 kubelet[2853]: I1212 17:28:57.521643 2853 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 12 17:28:57.521959 kubelet[2853]: I1212 17:28:57.521916 2853 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 17:28:57.522229 kubelet[2853]: I1212 17:28:57.522210 2853 server.go:310] "Adding debug handlers to kubelet server" Dec 12 17:28:57.523053 kubelet[2853]: I1212 17:28:57.523033 2853 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 17:28:57.527409 kubelet[2853]: I1212 17:28:57.524583 2853 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 12 17:28:57.527409 kubelet[2853]: E1212 17:28:57.524825 2853 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-d-e796afb129\" not found" Dec 12 17:28:57.527409 kubelet[2853]: I1212 17:28:57.525210 2853 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 12 17:28:57.527409 kubelet[2853]: I1212 
17:28:57.525291 2853 reconciler.go:29] "Reconciler: start to sync state" Dec 12 17:28:57.527409 kubelet[2853]: I1212 17:28:57.525595 2853 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 17:28:57.530928 kubelet[2853]: E1212 17:28:57.530871 2853 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.10.18:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-d-e796afb129?timeout=10s\": dial tcp 10.0.10.18:6443: connect: connection refused" interval="200ms" Dec 12 17:28:57.531459 kubelet[2853]: E1212 17:28:57.530912 2853 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.10.18:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.10.18:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 12 17:28:57.531854 kubelet[2853]: I1212 17:28:57.531791 2853 factory.go:223] Registration of the systemd container factory successfully Dec 12 17:28:57.531964 kubelet[2853]: I1212 17:28:57.531922 2853 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 17:28:57.534390 kubelet[2853]: E1212 17:28:57.530223 2853 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.10.18:6443/api/v1/namespaces/default/events\": dial tcp 10.0.10.18:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-d-e796afb129.188087f45c62ed0b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-d-e796afb129,UID:ci-4459-2-2-d-e796afb129,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-d-e796afb129,},FirstTimestamp:2025-12-12 17:28:57.520991499 +0000 UTC m=+0.930105497,LastTimestamp:2025-12-12 17:28:57.520991499 +0000 UTC m=+0.930105497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-d-e796afb129,}" Dec 12 17:28:57.534390 kubelet[2853]: E1212 17:28:57.533175 2853 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 17:28:57.534390 kubelet[2853]: I1212 17:28:57.533216 2853 factory.go:223] Registration of the containerd container factory successfully Dec 12 17:28:57.544427 kubelet[2853]: I1212 17:28:57.544402 2853 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 17:28:57.544563 kubelet[2853]: I1212 17:28:57.544551 2853 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 17:28:57.544617 kubelet[2853]: I1212 17:28:57.544610 2853 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:28:57.546499 kubelet[2853]: I1212 17:28:57.546455 2853 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 12 17:28:57.547636 kubelet[2853]: I1212 17:28:57.547420 2853 policy_none.go:49] "None policy: Start" Dec 12 17:28:57.547636 kubelet[2853]: I1212 17:28:57.547442 2853 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 12 17:28:57.547636 kubelet[2853]: I1212 17:28:57.547452 2853 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 12 17:28:57.547790 kubelet[2853]: I1212 17:28:57.547657 2853 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Dec 12 17:28:57.547790 kubelet[2853]: I1212 17:28:57.547671 2853 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 12 17:28:57.547790 kubelet[2853]: I1212 17:28:57.547707 2853 kubelet.go:2427] "Starting kubelet main sync loop" Dec 12 17:28:57.547790 kubelet[2853]: E1212 17:28:57.547745 2853 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 17:28:57.550221 kubelet[2853]: I1212 17:28:57.549444 2853 policy_none.go:47] "Start" Dec 12 17:28:57.551726 kubelet[2853]: E1212 17:28:57.551677 2853 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.10.18:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.10.18:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 12 17:28:57.555249 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 12 17:28:57.567276 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 12 17:28:57.570508 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Dec 12 17:28:57.587690 kubelet[2853]: E1212 17:28:57.587635 2853 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 12 17:28:57.588078 kubelet[2853]: I1212 17:28:57.587887 2853 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 17:28:57.588078 kubelet[2853]: I1212 17:28:57.587905 2853 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 17:28:57.588622 kubelet[2853]: I1212 17:28:57.588465 2853 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 17:28:57.589090 kubelet[2853]: E1212 17:28:57.588998 2853 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 12 17:28:57.589090 kubelet[2853]: E1212 17:28:57.589036 2853 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-d-e796afb129\" not found" Dec 12 17:28:57.659183 systemd[1]: Created slice kubepods-burstable-pod2d21f3c68ade6c5e71955b06908d1830.slice - libcontainer container kubepods-burstable-pod2d21f3c68ade6c5e71955b06908d1830.slice. Dec 12 17:28:57.671646 kubelet[2853]: E1212 17:28:57.671591 2853 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-d-e796afb129\" not found" node="ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.674911 systemd[1]: Created slice kubepods-burstable-podb224700510742149228982abf9add67d.slice - libcontainer container kubepods-burstable-podb224700510742149228982abf9add67d.slice. 
Dec 12 17:28:57.690570 kubelet[2853]: I1212 17:28:57.690545 2853 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.690995 kubelet[2853]: E1212 17:28:57.690968 2853 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.10.18:6443/api/v1/nodes\": dial tcp 10.0.10.18:6443: connect: connection refused" node="ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.693696 kubelet[2853]: E1212 17:28:57.693662 2853 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-d-e796afb129\" not found" node="ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.695941 systemd[1]: Created slice kubepods-burstable-pod3478b1edc119b35a7ca6c7fef321b618.slice - libcontainer container kubepods-burstable-pod3478b1edc119b35a7ca6c7fef321b618.slice. Dec 12 17:28:57.697772 kubelet[2853]: E1212 17:28:57.697720 2853 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-d-e796afb129\" not found" node="ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.727172 kubelet[2853]: I1212 17:28:57.727112 2853 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2d21f3c68ade6c5e71955b06908d1830-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-d-e796afb129\" (UID: \"2d21f3c68ade6c5e71955b06908d1830\") " pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.727275 kubelet[2853]: I1212 17:28:57.727182 2853 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2d21f3c68ade6c5e71955b06908d1830-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-d-e796afb129\" (UID: \"2d21f3c68ade6c5e71955b06908d1830\") " pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.727275 kubelet[2853]: I1212 17:28:57.727211 2853 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b224700510742149228982abf9add67d-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-d-e796afb129\" (UID: \"b224700510742149228982abf9add67d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.727275 kubelet[2853]: I1212 17:28:57.727232 2853 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b224700510742149228982abf9add67d-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-d-e796afb129\" (UID: \"b224700510742149228982abf9add67d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.727275 kubelet[2853]: I1212 17:28:57.727247 2853 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b224700510742149228982abf9add67d-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-d-e796afb129\" (UID: \"b224700510742149228982abf9add67d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.727275 kubelet[2853]: I1212 17:28:57.727262 2853 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3478b1edc119b35a7ca6c7fef321b618-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-d-e796afb129\" (UID: \"3478b1edc119b35a7ca6c7fef321b618\") " pod="kube-system/kube-scheduler-ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.727398 kubelet[2853]: I1212 17:28:57.727277 2853 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2d21f3c68ade6c5e71955b06908d1830-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-d-e796afb129\" (UID: 
\"2d21f3c68ade6c5e71955b06908d1830\") " pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.727398 kubelet[2853]: I1212 17:28:57.727326 2853 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b224700510742149228982abf9add67d-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-d-e796afb129\" (UID: \"b224700510742149228982abf9add67d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.727398 kubelet[2853]: I1212 17:28:57.727363 2853 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b224700510742149228982abf9add67d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-d-e796afb129\" (UID: \"b224700510742149228982abf9add67d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.731642 kubelet[2853]: E1212 17:28:57.731604 2853 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.10.18:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-d-e796afb129?timeout=10s\": dial tcp 10.0.10.18:6443: connect: connection refused" interval="400ms" Dec 12 17:28:57.892857 kubelet[2853]: I1212 17:28:57.892755 2853 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.894332 kubelet[2853]: E1212 17:28:57.894282 2853 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.10.18:6443/api/v1/nodes\": dial tcp 10.0.10.18:6443: connect: connection refused" node="ci-4459-2-2-d-e796afb129" Dec 12 17:28:57.976777 containerd[1652]: time="2025-12-12T17:28:57.976713563Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-d-e796afb129,Uid:2d21f3c68ade6c5e71955b06908d1830,Namespace:kube-system,Attempt:0,}" Dec 12 17:28:57.997416 containerd[1652]: time="2025-12-12T17:28:57.997359376Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-d-e796afb129,Uid:b224700510742149228982abf9add67d,Namespace:kube-system,Attempt:0,}" Dec 12 17:28:58.003124 containerd[1652]: time="2025-12-12T17:28:58.002936191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-d-e796afb129,Uid:3478b1edc119b35a7ca6c7fef321b618,Namespace:kube-system,Attempt:0,}" Dec 12 17:28:58.132663 kubelet[2853]: E1212 17:28:58.132587 2853 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.10.18:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-d-e796afb129?timeout=10s\": dial tcp 10.0.10.18:6443: connect: connection refused" interval="800ms" Dec 12 17:28:58.295888 kubelet[2853]: I1212 17:28:58.295856 2853 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-d-e796afb129" Dec 12 17:28:58.296540 kubelet[2853]: E1212 17:28:58.296506 2853 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.10.18:6443/api/v1/nodes\": dial tcp 10.0.10.18:6443: connect: connection refused" node="ci-4459-2-2-d-e796afb129" Dec 12 17:28:58.479942 kubelet[2853]: E1212 17:28:58.479897 2853 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.10.18:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-d-e796afb129&limit=500&resourceVersion=0\": dial tcp 10.0.10.18:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 12 17:28:58.528785 kubelet[2853]: E1212 17:28:58.528747 2853 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get 
\"https://10.0.10.18:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.10.18:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 12 17:28:58.529539 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1031601457.mount: Deactivated successfully. Dec 12 17:28:58.535020 containerd[1652]: time="2025-12-12T17:28:58.534947973Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:28:58.537291 containerd[1652]: time="2025-12-12T17:28:58.537260499Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Dec 12 17:28:58.540962 containerd[1652]: time="2025-12-12T17:28:58.540921068Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:28:58.544420 containerd[1652]: time="2025-12-12T17:28:58.543876436Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:28:58.547714 containerd[1652]: time="2025-12-12T17:28:58.547606966Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:28:58.549897 containerd[1652]: time="2025-12-12T17:28:58.549837531Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" 
value:\"pinned\"}" Dec 12 17:28:58.550564 containerd[1652]: time="2025-12-12T17:28:58.550534093Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 571.120763ms" Dec 12 17:28:58.553017 containerd[1652]: time="2025-12-12T17:28:58.552986460Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 12 17:28:58.555403 containerd[1652]: time="2025-12-12T17:28:58.555377066Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 12 17:28:58.560739 containerd[1652]: time="2025-12-12T17:28:58.560519079Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 558.459851ms" Dec 12 17:28:58.562768 containerd[1652]: time="2025-12-12T17:28:58.562740045Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 557.525408ms" Dec 12 17:28:58.577018 containerd[1652]: time="2025-12-12T17:28:58.576975122Z" level=info msg="connecting to shim 12cd95a26902ae20ab1c96117d2a19b06ab004273e8838528fdd87f0ef93b51a" address="unix:///run/containerd/s/45892690fb7869ab4b562b1d5777f1b7641cb6a4c8045bccd461250e0211288b" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:28:58.599121 containerd[1652]: 
time="2025-12-12T17:28:58.598406418Z" level=info msg="connecting to shim 103bebf08f432d6627be308c0ef121d0f5220314a42c263e9fccf3682dafe7a7" address="unix:///run/containerd/s/0f84d482106c4e36d1f4118ca3114299293212e0ccbf992f367f961a5e80e8f2" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:28:58.598539 systemd[1]: Started cri-containerd-12cd95a26902ae20ab1c96117d2a19b06ab004273e8838528fdd87f0ef93b51a.scope - libcontainer container 12cd95a26902ae20ab1c96117d2a19b06ab004273e8838528fdd87f0ef93b51a. Dec 12 17:28:58.605225 containerd[1652]: time="2025-12-12T17:28:58.604647474Z" level=info msg="connecting to shim 5d78bc2dc87b504e3456e91acc999ad532d01e462b5f7ef85eac864d6d85c5cc" address="unix:///run/containerd/s/0169a36c4a241f7144c72ebbb56ef9e59ec856dd6444e7820bef07abdf706c52" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:28:58.632642 systemd[1]: Started cri-containerd-103bebf08f432d6627be308c0ef121d0f5220314a42c263e9fccf3682dafe7a7.scope - libcontainer container 103bebf08f432d6627be308c0ef121d0f5220314a42c263e9fccf3682dafe7a7. Dec 12 17:28:58.636408 systemd[1]: Started cri-containerd-5d78bc2dc87b504e3456e91acc999ad532d01e462b5f7ef85eac864d6d85c5cc.scope - libcontainer container 5d78bc2dc87b504e3456e91acc999ad532d01e462b5f7ef85eac864d6d85c5cc. 
Dec 12 17:28:58.640506 containerd[1652]: time="2025-12-12T17:28:58.640464487Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-d-e796afb129,Uid:2d21f3c68ade6c5e71955b06908d1830,Namespace:kube-system,Attempt:0,} returns sandbox id \"12cd95a26902ae20ab1c96117d2a19b06ab004273e8838528fdd87f0ef93b51a\"" Dec 12 17:28:58.648925 containerd[1652]: time="2025-12-12T17:28:58.648884749Z" level=info msg="CreateContainer within sandbox \"12cd95a26902ae20ab1c96117d2a19b06ab004273e8838528fdd87f0ef93b51a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 12 17:28:58.657914 containerd[1652]: time="2025-12-12T17:28:58.657869252Z" level=info msg="Container 0e337617f32f0da5e4b647d6ec9841626d4b82778d57558d21f21beba74fca0d: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:28:58.668500 containerd[1652]: time="2025-12-12T17:28:58.668156799Z" level=info msg="CreateContainer within sandbox \"12cd95a26902ae20ab1c96117d2a19b06ab004273e8838528fdd87f0ef93b51a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"0e337617f32f0da5e4b647d6ec9841626d4b82778d57558d21f21beba74fca0d\"" Dec 12 17:28:58.669194 containerd[1652]: time="2025-12-12T17:28:58.669159041Z" level=info msg="StartContainer for \"0e337617f32f0da5e4b647d6ec9841626d4b82778d57558d21f21beba74fca0d\"" Dec 12 17:28:58.670360 containerd[1652]: time="2025-12-12T17:28:58.670317524Z" level=info msg="connecting to shim 0e337617f32f0da5e4b647d6ec9841626d4b82778d57558d21f21beba74fca0d" address="unix:///run/containerd/s/45892690fb7869ab4b562b1d5777f1b7641cb6a4c8045bccd461250e0211288b" protocol=ttrpc version=3 Dec 12 17:28:58.675253 containerd[1652]: time="2025-12-12T17:28:58.675209177Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-d-e796afb129,Uid:b224700510742149228982abf9add67d,Namespace:kube-system,Attempt:0,} returns sandbox id \"103bebf08f432d6627be308c0ef121d0f5220314a42c263e9fccf3682dafe7a7\"" Dec 12 
17:28:58.678961 containerd[1652]: time="2025-12-12T17:28:58.678899547Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-d-e796afb129,Uid:3478b1edc119b35a7ca6c7fef321b618,Namespace:kube-system,Attempt:0,} returns sandbox id \"5d78bc2dc87b504e3456e91acc999ad532d01e462b5f7ef85eac864d6d85c5cc\"" Dec 12 17:28:58.681437 containerd[1652]: time="2025-12-12T17:28:58.680960232Z" level=info msg="CreateContainer within sandbox \"103bebf08f432d6627be308c0ef121d0f5220314a42c263e9fccf3682dafe7a7\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 12 17:28:58.684451 containerd[1652]: time="2025-12-12T17:28:58.684410841Z" level=info msg="CreateContainer within sandbox \"5d78bc2dc87b504e3456e91acc999ad532d01e462b5f7ef85eac864d6d85c5cc\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 12 17:28:58.688615 systemd[1]: Started cri-containerd-0e337617f32f0da5e4b647d6ec9841626d4b82778d57558d21f21beba74fca0d.scope - libcontainer container 0e337617f32f0da5e4b647d6ec9841626d4b82778d57558d21f21beba74fca0d. 
Dec 12 17:28:58.694396 containerd[1652]: time="2025-12-12T17:28:58.693524385Z" level=info msg="Container a2421665c5754c36b73daf1574b84757f864cb4efa1c01d254a1613418902429: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:28:58.695465 containerd[1652]: time="2025-12-12T17:28:58.695428750Z" level=info msg="Container 3631e6b2bbee2601d8f3e02bf8652523cd17bdfffb189050845d13a704131559: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:28:58.701406 containerd[1652]: time="2025-12-12T17:28:58.701148604Z" level=info msg="CreateContainer within sandbox \"103bebf08f432d6627be308c0ef121d0f5220314a42c263e9fccf3682dafe7a7\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a2421665c5754c36b73daf1574b84757f864cb4efa1c01d254a1613418902429\"" Dec 12 17:28:58.702407 containerd[1652]: time="2025-12-12T17:28:58.701814886Z" level=info msg="StartContainer for \"a2421665c5754c36b73daf1574b84757f864cb4efa1c01d254a1613418902429\"" Dec 12 17:28:58.703103 containerd[1652]: time="2025-12-12T17:28:58.703061969Z" level=info msg="connecting to shim a2421665c5754c36b73daf1574b84757f864cb4efa1c01d254a1613418902429" address="unix:///run/containerd/s/0f84d482106c4e36d1f4118ca3114299293212e0ccbf992f367f961a5e80e8f2" protocol=ttrpc version=3 Dec 12 17:28:58.705858 containerd[1652]: time="2025-12-12T17:28:58.705776256Z" level=info msg="CreateContainer within sandbox \"5d78bc2dc87b504e3456e91acc999ad532d01e462b5f7ef85eac864d6d85c5cc\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3631e6b2bbee2601d8f3e02bf8652523cd17bdfffb189050845d13a704131559\"" Dec 12 17:28:58.706343 containerd[1652]: time="2025-12-12T17:28:58.706292018Z" level=info msg="StartContainer for \"3631e6b2bbee2601d8f3e02bf8652523cd17bdfffb189050845d13a704131559\"" Dec 12 17:28:58.707358 containerd[1652]: time="2025-12-12T17:28:58.707310180Z" level=info msg="connecting to shim 3631e6b2bbee2601d8f3e02bf8652523cd17bdfffb189050845d13a704131559" 
address="unix:///run/containerd/s/0169a36c4a241f7144c72ebbb56ef9e59ec856dd6444e7820bef07abdf706c52" protocol=ttrpc version=3 Dec 12 17:28:58.720605 systemd[1]: Started cri-containerd-a2421665c5754c36b73daf1574b84757f864cb4efa1c01d254a1613418902429.scope - libcontainer container a2421665c5754c36b73daf1574b84757f864cb4efa1c01d254a1613418902429. Dec 12 17:28:58.730802 containerd[1652]: time="2025-12-12T17:28:58.730763041Z" level=info msg="StartContainer for \"0e337617f32f0da5e4b647d6ec9841626d4b82778d57558d21f21beba74fca0d\" returns successfully" Dec 12 17:28:58.734864 systemd[1]: Started cri-containerd-3631e6b2bbee2601d8f3e02bf8652523cd17bdfffb189050845d13a704131559.scope - libcontainer container 3631e6b2bbee2601d8f3e02bf8652523cd17bdfffb189050845d13a704131559. Dec 12 17:28:58.785127 containerd[1652]: time="2025-12-12T17:28:58.785075622Z" level=info msg="StartContainer for \"3631e6b2bbee2601d8f3e02bf8652523cd17bdfffb189050845d13a704131559\" returns successfully" Dec 12 17:28:58.786132 containerd[1652]: time="2025-12-12T17:28:58.786109065Z" level=info msg="StartContainer for \"a2421665c5754c36b73daf1574b84757f864cb4efa1c01d254a1613418902429\" returns successfully" Dec 12 17:28:59.098410 kubelet[2853]: I1212 17:28:59.098359 2853 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-d-e796afb129" Dec 12 17:28:59.560849 kubelet[2853]: E1212 17:28:59.560819 2853 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-d-e796afb129\" not found" node="ci-4459-2-2-d-e796afb129" Dec 12 17:28:59.565354 kubelet[2853]: E1212 17:28:59.565320 2853 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-d-e796afb129\" not found" node="ci-4459-2-2-d-e796afb129" Dec 12 17:28:59.566688 kubelet[2853]: E1212 17:28:59.566665 2853 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"ci-4459-2-2-d-e796afb129\" not found" node="ci-4459-2-2-d-e796afb129" Dec 12 17:29:00.568610 kubelet[2853]: E1212 17:29:00.568580 2853 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-d-e796afb129\" not found" node="ci-4459-2-2-d-e796afb129" Dec 12 17:29:00.568610 kubelet[2853]: E1212 17:29:00.568582 2853 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-d-e796afb129\" not found" node="ci-4459-2-2-d-e796afb129" Dec 12 17:29:00.742513 kubelet[2853]: E1212 17:29:00.742465 2853 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-2-d-e796afb129\" not found" node="ci-4459-2-2-d-e796afb129" Dec 12 17:29:00.932243 kubelet[2853]: I1212 17:29:00.931902 2853 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-d-e796afb129" Dec 12 17:29:00.932243 kubelet[2853]: E1212 17:29:00.931948 2853 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4459-2-2-d-e796afb129\": node \"ci-4459-2-2-d-e796afb129\" not found" Dec 12 17:29:00.944297 kubelet[2853]: E1212 17:29:00.944227 2853 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-d-e796afb129\" not found" Dec 12 17:29:01.045020 kubelet[2853]: E1212 17:29:01.044970 2853 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-d-e796afb129\" not found" Dec 12 17:29:01.145728 kubelet[2853]: E1212 17:29:01.145635 2853 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-d-e796afb129\" not found" Dec 12 17:29:01.226399 kubelet[2853]: I1212 17:29:01.226032 2853 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" Dec 12 17:29:01.231784 kubelet[2853]: E1212 17:29:01.231741 2853 
kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-d-e796afb129\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" Dec 12 17:29:01.231784 kubelet[2853]: I1212 17:29:01.231777 2853 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" Dec 12 17:29:01.233628 kubelet[2853]: E1212 17:29:01.233580 2853 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-d-e796afb129\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" Dec 12 17:29:01.233628 kubelet[2853]: I1212 17:29:01.233610 2853 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-d-e796afb129" Dec 12 17:29:01.235213 kubelet[2853]: E1212 17:29:01.235174 2853 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-d-e796afb129\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-2-d-e796afb129" Dec 12 17:29:01.519380 kubelet[2853]: I1212 17:29:01.519318 2853 apiserver.go:52] "Watching apiserver" Dec 12 17:29:01.525476 kubelet[2853]: I1212 17:29:01.525434 2853 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 12 17:29:03.055516 systemd[1]: Reload requested from client PID 3147 ('systemctl') (unit session-7.scope)... Dec 12 17:29:03.055532 systemd[1]: Reloading... Dec 12 17:29:03.118614 kubelet[2853]: I1212 17:29:03.118572 2853 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.122395 zram_generator::config[3193]: No configuration found. Dec 12 17:29:03.302511 systemd[1]: Reloading finished in 246 ms. 
Dec 12 17:29:03.330977 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:29:03.344015 systemd[1]: kubelet.service: Deactivated successfully. Dec 12 17:29:03.344243 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:29:03.344293 systemd[1]: kubelet.service: Consumed 1.316s CPU time, 121.7M memory peak. Dec 12 17:29:03.346464 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:29:03.503944 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:29:03.514883 (kubelet)[3235]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 17:29:03.549627 kubelet[3235]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 17:29:03.549627 kubelet[3235]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:29:03.550964 kubelet[3235]: I1212 17:29:03.549666 3235 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 17:29:03.558590 kubelet[3235]: I1212 17:29:03.558554 3235 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 12 17:29:03.558590 kubelet[3235]: I1212 17:29:03.558586 3235 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 17:29:03.558727 kubelet[3235]: I1212 17:29:03.558616 3235 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 12 17:29:03.558727 kubelet[3235]: I1212 17:29:03.558623 3235 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Dec 12 17:29:03.558815 kubelet[3235]: I1212 17:29:03.558799 3235 server.go:956] "Client rotation is on, will bootstrap in background" Dec 12 17:29:03.560199 kubelet[3235]: I1212 17:29:03.560160 3235 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Dec 12 17:29:03.562298 kubelet[3235]: I1212 17:29:03.562240 3235 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 17:29:03.566154 kubelet[3235]: I1212 17:29:03.566123 3235 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 17:29:03.570648 kubelet[3235]: I1212 17:29:03.569801 3235 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Dec 12 17:29:03.570648 kubelet[3235]: I1212 17:29:03.570013 3235 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 17:29:03.570648 kubelet[3235]: I1212 17:29:03.570047 3235 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ci-4459-2-2-d-e796afb129","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 17:29:03.570648 kubelet[3235]: I1212 17:29:03.570278 3235 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 17:29:03.570842 kubelet[3235]: I1212 17:29:03.570292 3235 container_manager_linux.go:306] "Creating device plugin manager" Dec 12 17:29:03.570842 kubelet[3235]: I1212 17:29:03.570326 3235 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 12 17:29:03.571452 kubelet[3235]: I1212 17:29:03.571412 3235 
state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:29:03.571601 kubelet[3235]: I1212 17:29:03.571588 3235 kubelet.go:475] "Attempting to sync node with API server" Dec 12 17:29:03.571638 kubelet[3235]: I1212 17:29:03.571607 3235 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 17:29:03.571638 kubelet[3235]: I1212 17:29:03.571632 3235 kubelet.go:387] "Adding apiserver pod source" Dec 12 17:29:03.571676 kubelet[3235]: I1212 17:29:03.571641 3235 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 17:29:03.573690 kubelet[3235]: I1212 17:29:03.573614 3235 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 17:29:03.574273 kubelet[3235]: I1212 17:29:03.574245 3235 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 12 17:29:03.574401 kubelet[3235]: I1212 17:29:03.574280 3235 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 12 17:29:03.576314 kubelet[3235]: I1212 17:29:03.576279 3235 server.go:1262] "Started kubelet" Dec 12 17:29:03.576810 kubelet[3235]: I1212 17:29:03.576748 3235 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 17:29:03.576872 kubelet[3235]: I1212 17:29:03.576815 3235 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 12 17:29:03.576872 kubelet[3235]: I1212 17:29:03.576868 3235 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 17:29:03.577017 kubelet[3235]: I1212 17:29:03.576993 3235 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 17:29:03.577100 kubelet[3235]: I1212 17:29:03.577079 3235 server.go:180] "Starting to listen" 
address="0.0.0.0" port=10250 Dec 12 17:29:03.577293 kubelet[3235]: I1212 17:29:03.577271 3235 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 12 17:29:03.577379 kubelet[3235]: I1212 17:29:03.577358 3235 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 12 17:29:03.577510 kubelet[3235]: I1212 17:29:03.577492 3235 reconciler.go:29] "Reconciler: start to sync state" Dec 12 17:29:03.577782 kubelet[3235]: I1212 17:29:03.577752 3235 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 17:29:03.578428 kubelet[3235]: I1212 17:29:03.578409 3235 server.go:310] "Adding debug handlers to kubelet server" Dec 12 17:29:03.578523 kubelet[3235]: E1212 17:29:03.578491 3235 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-d-e796afb129\" not found" Dec 12 17:29:03.583502 kubelet[3235]: I1212 17:29:03.583406 3235 factory.go:223] Registration of the containerd container factory successfully Dec 12 17:29:03.583502 kubelet[3235]: I1212 17:29:03.583430 3235 factory.go:223] Registration of the systemd container factory successfully Dec 12 17:29:03.583608 kubelet[3235]: I1212 17:29:03.583522 3235 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 17:29:03.598289 kubelet[3235]: E1212 17:29:03.598244 3235 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 17:29:03.617160 kubelet[3235]: I1212 17:29:03.617115 3235 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 12 17:29:03.618459 kubelet[3235]: I1212 17:29:03.618424 3235 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Dec 12 17:29:03.618459 kubelet[3235]: I1212 17:29:03.618453 3235 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 12 17:29:03.618574 kubelet[3235]: I1212 17:29:03.618489 3235 kubelet.go:2427] "Starting kubelet main sync loop" Dec 12 17:29:03.618574 kubelet[3235]: E1212 17:29:03.618538 3235 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 17:29:03.633055 kubelet[3235]: I1212 17:29:03.631924 3235 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 17:29:03.633055 kubelet[3235]: I1212 17:29:03.631941 3235 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 17:29:03.633055 kubelet[3235]: I1212 17:29:03.631961 3235 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:29:03.633055 kubelet[3235]: I1212 17:29:03.632080 3235 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 12 17:29:03.633055 kubelet[3235]: I1212 17:29:03.632090 3235 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 12 17:29:03.633281 kubelet[3235]: I1212 17:29:03.633262 3235 policy_none.go:49] "None policy: Start" Dec 12 17:29:03.633350 kubelet[3235]: I1212 17:29:03.633330 3235 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 12 17:29:03.633443 kubelet[3235]: I1212 17:29:03.633430 3235 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 12 17:29:03.633637 kubelet[3235]: I1212 17:29:03.633612 3235 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Dec 12 17:29:03.633838 kubelet[3235]: I1212 17:29:03.633775 3235 policy_none.go:47] "Start" Dec 12 17:29:03.639342 kubelet[3235]: E1212 17:29:03.639310 3235 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 12 17:29:03.639511 kubelet[3235]: I1212 17:29:03.639495 
3235 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 17:29:03.639549 kubelet[3235]: I1212 17:29:03.639514 3235 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 17:29:03.640226 kubelet[3235]: I1212 17:29:03.640026 3235 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 17:29:03.641930 kubelet[3235]: E1212 17:29:03.641906 3235 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 12 17:29:03.720002 kubelet[3235]: I1212 17:29:03.719959 3235 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.720002 kubelet[3235]: I1212 17:29:03.719982 3235 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.720159 kubelet[3235]: I1212 17:29:03.720102 3235 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.729939 kubelet[3235]: E1212 17:29:03.729907 3235 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-d-e796afb129\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.741969 kubelet[3235]: I1212 17:29:03.741933 3235 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.749497 kubelet[3235]: I1212 17:29:03.749456 3235 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.749604 kubelet[3235]: I1212 17:29:03.749537 3235 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.879092 kubelet[3235]: I1212 17:29:03.878966 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b224700510742149228982abf9add67d-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-d-e796afb129\" (UID: \"b224700510742149228982abf9add67d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.879092 kubelet[3235]: I1212 17:29:03.879008 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b224700510742149228982abf9add67d-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-d-e796afb129\" (UID: \"b224700510742149228982abf9add67d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.879092 kubelet[3235]: I1212 17:29:03.879029 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b224700510742149228982abf9add67d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-d-e796afb129\" (UID: \"b224700510742149228982abf9add67d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.879092 kubelet[3235]: I1212 17:29:03.879046 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3478b1edc119b35a7ca6c7fef321b618-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-d-e796afb129\" (UID: \"3478b1edc119b35a7ca6c7fef321b618\") " pod="kube-system/kube-scheduler-ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.879566 kubelet[3235]: I1212 17:29:03.879065 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2d21f3c68ade6c5e71955b06908d1830-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-d-e796afb129\" (UID: \"2d21f3c68ade6c5e71955b06908d1830\") " pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" 
Dec 12 17:29:03.879566 kubelet[3235]: I1212 17:29:03.879518 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b224700510742149228982abf9add67d-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-d-e796afb129\" (UID: \"b224700510742149228982abf9add67d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.879566 kubelet[3235]: I1212 17:29:03.879559 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b224700510742149228982abf9add67d-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-d-e796afb129\" (UID: \"b224700510742149228982abf9add67d\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.880733 kubelet[3235]: I1212 17:29:03.879601 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2d21f3c68ade6c5e71955b06908d1830-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-d-e796afb129\" (UID: \"2d21f3c68ade6c5e71955b06908d1830\") " pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" Dec 12 17:29:03.880733 kubelet[3235]: I1212 17:29:03.879660 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2d21f3c68ade6c5e71955b06908d1830-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-d-e796afb129\" (UID: \"2d21f3c68ade6c5e71955b06908d1830\") " pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" Dec 12 17:29:04.572030 kubelet[3235]: I1212 17:29:04.571986 3235 apiserver.go:52] "Watching apiserver" Dec 12 17:29:04.578137 kubelet[3235]: I1212 17:29:04.578098 3235 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 12 17:29:04.628890 
kubelet[3235]: I1212 17:29:04.628850 3235 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-d-e796afb129" Dec 12 17:29:04.629129 kubelet[3235]: I1212 17:29:04.628973 3235 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" Dec 12 17:29:04.633262 kubelet[3235]: E1212 17:29:04.633227 3235 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-d-e796afb129\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-d-e796afb129" Dec 12 17:29:04.637580 kubelet[3235]: E1212 17:29:04.637548 3235 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-d-e796afb129\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" Dec 12 17:29:04.658444 kubelet[3235]: I1212 17:29:04.657625 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-d-e796afb129" podStartSLOduration=1.657610357 podStartE2EDuration="1.657610357s" podCreationTimestamp="2025-12-12 17:29:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:29:04.657536117 +0000 UTC m=+1.139775602" watchObservedRunningTime="2025-12-12 17:29:04.657610357 +0000 UTC m=+1.139849842" Dec 12 17:29:04.658444 kubelet[3235]: I1212 17:29:04.657811 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-d-e796afb129" podStartSLOduration=1.657805718 podStartE2EDuration="1.657805718s" podCreationTimestamp="2025-12-12 17:29:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:29:04.648239253 +0000 UTC m=+1.130478738" watchObservedRunningTime="2025-12-12 17:29:04.657805718 +0000 UTC m=+1.140045203" Dec 12 17:29:04.667707 
kubelet[3235]: I1212 17:29:04.667659 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-d-e796afb129" podStartSLOduration=1.667644223 podStartE2EDuration="1.667644223s" podCreationTimestamp="2025-12-12 17:29:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:29:04.667497143 +0000 UTC m=+1.149736628" watchObservedRunningTime="2025-12-12 17:29:04.667644223 +0000 UTC m=+1.149883708" Dec 12 17:29:10.136248 kubelet[3235]: I1212 17:29:10.136203 3235 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 12 17:29:10.136666 containerd[1652]: time="2025-12-12T17:29:10.136603109Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 12 17:29:10.136828 kubelet[3235]: I1212 17:29:10.136809 3235 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 12 17:29:10.885277 systemd[1]: Created slice kubepods-besteffort-pod354bc314_d224_4a13_bd2d_0390ce4f88ac.slice - libcontainer container kubepods-besteffort-pod354bc314_d224_4a13_bd2d_0390ce4f88ac.slice. 
Dec 12 17:29:10.926032 kubelet[3235]: I1212 17:29:10.925959 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/354bc314-d224-4a13-bd2d-0390ce4f88ac-kube-proxy\") pod \"kube-proxy-h2wt7\" (UID: \"354bc314-d224-4a13-bd2d-0390ce4f88ac\") " pod="kube-system/kube-proxy-h2wt7" Dec 12 17:29:10.926032 kubelet[3235]: I1212 17:29:10.926002 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/354bc314-d224-4a13-bd2d-0390ce4f88ac-xtables-lock\") pod \"kube-proxy-h2wt7\" (UID: \"354bc314-d224-4a13-bd2d-0390ce4f88ac\") " pod="kube-system/kube-proxy-h2wt7" Dec 12 17:29:10.926032 kubelet[3235]: I1212 17:29:10.926025 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/354bc314-d224-4a13-bd2d-0390ce4f88ac-lib-modules\") pod \"kube-proxy-h2wt7\" (UID: \"354bc314-d224-4a13-bd2d-0390ce4f88ac\") " pod="kube-system/kube-proxy-h2wt7" Dec 12 17:29:10.926032 kubelet[3235]: I1212 17:29:10.926042 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7rdw\" (UniqueName: \"kubernetes.io/projected/354bc314-d224-4a13-bd2d-0390ce4f88ac-kube-api-access-k7rdw\") pod \"kube-proxy-h2wt7\" (UID: \"354bc314-d224-4a13-bd2d-0390ce4f88ac\") " pod="kube-system/kube-proxy-h2wt7" Dec 12 17:29:11.205608 containerd[1652]: time="2025-12-12T17:29:11.204525163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-h2wt7,Uid:354bc314-d224-4a13-bd2d-0390ce4f88ac,Namespace:kube-system,Attempt:0,}" Dec 12 17:29:11.230830 containerd[1652]: time="2025-12-12T17:29:11.230786952Z" level=info msg="connecting to shim 1eb4c475c63404294fcde93e262753fb786a5d53f9345e8b5afb55faee4086a2" 
address="unix:///run/containerd/s/61c56680ec225475f559280da14ec94865ae9e6460c818e3126779d66e2fba40" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:29:11.268699 systemd[1]: Started cri-containerd-1eb4c475c63404294fcde93e262753fb786a5d53f9345e8b5afb55faee4086a2.scope - libcontainer container 1eb4c475c63404294fcde93e262753fb786a5d53f9345e8b5afb55faee4086a2. Dec 12 17:29:11.304790 containerd[1652]: time="2025-12-12T17:29:11.304751344Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-h2wt7,Uid:354bc314-d224-4a13-bd2d-0390ce4f88ac,Namespace:kube-system,Attempt:0,} returns sandbox id \"1eb4c475c63404294fcde93e262753fb786a5d53f9345e8b5afb55faee4086a2\"" Dec 12 17:29:11.311177 containerd[1652]: time="2025-12-12T17:29:11.311121160Z" level=info msg="CreateContainer within sandbox \"1eb4c475c63404294fcde93e262753fb786a5d53f9345e8b5afb55faee4086a2\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 12 17:29:11.334317 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1970473915.mount: Deactivated successfully. 
Dec 12 17:29:11.337399 containerd[1652]: time="2025-12-12T17:29:11.337227268Z" level=info msg="Container ff336d9321e1aa2f18ba0ec792b8b4da0b75dfa237c24d121f1a431ca72f0a3f: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:29:11.354544 containerd[1652]: time="2025-12-12T17:29:11.354486513Z" level=info msg="CreateContainer within sandbox \"1eb4c475c63404294fcde93e262753fb786a5d53f9345e8b5afb55faee4086a2\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ff336d9321e1aa2f18ba0ec792b8b4da0b75dfa237c24d121f1a431ca72f0a3f\"" Dec 12 17:29:11.355132 containerd[1652]: time="2025-12-12T17:29:11.355093675Z" level=info msg="StartContainer for \"ff336d9321e1aa2f18ba0ec792b8b4da0b75dfa237c24d121f1a431ca72f0a3f\"" Dec 12 17:29:11.358821 containerd[1652]: time="2025-12-12T17:29:11.358786484Z" level=info msg="connecting to shim ff336d9321e1aa2f18ba0ec792b8b4da0b75dfa237c24d121f1a431ca72f0a3f" address="unix:///run/containerd/s/61c56680ec225475f559280da14ec94865ae9e6460c818e3126779d66e2fba40" protocol=ttrpc version=3 Dec 12 17:29:11.362350 systemd[1]: Created slice kubepods-besteffort-pod70eee0d1_ef89_4791_a6c7_2d7f193e0555.slice - libcontainer container kubepods-besteffort-pod70eee0d1_ef89_4791_a6c7_2d7f193e0555.slice. Dec 12 17:29:11.387576 systemd[1]: Started cri-containerd-ff336d9321e1aa2f18ba0ec792b8b4da0b75dfa237c24d121f1a431ca72f0a3f.scope - libcontainer container ff336d9321e1aa2f18ba0ec792b8b4da0b75dfa237c24d121f1a431ca72f0a3f. 
Dec 12 17:29:11.430058 kubelet[3235]: I1212 17:29:11.430012 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6whrv\" (UniqueName: \"kubernetes.io/projected/70eee0d1-ef89-4791-a6c7-2d7f193e0555-kube-api-access-6whrv\") pod \"tigera-operator-65cdcdfd6d-zpbf5\" (UID: \"70eee0d1-ef89-4791-a6c7-2d7f193e0555\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-zpbf5" Dec 12 17:29:11.430363 kubelet[3235]: I1212 17:29:11.430052 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/70eee0d1-ef89-4791-a6c7-2d7f193e0555-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-zpbf5\" (UID: \"70eee0d1-ef89-4791-a6c7-2d7f193e0555\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-zpbf5" Dec 12 17:29:11.467758 containerd[1652]: time="2025-12-12T17:29:11.467611207Z" level=info msg="StartContainer for \"ff336d9321e1aa2f18ba0ec792b8b4da0b75dfa237c24d121f1a431ca72f0a3f\" returns successfully" Dec 12 17:29:11.653720 kubelet[3235]: I1212 17:29:11.653435 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-h2wt7" podStartSLOduration=1.65341677 podStartE2EDuration="1.65341677s" podCreationTimestamp="2025-12-12 17:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:29:11.651878326 +0000 UTC m=+8.134117811" watchObservedRunningTime="2025-12-12 17:29:11.65341677 +0000 UTC m=+8.135656255" Dec 12 17:29:11.668403 containerd[1652]: time="2025-12-12T17:29:11.668201208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-zpbf5,Uid:70eee0d1-ef89-4791-a6c7-2d7f193e0555,Namespace:tigera-operator,Attempt:0,}" Dec 12 17:29:11.686344 containerd[1652]: time="2025-12-12T17:29:11.686185135Z" level=info msg="connecting to shim 
b32ff23a73fccd1dbaf065ea0e6420ed59971fc99f16f4f18b58844487fc14e8" address="unix:///run/containerd/s/1237a6a65f87052829d8eb509a63708872b2794376d7c3d5b49bfc339ec2bc4c" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:29:11.708575 systemd[1]: Started cri-containerd-b32ff23a73fccd1dbaf065ea0e6420ed59971fc99f16f4f18b58844487fc14e8.scope - libcontainer container b32ff23a73fccd1dbaf065ea0e6420ed59971fc99f16f4f18b58844487fc14e8. Dec 12 17:29:11.740176 containerd[1652]: time="2025-12-12T17:29:11.740124115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-zpbf5,Uid:70eee0d1-ef89-4791-a6c7-2d7f193e0555,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"b32ff23a73fccd1dbaf065ea0e6420ed59971fc99f16f4f18b58844487fc14e8\"" Dec 12 17:29:11.741722 containerd[1652]: time="2025-12-12T17:29:11.741696239Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 12 17:29:14.001683 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2118875965.mount: Deactivated successfully. 
Dec 12 17:29:14.391997 containerd[1652]: time="2025-12-12T17:29:14.391875883Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:14.392942 containerd[1652]: time="2025-12-12T17:29:14.392812285Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=22152004" Dec 12 17:29:14.393841 containerd[1652]: time="2025-12-12T17:29:14.393798288Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:14.396241 containerd[1652]: time="2025-12-12T17:29:14.396205574Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:14.397279 containerd[1652]: time="2025-12-12T17:29:14.397228217Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.655501418s" Dec 12 17:29:14.397279 containerd[1652]: time="2025-12-12T17:29:14.397263257Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Dec 12 17:29:14.401532 containerd[1652]: time="2025-12-12T17:29:14.401489468Z" level=info msg="CreateContainer within sandbox \"b32ff23a73fccd1dbaf065ea0e6420ed59971fc99f16f4f18b58844487fc14e8\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 12 17:29:14.412825 containerd[1652]: time="2025-12-12T17:29:14.412773337Z" level=info msg="Container 
65590692c14e385ea9a2d636338f510af9688b7c152c7f3173d71a4d32405a66: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:29:14.421977 containerd[1652]: time="2025-12-12T17:29:14.421919001Z" level=info msg="CreateContainer within sandbox \"b32ff23a73fccd1dbaf065ea0e6420ed59971fc99f16f4f18b58844487fc14e8\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"65590692c14e385ea9a2d636338f510af9688b7c152c7f3173d71a4d32405a66\"" Dec 12 17:29:14.422474 containerd[1652]: time="2025-12-12T17:29:14.422435682Z" level=info msg="StartContainer for \"65590692c14e385ea9a2d636338f510af9688b7c152c7f3173d71a4d32405a66\"" Dec 12 17:29:14.423230 containerd[1652]: time="2025-12-12T17:29:14.423206764Z" level=info msg="connecting to shim 65590692c14e385ea9a2d636338f510af9688b7c152c7f3173d71a4d32405a66" address="unix:///run/containerd/s/1237a6a65f87052829d8eb509a63708872b2794376d7c3d5b49bfc339ec2bc4c" protocol=ttrpc version=3 Dec 12 17:29:14.439591 systemd[1]: Started cri-containerd-65590692c14e385ea9a2d636338f510af9688b7c152c7f3173d71a4d32405a66.scope - libcontainer container 65590692c14e385ea9a2d636338f510af9688b7c152c7f3173d71a4d32405a66. 
Dec 12 17:29:14.465502 containerd[1652]: time="2025-12-12T17:29:14.465459034Z" level=info msg="StartContainer for \"65590692c14e385ea9a2d636338f510af9688b7c152c7f3173d71a4d32405a66\" returns successfully" Dec 12 17:29:14.660799 kubelet[3235]: I1212 17:29:14.660665 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-zpbf5" podStartSLOduration=1.00384448 podStartE2EDuration="3.660650341s" podCreationTimestamp="2025-12-12 17:29:11 +0000 UTC" firstStartedPulling="2025-12-12 17:29:11.741321438 +0000 UTC m=+8.223560883" lastFinishedPulling="2025-12-12 17:29:14.398127259 +0000 UTC m=+10.880366744" observedRunningTime="2025-12-12 17:29:14.660489861 +0000 UTC m=+11.142729346" watchObservedRunningTime="2025-12-12 17:29:14.660650341 +0000 UTC m=+11.142889826" Dec 12 17:29:19.614979 sudo[2296]: pam_unix(sudo:session): session closed for user root Dec 12 17:29:19.783629 sshd[2295]: Connection closed by 147.75.109.163 port 58876 Dec 12 17:29:19.784189 sshd-session[2276]: pam_unix(sshd:session): session closed for user core Dec 12 17:29:19.788063 systemd[1]: sshd@6-10.0.10.18:22-147.75.109.163:58876.service: Deactivated successfully. Dec 12 17:29:19.789912 systemd[1]: session-7.scope: Deactivated successfully. Dec 12 17:29:19.790079 systemd[1]: session-7.scope: Consumed 8.589s CPU time, 224.4M memory peak. Dec 12 17:29:19.791198 systemd-logind[1632]: Session 7 logged out. Waiting for processes to exit. Dec 12 17:29:19.792967 systemd-logind[1632]: Removed session 7. Dec 12 17:29:29.475167 systemd[1]: Created slice kubepods-besteffort-podd85c9639_8358_4aa6_8647_1eba6bb7e97f.slice - libcontainer container kubepods-besteffort-podd85c9639_8358_4aa6_8647_1eba6bb7e97f.slice. 
Dec 12 17:29:29.545779 kubelet[3235]: I1212 17:29:29.545674 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d85c9639-8358-4aa6-8647-1eba6bb7e97f-typha-certs\") pod \"calico-typha-64575d46b8-79btq\" (UID: \"d85c9639-8358-4aa6-8647-1eba6bb7e97f\") " pod="calico-system/calico-typha-64575d46b8-79btq" Dec 12 17:29:29.545779 kubelet[3235]: I1212 17:29:29.545775 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqvwk\" (UniqueName: \"kubernetes.io/projected/d85c9639-8358-4aa6-8647-1eba6bb7e97f-kube-api-access-dqvwk\") pod \"calico-typha-64575d46b8-79btq\" (UID: \"d85c9639-8358-4aa6-8647-1eba6bb7e97f\") " pod="calico-system/calico-typha-64575d46b8-79btq" Dec 12 17:29:29.546449 kubelet[3235]: I1212 17:29:29.545833 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d85c9639-8358-4aa6-8647-1eba6bb7e97f-tigera-ca-bundle\") pod \"calico-typha-64575d46b8-79btq\" (UID: \"d85c9639-8358-4aa6-8647-1eba6bb7e97f\") " pod="calico-system/calico-typha-64575d46b8-79btq" Dec 12 17:29:29.680725 systemd[1]: Created slice kubepods-besteffort-pod49854909_6fc1_4d2e_ae55_c9ec62e76538.slice - libcontainer container kubepods-besteffort-pod49854909_6fc1_4d2e_ae55_c9ec62e76538.slice. 
Dec 12 17:29:29.746682 kubelet[3235]: I1212 17:29:29.746646 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/49854909-6fc1-4d2e-ae55-c9ec62e76538-policysync\") pod \"calico-node-rsntg\" (UID: \"49854909-6fc1-4d2e-ae55-c9ec62e76538\") " pod="calico-system/calico-node-rsntg" Dec 12 17:29:29.746682 kubelet[3235]: I1212 17:29:29.746686 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frsdv\" (UniqueName: \"kubernetes.io/projected/49854909-6fc1-4d2e-ae55-c9ec62e76538-kube-api-access-frsdv\") pod \"calico-node-rsntg\" (UID: \"49854909-6fc1-4d2e-ae55-c9ec62e76538\") " pod="calico-system/calico-node-rsntg" Dec 12 17:29:29.746851 kubelet[3235]: I1212 17:29:29.746706 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/49854909-6fc1-4d2e-ae55-c9ec62e76538-cni-log-dir\") pod \"calico-node-rsntg\" (UID: \"49854909-6fc1-4d2e-ae55-c9ec62e76538\") " pod="calico-system/calico-node-rsntg" Dec 12 17:29:29.746851 kubelet[3235]: I1212 17:29:29.746721 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/49854909-6fc1-4d2e-ae55-c9ec62e76538-flexvol-driver-host\") pod \"calico-node-rsntg\" (UID: \"49854909-6fc1-4d2e-ae55-c9ec62e76538\") " pod="calico-system/calico-node-rsntg" Dec 12 17:29:29.746851 kubelet[3235]: I1212 17:29:29.746735 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49854909-6fc1-4d2e-ae55-c9ec62e76538-lib-modules\") pod \"calico-node-rsntg\" (UID: \"49854909-6fc1-4d2e-ae55-c9ec62e76538\") " pod="calico-system/calico-node-rsntg" Dec 12 17:29:29.746851 kubelet[3235]: I1212 17:29:29.746756 
3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/49854909-6fc1-4d2e-ae55-c9ec62e76538-xtables-lock\") pod \"calico-node-rsntg\" (UID: \"49854909-6fc1-4d2e-ae55-c9ec62e76538\") " pod="calico-system/calico-node-rsntg" Dec 12 17:29:29.746851 kubelet[3235]: I1212 17:29:29.746774 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/49854909-6fc1-4d2e-ae55-c9ec62e76538-var-lib-calico\") pod \"calico-node-rsntg\" (UID: \"49854909-6fc1-4d2e-ae55-c9ec62e76538\") " pod="calico-system/calico-node-rsntg" Dec 12 17:29:29.746961 kubelet[3235]: I1212 17:29:29.746791 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/49854909-6fc1-4d2e-ae55-c9ec62e76538-cni-net-dir\") pod \"calico-node-rsntg\" (UID: \"49854909-6fc1-4d2e-ae55-c9ec62e76538\") " pod="calico-system/calico-node-rsntg" Dec 12 17:29:29.746961 kubelet[3235]: I1212 17:29:29.746819 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/49854909-6fc1-4d2e-ae55-c9ec62e76538-node-certs\") pod \"calico-node-rsntg\" (UID: \"49854909-6fc1-4d2e-ae55-c9ec62e76538\") " pod="calico-system/calico-node-rsntg" Dec 12 17:29:29.746961 kubelet[3235]: I1212 17:29:29.746839 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/49854909-6fc1-4d2e-ae55-c9ec62e76538-var-run-calico\") pod \"calico-node-rsntg\" (UID: \"49854909-6fc1-4d2e-ae55-c9ec62e76538\") " pod="calico-system/calico-node-rsntg" Dec 12 17:29:29.746961 kubelet[3235]: I1212 17:29:29.746853 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/49854909-6fc1-4d2e-ae55-c9ec62e76538-cni-bin-dir\") pod \"calico-node-rsntg\" (UID: \"49854909-6fc1-4d2e-ae55-c9ec62e76538\") " pod="calico-system/calico-node-rsntg" Dec 12 17:29:29.746961 kubelet[3235]: I1212 17:29:29.746868 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49854909-6fc1-4d2e-ae55-c9ec62e76538-tigera-ca-bundle\") pod \"calico-node-rsntg\" (UID: \"49854909-6fc1-4d2e-ae55-c9ec62e76538\") " pod="calico-system/calico-node-rsntg" Dec 12 17:29:29.782125 containerd[1652]: time="2025-12-12T17:29:29.782086021Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-64575d46b8-79btq,Uid:d85c9639-8358-4aa6-8647-1eba6bb7e97f,Namespace:calico-system,Attempt:0,}" Dec 12 17:29:29.799030 containerd[1652]: time="2025-12-12T17:29:29.798903865Z" level=info msg="connecting to shim 9b6de97e3114f002b3c98fcc4f7c59d1682ed445b1e310c68a16135720370e1d" address="unix:///run/containerd/s/2a4af42ff8d44c4173795d40ded08d1ef77bcdffaabe9ecdc05823f999a1ab0e" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:29:29.824629 systemd[1]: Started cri-containerd-9b6de97e3114f002b3c98fcc4f7c59d1682ed445b1e310c68a16135720370e1d.scope - libcontainer container 9b6de97e3114f002b3c98fcc4f7c59d1682ed445b1e310c68a16135720370e1d. 
Dec 12 17:29:29.848759 kubelet[3235]: E1212 17:29:29.848694 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.848759 kubelet[3235]: W1212 17:29:29.848742 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.848759 kubelet[3235]: E1212 17:29:29.848760 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.848998 kubelet[3235]: E1212 17:29:29.848977 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.848998 kubelet[3235]: W1212 17:29:29.848989 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.849061 kubelet[3235]: E1212 17:29:29.848999 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.849257 kubelet[3235]: E1212 17:29:29.849234 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.849257 kubelet[3235]: W1212 17:29:29.849249 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.849257 kubelet[3235]: E1212 17:29:29.849259 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.850379 kubelet[3235]: E1212 17:29:29.849574 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.850379 kubelet[3235]: W1212 17:29:29.849588 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.850379 kubelet[3235]: E1212 17:29:29.849620 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.850379 kubelet[3235]: E1212 17:29:29.849874 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.850379 kubelet[3235]: W1212 17:29:29.849884 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.850379 kubelet[3235]: E1212 17:29:29.849895 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.850379 kubelet[3235]: E1212 17:29:29.850102 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.850379 kubelet[3235]: W1212 17:29:29.850110 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.850379 kubelet[3235]: E1212 17:29:29.850120 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.856828 kubelet[3235]: E1212 17:29:29.856532 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.856828 kubelet[3235]: W1212 17:29:29.856569 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.856828 kubelet[3235]: E1212 17:29:29.856635 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.869047 kubelet[3235]: E1212 17:29:29.869001 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.869047 kubelet[3235]: W1212 17:29:29.869025 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.869047 kubelet[3235]: E1212 17:29:29.869056 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.870269 kubelet[3235]: E1212 17:29:29.870227 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:29:29.878516 containerd[1652]: time="2025-12-12T17:29:29.878479911Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-64575d46b8-79btq,Uid:d85c9639-8358-4aa6-8647-1eba6bb7e97f,Namespace:calico-system,Attempt:0,} returns sandbox id \"9b6de97e3114f002b3c98fcc4f7c59d1682ed445b1e310c68a16135720370e1d\"" Dec 12 17:29:29.880650 containerd[1652]: time="2025-12-12T17:29:29.880602237Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 12 17:29:29.926315 kubelet[3235]: E1212 17:29:29.926284 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.926315 kubelet[3235]: W1212 17:29:29.926305 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not 
found in $PATH, output: "" Dec 12 17:29:29.926315 kubelet[3235]: E1212 17:29:29.926324 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.926580 kubelet[3235]: E1212 17:29:29.926487 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.926580 kubelet[3235]: W1212 17:29:29.926505 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.926580 kubelet[3235]: E1212 17:29:29.926548 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.926692 kubelet[3235]: E1212 17:29:29.926676 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.926692 kubelet[3235]: W1212 17:29:29.926686 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.926799 kubelet[3235]: E1212 17:29:29.926694 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.927090 kubelet[3235]: E1212 17:29:29.927077 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.927090 kubelet[3235]: W1212 17:29:29.927088 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.927168 kubelet[3235]: E1212 17:29:29.927096 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.927468 kubelet[3235]: E1212 17:29:29.927447 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.927468 kubelet[3235]: W1212 17:29:29.927463 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.927562 kubelet[3235]: E1212 17:29:29.927475 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.927644 kubelet[3235]: E1212 17:29:29.927631 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.927673 kubelet[3235]: W1212 17:29:29.927648 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.927673 kubelet[3235]: E1212 17:29:29.927657 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.927788 kubelet[3235]: E1212 17:29:29.927776 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.927788 kubelet[3235]: W1212 17:29:29.927786 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.927848 kubelet[3235]: E1212 17:29:29.927793 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.928003 kubelet[3235]: E1212 17:29:29.927989 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.928003 kubelet[3235]: W1212 17:29:29.928000 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.928061 kubelet[3235]: E1212 17:29:29.928009 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.928160 kubelet[3235]: E1212 17:29:29.928148 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.928160 kubelet[3235]: W1212 17:29:29.928158 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.928203 kubelet[3235]: E1212 17:29:29.928175 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.928293 kubelet[3235]: E1212 17:29:29.928283 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.928316 kubelet[3235]: W1212 17:29:29.928293 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.928316 kubelet[3235]: E1212 17:29:29.928300 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.928431 kubelet[3235]: E1212 17:29:29.928421 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.928462 kubelet[3235]: W1212 17:29:29.928430 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.928462 kubelet[3235]: E1212 17:29:29.928446 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.928562 kubelet[3235]: E1212 17:29:29.928552 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.928585 kubelet[3235]: W1212 17:29:29.928562 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.928585 kubelet[3235]: E1212 17:29:29.928569 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.928734 kubelet[3235]: E1212 17:29:29.928713 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.928734 kubelet[3235]: W1212 17:29:29.928732 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.928785 kubelet[3235]: E1212 17:29:29.928740 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.928907 kubelet[3235]: E1212 17:29:29.928864 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.928907 kubelet[3235]: W1212 17:29:29.928885 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.928907 kubelet[3235]: E1212 17:29:29.928892 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.929074 kubelet[3235]: E1212 17:29:29.929053 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.929074 kubelet[3235]: W1212 17:29:29.929069 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.929121 kubelet[3235]: E1212 17:29:29.929076 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.929208 kubelet[3235]: E1212 17:29:29.929197 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.929230 kubelet[3235]: W1212 17:29:29.929207 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.929230 kubelet[3235]: E1212 17:29:29.929216 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.929392 kubelet[3235]: E1212 17:29:29.929380 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.929432 kubelet[3235]: W1212 17:29:29.929396 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.929432 kubelet[3235]: E1212 17:29:29.929404 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.930417 kubelet[3235]: E1212 17:29:29.930218 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.930417 kubelet[3235]: W1212 17:29:29.930403 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.930417 kubelet[3235]: E1212 17:29:29.930418 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.930796 kubelet[3235]: E1212 17:29:29.930766 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.930796 kubelet[3235]: W1212 17:29:29.930781 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.930796 kubelet[3235]: E1212 17:29:29.930792 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.931511 kubelet[3235]: E1212 17:29:29.931476 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.931511 kubelet[3235]: W1212 17:29:29.931499 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.931511 kubelet[3235]: E1212 17:29:29.931510 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.949406 kubelet[3235]: E1212 17:29:29.948944 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.949406 kubelet[3235]: W1212 17:29:29.948967 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.949406 kubelet[3235]: E1212 17:29:29.948986 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.949406 kubelet[3235]: I1212 17:29:29.949015 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5bf56c15-e8b7-4324-9dd0-89444eba43fb-kubelet-dir\") pod \"csi-node-driver-ksbxj\" (UID: \"5bf56c15-e8b7-4324-9dd0-89444eba43fb\") " pod="calico-system/csi-node-driver-ksbxj" Dec 12 17:29:29.949406 kubelet[3235]: E1212 17:29:29.949228 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.949712 kubelet[3235]: W1212 17:29:29.949425 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.949712 kubelet[3235]: E1212 17:29:29.949441 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.950118 kubelet[3235]: E1212 17:29:29.950089 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.950118 kubelet[3235]: W1212 17:29:29.950105 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.950118 kubelet[3235]: E1212 17:29:29.950117 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.950699 kubelet[3235]: E1212 17:29:29.950670 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.950699 kubelet[3235]: W1212 17:29:29.950685 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.950699 kubelet[3235]: E1212 17:29:29.950697 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.950805 kubelet[3235]: I1212 17:29:29.950720 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5bf56c15-e8b7-4324-9dd0-89444eba43fb-registration-dir\") pod \"csi-node-driver-ksbxj\" (UID: \"5bf56c15-e8b7-4324-9dd0-89444eba43fb\") " pod="calico-system/csi-node-driver-ksbxj" Dec 12 17:29:29.950968 kubelet[3235]: E1212 17:29:29.950926 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.950968 kubelet[3235]: W1212 17:29:29.950940 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.950968 kubelet[3235]: E1212 17:29:29.950957 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.951060 kubelet[3235]: I1212 17:29:29.950979 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfkxq\" (UniqueName: \"kubernetes.io/projected/5bf56c15-e8b7-4324-9dd0-89444eba43fb-kube-api-access-sfkxq\") pod \"csi-node-driver-ksbxj\" (UID: \"5bf56c15-e8b7-4324-9dd0-89444eba43fb\") " pod="calico-system/csi-node-driver-ksbxj" Dec 12 17:29:29.951392 kubelet[3235]: E1212 17:29:29.951358 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.951392 kubelet[3235]: W1212 17:29:29.951387 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.951610 kubelet[3235]: E1212 17:29:29.951401 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.951610 kubelet[3235]: E1212 17:29:29.951576 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.951610 kubelet[3235]: W1212 17:29:29.951584 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.951610 kubelet[3235]: E1212 17:29:29.951592 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.951822 kubelet[3235]: E1212 17:29:29.951748 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.951822 kubelet[3235]: W1212 17:29:29.951756 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.951822 kubelet[3235]: E1212 17:29:29.951764 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.951822 kubelet[3235]: I1212 17:29:29.951782 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/5bf56c15-e8b7-4324-9dd0-89444eba43fb-varrun\") pod \"csi-node-driver-ksbxj\" (UID: \"5bf56c15-e8b7-4324-9dd0-89444eba43fb\") " pod="calico-system/csi-node-driver-ksbxj" Dec 12 17:29:29.952382 kubelet[3235]: E1212 17:29:29.952015 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.952382 kubelet[3235]: W1212 17:29:29.952031 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.952382 kubelet[3235]: E1212 17:29:29.952040 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.952382 kubelet[3235]: I1212 17:29:29.952055 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5bf56c15-e8b7-4324-9dd0-89444eba43fb-socket-dir\") pod \"csi-node-driver-ksbxj\" (UID: \"5bf56c15-e8b7-4324-9dd0-89444eba43fb\") " pod="calico-system/csi-node-driver-ksbxj" Dec 12 17:29:29.952382 kubelet[3235]: E1212 17:29:29.952230 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.952382 kubelet[3235]: W1212 17:29:29.952239 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.952382 kubelet[3235]: E1212 17:29:29.952247 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.952602 kubelet[3235]: E1212 17:29:29.952429 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.952602 kubelet[3235]: W1212 17:29:29.952437 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.952602 kubelet[3235]: E1212 17:29:29.952445 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.952602 kubelet[3235]: E1212 17:29:29.952583 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.952602 kubelet[3235]: W1212 17:29:29.952593 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.952602 kubelet[3235]: E1212 17:29:29.952601 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.954090 kubelet[3235]: E1212 17:29:29.952829 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.954090 kubelet[3235]: W1212 17:29:29.952839 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.954090 kubelet[3235]: E1212 17:29:29.952848 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.954090 kubelet[3235]: E1212 17:29:29.953016 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.954090 kubelet[3235]: W1212 17:29:29.953025 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.954090 kubelet[3235]: E1212 17:29:29.953034 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:29.954090 kubelet[3235]: E1212 17:29:29.953180 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:29.954090 kubelet[3235]: W1212 17:29:29.953188 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:29.954090 kubelet[3235]: E1212 17:29:29.953197 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:29.987137 containerd[1652]: time="2025-12-12T17:29:29.987086273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-rsntg,Uid:49854909-6fc1-4d2e-ae55-c9ec62e76538,Namespace:calico-system,Attempt:0,}" Dec 12 17:29:30.008111 containerd[1652]: time="2025-12-12T17:29:30.007934768Z" level=info msg="connecting to shim f5549e7f7e7f4a92c6ef9bf6b444d6730b23980719b88ff7f459adee442f1381" address="unix:///run/containerd/s/cedd41179a2cfd09185367434a52cb2eed496483a28c88ca533400bbe5bcf22f" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:29:30.027665 systemd[1]: Started cri-containerd-f5549e7f7e7f4a92c6ef9bf6b444d6730b23980719b88ff7f459adee442f1381.scope - libcontainer container f5549e7f7e7f4a92c6ef9bf6b444d6730b23980719b88ff7f459adee442f1381. Dec 12 17:29:30.050742 containerd[1652]: time="2025-12-12T17:29:30.050707679Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-rsntg,Uid:49854909-6fc1-4d2e-ae55-c9ec62e76538,Namespace:calico-system,Attempt:0,} returns sandbox id \"f5549e7f7e7f4a92c6ef9bf6b444d6730b23980719b88ff7f459adee442f1381\"" Dec 12 17:29:30.053627 kubelet[3235]: E1212 17:29:30.053602 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.053627 kubelet[3235]: W1212 17:29:30.053625 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.053755 kubelet[3235]: E1212 17:29:30.053645 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:30.053835 kubelet[3235]: E1212 17:29:30.053819 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.053835 kubelet[3235]: W1212 17:29:30.053832 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.053904 kubelet[3235]: E1212 17:29:30.053840 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:30.054110 kubelet[3235]: E1212 17:29:30.054098 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.054110 kubelet[3235]: W1212 17:29:30.054108 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.054173 kubelet[3235]: E1212 17:29:30.054116 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:30.054429 kubelet[3235]: E1212 17:29:30.054415 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.054461 kubelet[3235]: W1212 17:29:30.054429 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.054461 kubelet[3235]: E1212 17:29:30.054439 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:30.054623 kubelet[3235]: E1212 17:29:30.054613 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.054648 kubelet[3235]: W1212 17:29:30.054623 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.054648 kubelet[3235]: E1212 17:29:30.054631 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:30.054820 kubelet[3235]: E1212 17:29:30.054810 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.054843 kubelet[3235]: W1212 17:29:30.054820 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.054843 kubelet[3235]: E1212 17:29:30.054829 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:30.054995 kubelet[3235]: E1212 17:29:30.054983 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.055021 kubelet[3235]: W1212 17:29:30.054995 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.055021 kubelet[3235]: E1212 17:29:30.055004 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:30.055839 kubelet[3235]: E1212 17:29:30.055783 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.055870 kubelet[3235]: W1212 17:29:30.055840 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.055870 kubelet[3235]: E1212 17:29:30.055853 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:30.056030 kubelet[3235]: E1212 17:29:30.056018 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.056061 kubelet[3235]: W1212 17:29:30.056029 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.056061 kubelet[3235]: E1212 17:29:30.056038 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:30.056174 kubelet[3235]: E1212 17:29:30.056162 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.056174 kubelet[3235]: W1212 17:29:30.056172 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.056225 kubelet[3235]: E1212 17:29:30.056180 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:30.056336 kubelet[3235]: E1212 17:29:30.056325 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.056383 kubelet[3235]: W1212 17:29:30.056336 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.056383 kubelet[3235]: E1212 17:29:30.056360 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:30.056509 kubelet[3235]: E1212 17:29:30.056497 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.056509 kubelet[3235]: W1212 17:29:30.056508 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.056552 kubelet[3235]: E1212 17:29:30.056515 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:30.056637 kubelet[3235]: E1212 17:29:30.056627 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.056660 kubelet[3235]: W1212 17:29:30.056637 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.056660 kubelet[3235]: E1212 17:29:30.056645 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:30.056771 kubelet[3235]: E1212 17:29:30.056762 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.056791 kubelet[3235]: W1212 17:29:30.056770 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.056791 kubelet[3235]: E1212 17:29:30.056778 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:30.057011 kubelet[3235]: E1212 17:29:30.056994 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.057034 kubelet[3235]: W1212 17:29:30.057012 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.057034 kubelet[3235]: E1212 17:29:30.057025 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:30.057175 kubelet[3235]: E1212 17:29:30.057165 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.057195 kubelet[3235]: W1212 17:29:30.057175 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.057195 kubelet[3235]: E1212 17:29:30.057184 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:30.057312 kubelet[3235]: E1212 17:29:30.057302 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.057334 kubelet[3235]: W1212 17:29:30.057311 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.057334 kubelet[3235]: E1212 17:29:30.057319 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:30.057474 kubelet[3235]: E1212 17:29:30.057464 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.057503 kubelet[3235]: W1212 17:29:30.057474 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.057503 kubelet[3235]: E1212 17:29:30.057481 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:30.057734 kubelet[3235]: E1212 17:29:30.057714 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.057757 kubelet[3235]: W1212 17:29:30.057734 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.057757 kubelet[3235]: E1212 17:29:30.057748 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:30.057916 kubelet[3235]: E1212 17:29:30.057904 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.057916 kubelet[3235]: W1212 17:29:30.057913 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.057966 kubelet[3235]: E1212 17:29:30.057921 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:30.058069 kubelet[3235]: E1212 17:29:30.058057 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.058069 kubelet[3235]: W1212 17:29:30.058067 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.058116 kubelet[3235]: E1212 17:29:30.058076 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:30.058268 kubelet[3235]: E1212 17:29:30.058245 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.058268 kubelet[3235]: W1212 17:29:30.058267 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.058315 kubelet[3235]: E1212 17:29:30.058276 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:30.058452 kubelet[3235]: E1212 17:29:30.058440 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.058452 kubelet[3235]: W1212 17:29:30.058451 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.058511 kubelet[3235]: E1212 17:29:30.058458 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:30.058612 kubelet[3235]: E1212 17:29:30.058600 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.058612 kubelet[3235]: W1212 17:29:30.058611 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.058654 kubelet[3235]: E1212 17:29:30.058619 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:30.058787 kubelet[3235]: E1212 17:29:30.058775 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.058787 kubelet[3235]: W1212 17:29:30.058786 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.058840 kubelet[3235]: E1212 17:29:30.058794 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:30.068250 kubelet[3235]: E1212 17:29:30.068202 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:30.068250 kubelet[3235]: W1212 17:29:30.068222 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:30.068250 kubelet[3235]: E1212 17:29:30.068238 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:31.167136 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2669409988.mount: Deactivated successfully. Dec 12 17:29:31.622347 kubelet[3235]: E1212 17:29:31.622252 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:29:31.623240 containerd[1652]: time="2025-12-12T17:29:31.623195003Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:31.624717 containerd[1652]: time="2025-12-12T17:29:31.624658207Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33090687" Dec 12 17:29:31.626005 containerd[1652]: time="2025-12-12T17:29:31.625972531Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:31.628157 containerd[1652]: time="2025-12-12T17:29:31.628116976Z" level=info msg="ImageCreate 
event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:31.629126 containerd[1652]: time="2025-12-12T17:29:31.629099779Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 1.748444182s" Dec 12 17:29:31.629181 containerd[1652]: time="2025-12-12T17:29:31.629131299Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Dec 12 17:29:31.630325 containerd[1652]: time="2025-12-12T17:29:31.630098141Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 12 17:29:31.640739 containerd[1652]: time="2025-12-12T17:29:31.640679929Z" level=info msg="CreateContainer within sandbox \"9b6de97e3114f002b3c98fcc4f7c59d1682ed445b1e310c68a16135720370e1d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 12 17:29:31.652795 containerd[1652]: time="2025-12-12T17:29:31.652705400Z" level=info msg="Container e844bbc0643e59213c2dedb6a709c070ceb93f06d7f5f62f76df6fe46469f1eb: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:29:31.655362 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1821478421.mount: Deactivated successfully. 
Dec 12 17:29:31.663687 containerd[1652]: time="2025-12-12T17:29:31.663644988Z" level=info msg="CreateContainer within sandbox \"9b6de97e3114f002b3c98fcc4f7c59d1682ed445b1e310c68a16135720370e1d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"e844bbc0643e59213c2dedb6a709c070ceb93f06d7f5f62f76df6fe46469f1eb\"" Dec 12 17:29:31.664300 containerd[1652]: time="2025-12-12T17:29:31.664275670Z" level=info msg="StartContainer for \"e844bbc0643e59213c2dedb6a709c070ceb93f06d7f5f62f76df6fe46469f1eb\"" Dec 12 17:29:31.665712 containerd[1652]: time="2025-12-12T17:29:31.665685514Z" level=info msg="connecting to shim e844bbc0643e59213c2dedb6a709c070ceb93f06d7f5f62f76df6fe46469f1eb" address="unix:///run/containerd/s/2a4af42ff8d44c4173795d40ded08d1ef77bcdffaabe9ecdc05823f999a1ab0e" protocol=ttrpc version=3 Dec 12 17:29:31.689548 systemd[1]: Started cri-containerd-e844bbc0643e59213c2dedb6a709c070ceb93f06d7f5f62f76df6fe46469f1eb.scope - libcontainer container e844bbc0643e59213c2dedb6a709c070ceb93f06d7f5f62f76df6fe46469f1eb. Dec 12 17:29:31.723556 containerd[1652]: time="2025-12-12T17:29:31.723449144Z" level=info msg="StartContainer for \"e844bbc0643e59213c2dedb6a709c070ceb93f06d7f5f62f76df6fe46469f1eb\" returns successfully" Dec 12 17:29:32.748469 kubelet[3235]: E1212 17:29:32.748309 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.748469 kubelet[3235]: W1212 17:29:32.748334 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.748469 kubelet[3235]: E1212 17:29:32.748354 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.748893 kubelet[3235]: E1212 17:29:32.748514 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.748893 kubelet[3235]: W1212 17:29:32.748521 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.748893 kubelet[3235]: E1212 17:29:32.748558 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.748893 kubelet[3235]: E1212 17:29:32.748673 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.748893 kubelet[3235]: W1212 17:29:32.748680 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.748893 kubelet[3235]: E1212 17:29:32.748693 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.748893 kubelet[3235]: E1212 17:29:32.748810 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.748893 kubelet[3235]: W1212 17:29:32.748817 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.748893 kubelet[3235]: E1212 17:29:32.748824 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.749067 kubelet[3235]: E1212 17:29:32.748943 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.749067 kubelet[3235]: W1212 17:29:32.748950 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.749067 kubelet[3235]: E1212 17:29:32.748958 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.749067 kubelet[3235]: E1212 17:29:32.749063 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.749134 kubelet[3235]: W1212 17:29:32.749069 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.749134 kubelet[3235]: E1212 17:29:32.749077 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.749197 kubelet[3235]: E1212 17:29:32.749181 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.749197 kubelet[3235]: W1212 17:29:32.749190 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.749197 kubelet[3235]: E1212 17:29:32.749197 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.749337 kubelet[3235]: E1212 17:29:32.749306 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.749337 kubelet[3235]: W1212 17:29:32.749317 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.749337 kubelet[3235]: E1212 17:29:32.749324 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.749468 kubelet[3235]: E1212 17:29:32.749457 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.749468 kubelet[3235]: W1212 17:29:32.749466 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.749514 kubelet[3235]: E1212 17:29:32.749473 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.749607 kubelet[3235]: E1212 17:29:32.749584 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.749607 kubelet[3235]: W1212 17:29:32.749594 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.749607 kubelet[3235]: E1212 17:29:32.749601 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.749713 kubelet[3235]: E1212 17:29:32.749704 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.749713 kubelet[3235]: W1212 17:29:32.749712 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.749753 kubelet[3235]: E1212 17:29:32.749719 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.749841 kubelet[3235]: E1212 17:29:32.749822 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.749841 kubelet[3235]: W1212 17:29:32.749832 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.749841 kubelet[3235]: E1212 17:29:32.749838 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.749961 kubelet[3235]: E1212 17:29:32.749952 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.749961 kubelet[3235]: W1212 17:29:32.749960 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.750003 kubelet[3235]: E1212 17:29:32.749967 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.750080 kubelet[3235]: E1212 17:29:32.750071 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.750100 kubelet[3235]: W1212 17:29:32.750082 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.750100 kubelet[3235]: E1212 17:29:32.750092 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.750213 kubelet[3235]: E1212 17:29:32.750203 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.750213 kubelet[3235]: W1212 17:29:32.750211 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.750253 kubelet[3235]: E1212 17:29:32.750218 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.772849 kubelet[3235]: E1212 17:29:32.772813 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.772849 kubelet[3235]: W1212 17:29:32.772834 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.772849 kubelet[3235]: E1212 17:29:32.772852 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.773134 kubelet[3235]: E1212 17:29:32.773112 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.773134 kubelet[3235]: W1212 17:29:32.773121 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.773134 kubelet[3235]: E1212 17:29:32.773130 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.773347 kubelet[3235]: E1212 17:29:32.773318 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.773347 kubelet[3235]: W1212 17:29:32.773330 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.773347 kubelet[3235]: E1212 17:29:32.773339 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.773619 kubelet[3235]: E1212 17:29:32.773585 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.773619 kubelet[3235]: W1212 17:29:32.773604 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.773619 kubelet[3235]: E1212 17:29:32.773617 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.773827 kubelet[3235]: E1212 17:29:32.773800 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.773827 kubelet[3235]: W1212 17:29:32.773812 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.773827 kubelet[3235]: E1212 17:29:32.773821 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.773981 kubelet[3235]: E1212 17:29:32.773969 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.774007 kubelet[3235]: W1212 17:29:32.773980 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.774007 kubelet[3235]: E1212 17:29:32.773989 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.774177 kubelet[3235]: E1212 17:29:32.774166 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.774200 kubelet[3235]: W1212 17:29:32.774176 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.774200 kubelet[3235]: E1212 17:29:32.774185 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.774434 kubelet[3235]: E1212 17:29:32.774421 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.774461 kubelet[3235]: W1212 17:29:32.774435 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.774461 kubelet[3235]: E1212 17:29:32.774445 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.774602 kubelet[3235]: E1212 17:29:32.774592 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.774624 kubelet[3235]: W1212 17:29:32.774601 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.774624 kubelet[3235]: E1212 17:29:32.774609 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.774743 kubelet[3235]: E1212 17:29:32.774733 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.774771 kubelet[3235]: W1212 17:29:32.774743 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.774771 kubelet[3235]: E1212 17:29:32.774750 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.774877 kubelet[3235]: E1212 17:29:32.774867 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.774899 kubelet[3235]: W1212 17:29:32.774876 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.774899 kubelet[3235]: E1212 17:29:32.774884 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.775046 kubelet[3235]: E1212 17:29:32.775035 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.775066 kubelet[3235]: W1212 17:29:32.775045 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.775066 kubelet[3235]: E1212 17:29:32.775052 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.775214 kubelet[3235]: E1212 17:29:32.775204 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.775234 kubelet[3235]: W1212 17:29:32.775214 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.775234 kubelet[3235]: E1212 17:29:32.775222 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.775444 kubelet[3235]: E1212 17:29:32.775431 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.775469 kubelet[3235]: W1212 17:29:32.775444 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.775469 kubelet[3235]: E1212 17:29:32.775455 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.775583 kubelet[3235]: E1212 17:29:32.775574 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.775605 kubelet[3235]: W1212 17:29:32.775582 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.775605 kubelet[3235]: E1212 17:29:32.775590 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.775737 kubelet[3235]: E1212 17:29:32.775728 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.775761 kubelet[3235]: W1212 17:29:32.775737 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.775761 kubelet[3235]: E1212 17:29:32.775745 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:32.775991 kubelet[3235]: E1212 17:29:32.775960 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.775991 kubelet[3235]: W1212 17:29:32.775974 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.775991 kubelet[3235]: E1212 17:29:32.775986 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:32.776159 kubelet[3235]: E1212 17:29:32.776145 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:32.776159 kubelet[3235]: W1212 17:29:32.776158 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:32.776204 kubelet[3235]: E1212 17:29:32.776169 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:33.619405 kubelet[3235]: E1212 17:29:33.619284 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:29:33.694806 kubelet[3235]: I1212 17:29:33.694652 3235 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 17:29:33.757495 kubelet[3235]: E1212 17:29:33.757360 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:33.757495 kubelet[3235]: W1212 17:29:33.757457 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:33.757495 kubelet[3235]: E1212 17:29:33.757495 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:33.757987 kubelet[3235]: E1212 17:29:33.757731 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:33.757987 kubelet[3235]: W1212 17:29:33.757740 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:33.757987 kubelet[3235]: E1212 17:29:33.757748 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:29:33.757987 kubelet[3235]: E1212 17:29:33.757912 3235 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:29:33.757987 kubelet[3235]: W1212 17:29:33.757920 3235 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:29:33.757987 kubelet[3235]: E1212 17:29:33.757928 3235 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:29:34.060080 containerd[1652]: time="2025-12-12T17:29:34.060036373Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:34.060821 containerd[1652]: time="2025-12-12T17:29:34.060791775Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4266741" Dec 12 17:29:34.061835 containerd[1652]: time="2025-12-12T17:29:34.061788418Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:34.064209 containerd[1652]: time="2025-12-12T17:29:34.064145464Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:34.065233 containerd[1652]: time="2025-12-12T17:29:34.064808266Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 2.434653325s" Dec 12 17:29:34.065233 containerd[1652]: time="2025-12-12T17:29:34.064844266Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Dec 12 17:29:34.072779 containerd[1652]: time="2025-12-12T17:29:34.072733166Z" level=info msg="CreateContainer within sandbox \"f5549e7f7e7f4a92c6ef9bf6b444d6730b23980719b88ff7f459adee442f1381\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 12 17:29:34.083023 containerd[1652]: time="2025-12-12T17:29:34.081570229Z" level=info msg="Container d21f34d5ca2b5f115df0b758b25c089a45a160dee18a3ba0dd169a500b7cf10f: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:29:34.091978 containerd[1652]: time="2025-12-12T17:29:34.091355535Z" level=info msg="CreateContainer within sandbox \"f5549e7f7e7f4a92c6ef9bf6b444d6730b23980719b88ff7f459adee442f1381\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d21f34d5ca2b5f115df0b758b25c089a45a160dee18a3ba0dd169a500b7cf10f\"" Dec 12 17:29:34.093357 containerd[1652]: time="2025-12-12T17:29:34.093325700Z" level=info msg="StartContainer for \"d21f34d5ca2b5f115df0b758b25c089a45a160dee18a3ba0dd169a500b7cf10f\"" Dec 12 17:29:34.095914 containerd[1652]: time="2025-12-12T17:29:34.095884186Z" level=info msg="connecting to shim d21f34d5ca2b5f115df0b758b25c089a45a160dee18a3ba0dd169a500b7cf10f" address="unix:///run/containerd/s/cedd41179a2cfd09185367434a52cb2eed496483a28c88ca533400bbe5bcf22f" protocol=ttrpc version=3 Dec 12 17:29:34.118580 systemd[1]: Started cri-containerd-d21f34d5ca2b5f115df0b758b25c089a45a160dee18a3ba0dd169a500b7cf10f.scope - libcontainer container 
d21f34d5ca2b5f115df0b758b25c089a45a160dee18a3ba0dd169a500b7cf10f. Dec 12 17:29:34.193328 containerd[1652]: time="2025-12-12T17:29:34.193289880Z" level=info msg="StartContainer for \"d21f34d5ca2b5f115df0b758b25c089a45a160dee18a3ba0dd169a500b7cf10f\" returns successfully" Dec 12 17:29:34.207498 systemd[1]: cri-containerd-d21f34d5ca2b5f115df0b758b25c089a45a160dee18a3ba0dd169a500b7cf10f.scope: Deactivated successfully. Dec 12 17:29:34.212060 containerd[1652]: time="2025-12-12T17:29:34.212015128Z" level=info msg="received container exit event container_id:\"d21f34d5ca2b5f115df0b758b25c089a45a160dee18a3ba0dd169a500b7cf10f\" id:\"d21f34d5ca2b5f115df0b758b25c089a45a160dee18a3ba0dd169a500b7cf10f\" pid:3971 exited_at:{seconds:1765560574 nanos:211604327}" Dec 12 17:29:34.231682 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d21f34d5ca2b5f115df0b758b25c089a45a160dee18a3ba0dd169a500b7cf10f-rootfs.mount: Deactivated successfully. Dec 12 17:29:34.713600 kubelet[3235]: I1212 17:29:34.713544 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-64575d46b8-79btq" podStartSLOduration=3.963393725 podStartE2EDuration="5.713530591s" podCreationTimestamp="2025-12-12 17:29:29 +0000 UTC" firstStartedPulling="2025-12-12 17:29:29.879769955 +0000 UTC m=+26.362009440" lastFinishedPulling="2025-12-12 17:29:31.629906821 +0000 UTC m=+28.112146306" observedRunningTime="2025-12-12 17:29:32.70354473 +0000 UTC m=+29.185784215" watchObservedRunningTime="2025-12-12 17:29:34.713530591 +0000 UTC m=+31.195770076" Dec 12 17:29:35.619849 kubelet[3235]: E1212 17:29:35.619691 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:29:35.705105 containerd[1652]: 
time="2025-12-12T17:29:35.705064167Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 12 17:29:37.619871 kubelet[3235]: E1212 17:29:37.619827 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:29:37.991552 containerd[1652]: time="2025-12-12T17:29:37.991457626Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:37.992553 containerd[1652]: time="2025-12-12T17:29:37.992511988Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65925816" Dec 12 17:29:37.994314 containerd[1652]: time="2025-12-12T17:29:37.994276793Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:37.996867 containerd[1652]: time="2025-12-12T17:29:37.996797600Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:37.997767 containerd[1652]: time="2025-12-12T17:29:37.997519441Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.292397314s" Dec 12 17:29:37.997767 containerd[1652]: time="2025-12-12T17:29:37.997568602Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Dec 12 17:29:38.002958 containerd[1652]: time="2025-12-12T17:29:38.002909895Z" level=info msg="CreateContainer within sandbox \"f5549e7f7e7f4a92c6ef9bf6b444d6730b23980719b88ff7f459adee442f1381\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 12 17:29:38.014650 containerd[1652]: time="2025-12-12T17:29:38.014242645Z" level=info msg="Container dc344b948c19c2ef401286d5999b725f10dd7d10c4ec3cdf59fa832d311eb16b: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:29:38.024608 containerd[1652]: time="2025-12-12T17:29:38.024545912Z" level=info msg="CreateContainer within sandbox \"f5549e7f7e7f4a92c6ef9bf6b444d6730b23980719b88ff7f459adee442f1381\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"dc344b948c19c2ef401286d5999b725f10dd7d10c4ec3cdf59fa832d311eb16b\"" Dec 12 17:29:38.025124 containerd[1652]: time="2025-12-12T17:29:38.025088393Z" level=info msg="StartContainer for \"dc344b948c19c2ef401286d5999b725f10dd7d10c4ec3cdf59fa832d311eb16b\"" Dec 12 17:29:38.027379 containerd[1652]: time="2025-12-12T17:29:38.027251759Z" level=info msg="connecting to shim dc344b948c19c2ef401286d5999b725f10dd7d10c4ec3cdf59fa832d311eb16b" address="unix:///run/containerd/s/cedd41179a2cfd09185367434a52cb2eed496483a28c88ca533400bbe5bcf22f" protocol=ttrpc version=3 Dec 12 17:29:38.044567 systemd[1]: Started cri-containerd-dc344b948c19c2ef401286d5999b725f10dd7d10c4ec3cdf59fa832d311eb16b.scope - libcontainer container dc344b948c19c2ef401286d5999b725f10dd7d10c4ec3cdf59fa832d311eb16b. 
Dec 12 17:29:38.136145 containerd[1652]: time="2025-12-12T17:29:38.136079721Z" level=info msg="StartContainer for \"dc344b948c19c2ef401286d5999b725f10dd7d10c4ec3cdf59fa832d311eb16b\" returns successfully" Dec 12 17:29:39.398195 containerd[1652]: time="2025-12-12T17:29:39.398137480Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 12 17:29:39.399986 systemd[1]: cri-containerd-dc344b948c19c2ef401286d5999b725f10dd7d10c4ec3cdf59fa832d311eb16b.scope: Deactivated successfully. Dec 12 17:29:39.400259 systemd[1]: cri-containerd-dc344b948c19c2ef401286d5999b725f10dd7d10c4ec3cdf59fa832d311eb16b.scope: Consumed 455ms CPU time, 188.5M memory peak, 165.9M written to disk. Dec 12 17:29:39.401901 containerd[1652]: time="2025-12-12T17:29:39.401818929Z" level=info msg="received container exit event container_id:\"dc344b948c19c2ef401286d5999b725f10dd7d10c4ec3cdf59fa832d311eb16b\" id:\"dc344b948c19c2ef401286d5999b725f10dd7d10c4ec3cdf59fa832d311eb16b\" pid:4031 exited_at:{seconds:1765560579 nanos:401647729}" Dec 12 17:29:39.420477 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dc344b948c19c2ef401286d5999b725f10dd7d10c4ec3cdf59fa832d311eb16b-rootfs.mount: Deactivated successfully. Dec 12 17:29:39.483512 kubelet[3235]: I1212 17:29:39.483423 3235 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Dec 12 17:29:40.754804 systemd[1]: Created slice kubepods-besteffort-podfc15d043_74d0_4be2_a3f9_2946b5a0ba7c.slice - libcontainer container kubepods-besteffort-podfc15d043_74d0_4be2_a3f9_2946b5a0ba7c.slice. 
Dec 12 17:29:40.831796 kubelet[3235]: I1212 17:29:40.831712 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgnjm\" (UniqueName: \"kubernetes.io/projected/fc15d043-74d0-4be2-a3f9-2946b5a0ba7c-kube-api-access-mgnjm\") pod \"whisker-7d8c7f4f44-hst8t\" (UID: \"fc15d043-74d0-4be2-a3f9-2946b5a0ba7c\") " pod="calico-system/whisker-7d8c7f4f44-hst8t"
Dec 12 17:29:40.831796 kubelet[3235]: I1212 17:29:40.831765 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fc15d043-74d0-4be2-a3f9-2946b5a0ba7c-whisker-backend-key-pair\") pod \"whisker-7d8c7f4f44-hst8t\" (UID: \"fc15d043-74d0-4be2-a3f9-2946b5a0ba7c\") " pod="calico-system/whisker-7d8c7f4f44-hst8t"
Dec 12 17:29:40.832229 kubelet[3235]: I1212 17:29:40.831879 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc15d043-74d0-4be2-a3f9-2946b5a0ba7c-whisker-ca-bundle\") pod \"whisker-7d8c7f4f44-hst8t\" (UID: \"fc15d043-74d0-4be2-a3f9-2946b5a0ba7c\") " pod="calico-system/whisker-7d8c7f4f44-hst8t"
Dec 12 17:29:40.856815 systemd[1]: Created slice kubepods-besteffort-pod5bf56c15_e8b7_4324_9dd0_89444eba43fb.slice - libcontainer container kubepods-besteffort-pod5bf56c15_e8b7_4324_9dd0_89444eba43fb.slice.
Dec 12 17:29:40.863337 containerd[1652]: time="2025-12-12T17:29:40.862851524Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ksbxj,Uid:5bf56c15-e8b7-4324-9dd0-89444eba43fb,Namespace:calico-system,Attempt:0,}"
Dec 12 17:29:40.865177 systemd[1]: Created slice kubepods-burstable-pod2a50682a_2098_46b3_a84d_a1bc6d985fd9.slice - libcontainer container kubepods-burstable-pod2a50682a_2098_46b3_a84d_a1bc6d985fd9.slice.
Dec 12 17:29:40.881085 systemd[1]: Created slice kubepods-burstable-podef22260f_549b_4cab_ad32_7abdf107199f.slice - libcontainer container kubepods-burstable-podef22260f_549b_4cab_ad32_7abdf107199f.slice.
Dec 12 17:29:40.888501 systemd[1]: Created slice kubepods-besteffort-pod419c68ca_a927_4109_af9a_f076c2eb6b23.slice - libcontainer container kubepods-besteffort-pod419c68ca_a927_4109_af9a_f076c2eb6b23.slice.
Dec 12 17:29:40.895653 systemd[1]: Created slice kubepods-besteffort-pod7a593db9_1db8_4942_815e_ae24e8a457d5.slice - libcontainer container kubepods-besteffort-pod7a593db9_1db8_4942_815e_ae24e8a457d5.slice.
Dec 12 17:29:40.900757 systemd[1]: Created slice kubepods-besteffort-podf10953b5_1654_43b7_bb1b_acb41105d201.slice - libcontainer container kubepods-besteffort-podf10953b5_1654_43b7_bb1b_acb41105d201.slice.
Dec 12 17:29:40.908840 systemd[1]: Created slice kubepods-besteffort-podf134713e_1f0f_4a16_9202_6adaa12db341.slice - libcontainer container kubepods-besteffort-podf134713e_1f0f_4a16_9202_6adaa12db341.slice.
Dec 12 17:29:40.912269 systemd[1]: Created slice kubepods-besteffort-podafc05240_ed5c_4c99_8e0c_0cca61ebd35a.slice - libcontainer container kubepods-besteffort-podafc05240_ed5c_4c99_8e0c_0cca61ebd35a.slice.
Dec 12 17:29:40.932217 kubelet[3235]: I1212 17:29:40.932171 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtxw8\" (UniqueName: \"kubernetes.io/projected/afc05240-ed5c-4c99-8e0c-0cca61ebd35a-kube-api-access-mtxw8\") pod \"goldmane-7c778bb748-fvf46\" (UID: \"afc05240-ed5c-4c99-8e0c-0cca61ebd35a\") " pod="calico-system/goldmane-7c778bb748-fvf46"
Dec 12 17:29:40.932217 kubelet[3235]: I1212 17:29:40.932214 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/419c68ca-a927-4109-af9a-f076c2eb6b23-calico-apiserver-certs\") pod \"calico-apiserver-7d565b7c7c-954tf\" (UID: \"419c68ca-a927-4109-af9a-f076c2eb6b23\") " pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf"
Dec 12 17:29:40.932217 kubelet[3235]: I1212 17:29:40.932253 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq9jm\" (UniqueName: \"kubernetes.io/projected/419c68ca-a927-4109-af9a-f076c2eb6b23-kube-api-access-mq9jm\") pod \"calico-apiserver-7d565b7c7c-954tf\" (UID: \"419c68ca-a927-4109-af9a-f076c2eb6b23\") " pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf"
Dec 12 17:29:40.932217 kubelet[3235]: I1212 17:29:40.932269 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f10953b5-1654-43b7-bb1b-acb41105d201-calico-apiserver-certs\") pod \"calico-apiserver-7bccd8547-rp9k9\" (UID: \"f10953b5-1654-43b7-bb1b-acb41105d201\") " pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9"
Dec 12 17:29:40.932217 kubelet[3235]: I1212 17:29:40.932296 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afc05240-ed5c-4c99-8e0c-0cca61ebd35a-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-fvf46\" (UID: \"afc05240-ed5c-4c99-8e0c-0cca61ebd35a\") " pod="calico-system/goldmane-7c778bb748-fvf46"
Dec 12 17:29:40.932578 kubelet[3235]: I1212 17:29:40.932311 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp6bf\" (UniqueName: \"kubernetes.io/projected/ef22260f-549b-4cab-ad32-7abdf107199f-kube-api-access-vp6bf\") pod \"coredns-66bc5c9577-6r5pt\" (UID: \"ef22260f-549b-4cab-ad32-7abdf107199f\") " pod="kube-system/coredns-66bc5c9577-6r5pt"
Dec 12 17:29:40.932578 kubelet[3235]: I1212 17:29:40.932327 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f99n\" (UniqueName: \"kubernetes.io/projected/2a50682a-2098-46b3-a84d-a1bc6d985fd9-kube-api-access-4f99n\") pod \"coredns-66bc5c9577-zmb9q\" (UID: \"2a50682a-2098-46b3-a84d-a1bc6d985fd9\") " pod="kube-system/coredns-66bc5c9577-zmb9q"
Dec 12 17:29:40.932578 kubelet[3235]: I1212 17:29:40.932343 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-549ns\" (UniqueName: \"kubernetes.io/projected/7a593db9-1db8-4942-815e-ae24e8a457d5-kube-api-access-549ns\") pod \"calico-apiserver-7bccd8547-ssn5z\" (UID: \"7a593db9-1db8-4942-815e-ae24e8a457d5\") " pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z"
Dec 12 17:29:40.932578 kubelet[3235]: I1212 17:29:40.932358 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9pjf\" (UniqueName: \"kubernetes.io/projected/f10953b5-1654-43b7-bb1b-acb41105d201-kube-api-access-t9pjf\") pod \"calico-apiserver-7bccd8547-rp9k9\" (UID: \"f10953b5-1654-43b7-bb1b-acb41105d201\") " pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9"
Dec 12 17:29:40.932578 kubelet[3235]: I1212 17:29:40.932389 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/afc05240-ed5c-4c99-8e0c-0cca61ebd35a-goldmane-key-pair\") pod \"goldmane-7c778bb748-fvf46\" (UID: \"afc05240-ed5c-4c99-8e0c-0cca61ebd35a\") " pod="calico-system/goldmane-7c778bb748-fvf46"
Dec 12 17:29:40.934380 kubelet[3235]: I1212 17:29:40.933571 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef22260f-549b-4cab-ad32-7abdf107199f-config-volume\") pod \"coredns-66bc5c9577-6r5pt\" (UID: \"ef22260f-549b-4cab-ad32-7abdf107199f\") " pod="kube-system/coredns-66bc5c9577-6r5pt"
Dec 12 17:29:40.934380 kubelet[3235]: I1212 17:29:40.933788 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a50682a-2098-46b3-a84d-a1bc6d985fd9-config-volume\") pod \"coredns-66bc5c9577-zmb9q\" (UID: \"2a50682a-2098-46b3-a84d-a1bc6d985fd9\") " pod="kube-system/coredns-66bc5c9577-zmb9q"
Dec 12 17:29:40.935573 kubelet[3235]: I1212 17:29:40.934607 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f134713e-1f0f-4a16-9202-6adaa12db341-tigera-ca-bundle\") pod \"calico-kube-controllers-68c844c6f6-fvk5j\" (UID: \"f134713e-1f0f-4a16-9202-6adaa12db341\") " pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j"
Dec 12 17:29:40.935573 kubelet[3235]: I1212 17:29:40.934659 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afc05240-ed5c-4c99-8e0c-0cca61ebd35a-config\") pod \"goldmane-7c778bb748-fvf46\" (UID: \"afc05240-ed5c-4c99-8e0c-0cca61ebd35a\") " pod="calico-system/goldmane-7c778bb748-fvf46"
Dec 12 17:29:40.935573 kubelet[3235]: I1212 17:29:40.934678 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7a593db9-1db8-4942-815e-ae24e8a457d5-calico-apiserver-certs\") pod \"calico-apiserver-7bccd8547-ssn5z\" (UID: \"7a593db9-1db8-4942-815e-ae24e8a457d5\") " pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z"
Dec 12 17:29:40.935573 kubelet[3235]: I1212 17:29:40.934707 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbvmw\" (UniqueName: \"kubernetes.io/projected/f134713e-1f0f-4a16-9202-6adaa12db341-kube-api-access-cbvmw\") pod \"calico-kube-controllers-68c844c6f6-fvk5j\" (UID: \"f134713e-1f0f-4a16-9202-6adaa12db341\") " pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j"
Dec 12 17:29:40.963463 containerd[1652]: time="2025-12-12T17:29:40.963418066Z" level=error msg="Failed to destroy network for sandbox \"b055a3d3329126c7a5718906109365c711e908236099bebc6541efd21328e968\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:40.965191 containerd[1652]: time="2025-12-12T17:29:40.965013910Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ksbxj,Uid:5bf56c15-e8b7-4324-9dd0-89444eba43fb,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b055a3d3329126c7a5718906109365c711e908236099bebc6541efd21328e968\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:40.965195 systemd[1]: run-netns-cni\x2d5e62e3a8\x2d4784\x2d2c02\x2d38c6\x2d29bf2560cc56.mount: Deactivated successfully.
Dec 12 17:29:40.965383 kubelet[3235]: E1212 17:29:40.965220 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b055a3d3329126c7a5718906109365c711e908236099bebc6541efd21328e968\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:40.965383 kubelet[3235]: E1212 17:29:40.965278 3235 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b055a3d3329126c7a5718906109365c711e908236099bebc6541efd21328e968\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ksbxj"
Dec 12 17:29:40.965383 kubelet[3235]: E1212 17:29:40.965296 3235 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b055a3d3329126c7a5718906109365c711e908236099bebc6541efd21328e968\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ksbxj"
Dec 12 17:29:40.965474 kubelet[3235]: E1212 17:29:40.965341 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ksbxj_calico-system(5bf56c15-e8b7-4324-9dd0-89444eba43fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ksbxj_calico-system(5bf56c15-e8b7-4324-9dd0-89444eba43fb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b055a3d3329126c7a5718906109365c711e908236099bebc6541efd21328e968\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb"
Dec 12 17:29:41.063546 containerd[1652]: time="2025-12-12T17:29:41.063447686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7d8c7f4f44-hst8t,Uid:fc15d043-74d0-4be2-a3f9-2946b5a0ba7c,Namespace:calico-system,Attempt:0,}"
Dec 12 17:29:41.105702 containerd[1652]: time="2025-12-12T17:29:41.105641275Z" level=error msg="Failed to destroy network for sandbox \"7801cf8175bfe4b66089615530e52a179eb1e2744a2cffb6b6bb0529acdc96cd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.107494 containerd[1652]: time="2025-12-12T17:29:41.107461680Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7d8c7f4f44-hst8t,Uid:fc15d043-74d0-4be2-a3f9-2946b5a0ba7c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7801cf8175bfe4b66089615530e52a179eb1e2744a2cffb6b6bb0529acdc96cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.107751 kubelet[3235]: E1212 17:29:41.107709 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7801cf8175bfe4b66089615530e52a179eb1e2744a2cffb6b6bb0529acdc96cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.107795 kubelet[3235]: E1212 17:29:41.107775 3235 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7801cf8175bfe4b66089615530e52a179eb1e2744a2cffb6b6bb0529acdc96cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7d8c7f4f44-hst8t"
Dec 12 17:29:41.107836 kubelet[3235]: E1212 17:29:41.107794 3235 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7801cf8175bfe4b66089615530e52a179eb1e2744a2cffb6b6bb0529acdc96cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7d8c7f4f44-hst8t"
Dec 12 17:29:41.107888 kubelet[3235]: E1212 17:29:41.107861 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7d8c7f4f44-hst8t_calico-system(fc15d043-74d0-4be2-a3f9-2946b5a0ba7c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7d8c7f4f44-hst8t_calico-system(fc15d043-74d0-4be2-a3f9-2946b5a0ba7c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7801cf8175bfe4b66089615530e52a179eb1e2744a2cffb6b6bb0529acdc96cd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7d8c7f4f44-hst8t" podUID="fc15d043-74d0-4be2-a3f9-2946b5a0ba7c"
Dec 12 17:29:41.173908 containerd[1652]: time="2025-12-12T17:29:41.173867092Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-zmb9q,Uid:2a50682a-2098-46b3-a84d-a1bc6d985fd9,Namespace:kube-system,Attempt:0,}"
Dec 12 17:29:41.188244 containerd[1652]: time="2025-12-12T17:29:41.188194930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6r5pt,Uid:ef22260f-549b-4cab-ad32-7abdf107199f,Namespace:kube-system,Attempt:0,}"
Dec 12 17:29:41.194387 containerd[1652]: time="2025-12-12T17:29:41.194336466Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d565b7c7c-954tf,Uid:419c68ca-a927-4109-af9a-f076c2eb6b23,Namespace:calico-apiserver,Attempt:0,}"
Dec 12 17:29:41.202330 containerd[1652]: time="2025-12-12T17:29:41.202294406Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bccd8547-ssn5z,Uid:7a593db9-1db8-4942-815e-ae24e8a457d5,Namespace:calico-apiserver,Attempt:0,}"
Dec 12 17:29:41.218358 containerd[1652]: time="2025-12-12T17:29:41.218276328Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bccd8547-rp9k9,Uid:f10953b5-1654-43b7-bb1b-acb41105d201,Namespace:calico-apiserver,Attempt:0,}"
Dec 12 17:29:41.220739 containerd[1652]: time="2025-12-12T17:29:41.220695894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-fvf46,Uid:afc05240-ed5c-4c99-8e0c-0cca61ebd35a,Namespace:calico-system,Attempt:0,}"
Dec 12 17:29:41.222770 containerd[1652]: time="2025-12-12T17:29:41.222712779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68c844c6f6-fvk5j,Uid:f134713e-1f0f-4a16-9202-6adaa12db341,Namespace:calico-system,Attempt:0,}"
Dec 12 17:29:41.237315 containerd[1652]: time="2025-12-12T17:29:41.237263457Z" level=error msg="Failed to destroy network for sandbox \"acf9ce1fdd1b28215e014c38b8e585e8f637f7e5ae66ebb093dd0b6f7f22b128\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.242193 containerd[1652]: time="2025-12-12T17:29:41.242099390Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-zmb9q,Uid:2a50682a-2098-46b3-a84d-a1bc6d985fd9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"acf9ce1fdd1b28215e014c38b8e585e8f637f7e5ae66ebb093dd0b6f7f22b128\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.242413 kubelet[3235]: E1212 17:29:41.242347 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acf9ce1fdd1b28215e014c38b8e585e8f637f7e5ae66ebb093dd0b6f7f22b128\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.242577 kubelet[3235]: E1212 17:29:41.242444 3235 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acf9ce1fdd1b28215e014c38b8e585e8f637f7e5ae66ebb093dd0b6f7f22b128\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-zmb9q"
Dec 12 17:29:41.242577 kubelet[3235]: E1212 17:29:41.242469 3235 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acf9ce1fdd1b28215e014c38b8e585e8f637f7e5ae66ebb093dd0b6f7f22b128\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-zmb9q"
Dec 12 17:29:41.242860 kubelet[3235]: E1212 17:29:41.242821 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-zmb9q_kube-system(2a50682a-2098-46b3-a84d-a1bc6d985fd9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-zmb9q_kube-system(2a50682a-2098-46b3-a84d-a1bc6d985fd9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"acf9ce1fdd1b28215e014c38b8e585e8f637f7e5ae66ebb093dd0b6f7f22b128\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-zmb9q" podUID="2a50682a-2098-46b3-a84d-a1bc6d985fd9"
Dec 12 17:29:41.276153 containerd[1652]: time="2025-12-12T17:29:41.276109198Z" level=error msg="Failed to destroy network for sandbox \"632decb4b824edf8b189e7cebe19ae5c14ca9da25428699b2263260709d661a0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.278362 containerd[1652]: time="2025-12-12T17:29:41.278309404Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6r5pt,Uid:ef22260f-549b-4cab-ad32-7abdf107199f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"632decb4b824edf8b189e7cebe19ae5c14ca9da25428699b2263260709d661a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.278546 containerd[1652]: time="2025-12-12T17:29:41.278413404Z" level=error msg="Failed to destroy network for sandbox \"ac6654d5723095cbf8e8a67de95aaadaa5a82af58fe0e9c4a7ea8ef049b5398d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.278767 kubelet[3235]: E1212 17:29:41.278733 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"632decb4b824edf8b189e7cebe19ae5c14ca9da25428699b2263260709d661a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.278808 kubelet[3235]: E1212 17:29:41.278787 3235 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"632decb4b824edf8b189e7cebe19ae5c14ca9da25428699b2263260709d661a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-6r5pt"
Dec 12 17:29:41.278835 kubelet[3235]: E1212 17:29:41.278811 3235 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"632decb4b824edf8b189e7cebe19ae5c14ca9da25428699b2263260709d661a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-6r5pt"
Dec 12 17:29:41.278879 kubelet[3235]: E1212 17:29:41.278856 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-6r5pt_kube-system(ef22260f-549b-4cab-ad32-7abdf107199f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-6r5pt_kube-system(ef22260f-549b-4cab-ad32-7abdf107199f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"632decb4b824edf8b189e7cebe19ae5c14ca9da25428699b2263260709d661a0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-6r5pt" podUID="ef22260f-549b-4cab-ad32-7abdf107199f"
Dec 12 17:29:41.281244 containerd[1652]: time="2025-12-12T17:29:41.281165771Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d565b7c7c-954tf,Uid:419c68ca-a927-4109-af9a-f076c2eb6b23,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac6654d5723095cbf8e8a67de95aaadaa5a82af58fe0e9c4a7ea8ef049b5398d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.281464 kubelet[3235]: E1212 17:29:41.281431 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac6654d5723095cbf8e8a67de95aaadaa5a82af58fe0e9c4a7ea8ef049b5398d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.281530 kubelet[3235]: E1212 17:29:41.281484 3235 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac6654d5723095cbf8e8a67de95aaadaa5a82af58fe0e9c4a7ea8ef049b5398d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf"
Dec 12 17:29:41.281530 kubelet[3235]: E1212 17:29:41.281505 3235 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac6654d5723095cbf8e8a67de95aaadaa5a82af58fe0e9c4a7ea8ef049b5398d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf"
Dec 12 17:29:41.281597 kubelet[3235]: E1212 17:29:41.281548 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d565b7c7c-954tf_calico-apiserver(419c68ca-a927-4109-af9a-f076c2eb6b23)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d565b7c7c-954tf_calico-apiserver(419c68ca-a927-4109-af9a-f076c2eb6b23)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ac6654d5723095cbf8e8a67de95aaadaa5a82af58fe0e9c4a7ea8ef049b5398d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23"
Dec 12 17:29:41.302656 containerd[1652]: time="2025-12-12T17:29:41.302540147Z" level=error msg="Failed to destroy network for sandbox \"24c20bd4ae2135a6a166d033ca84154740dfd986b1b79caaa27fd318f67aaed7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.303797 containerd[1652]: time="2025-12-12T17:29:41.303756070Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bccd8547-ssn5z,Uid:7a593db9-1db8-4942-815e-ae24e8a457d5,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"24c20bd4ae2135a6a166d033ca84154740dfd986b1b79caaa27fd318f67aaed7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.304182 kubelet[3235]: E1212 17:29:41.304134 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24c20bd4ae2135a6a166d033ca84154740dfd986b1b79caaa27fd318f67aaed7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.304249 kubelet[3235]: E1212 17:29:41.304190 3235 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24c20bd4ae2135a6a166d033ca84154740dfd986b1b79caaa27fd318f67aaed7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z"
Dec 12 17:29:41.304249 kubelet[3235]: E1212 17:29:41.304210 3235 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24c20bd4ae2135a6a166d033ca84154740dfd986b1b79caaa27fd318f67aaed7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z"
Dec 12 17:29:41.304293 kubelet[3235]: E1212 17:29:41.304265 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7bccd8547-ssn5z_calico-apiserver(7a593db9-1db8-4942-815e-ae24e8a457d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7bccd8547-ssn5z_calico-apiserver(7a593db9-1db8-4942-815e-ae24e8a457d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"24c20bd4ae2135a6a166d033ca84154740dfd986b1b79caaa27fd318f67aaed7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5"
Dec 12 17:29:41.314873 containerd[1652]: time="2025-12-12T17:29:41.314334377Z" level=error msg="Failed to destroy network for sandbox \"ca8db7c03de5d19ac417c4e396b900366b9dee1dd0cdb677152269d697dd18c3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.316336 containerd[1652]: time="2025-12-12T17:29:41.316265822Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bccd8547-rp9k9,Uid:f10953b5-1654-43b7-bb1b-acb41105d201,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca8db7c03de5d19ac417c4e396b900366b9dee1dd0cdb677152269d697dd18c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.316574 kubelet[3235]: E1212 17:29:41.316535 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca8db7c03de5d19ac417c4e396b900366b9dee1dd0cdb677152269d697dd18c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.316622 kubelet[3235]: E1212 17:29:41.316595 3235 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca8db7c03de5d19ac417c4e396b900366b9dee1dd0cdb677152269d697dd18c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9"
Dec 12 17:29:41.316622 kubelet[3235]: E1212 17:29:41.316617 3235 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca8db7c03de5d19ac417c4e396b900366b9dee1dd0cdb677152269d697dd18c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9"
Dec 12 17:29:41.316725 kubelet[3235]: E1212 17:29:41.316662 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7bccd8547-rp9k9_calico-apiserver(f10953b5-1654-43b7-bb1b-acb41105d201)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7bccd8547-rp9k9_calico-apiserver(f10953b5-1654-43b7-bb1b-acb41105d201)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ca8db7c03de5d19ac417c4e396b900366b9dee1dd0cdb677152269d697dd18c3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201"
Dec 12 17:29:41.324806 containerd[1652]: time="2025-12-12T17:29:41.324760684Z" level=error msg="Failed to destroy network for sandbox \"c2e6d1b33e569309f54531e5cb2d5c9a2fa5070f4854e0d3648f67ece32f2942\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.326580 containerd[1652]: time="2025-12-12T17:29:41.326548289Z" level=error msg="Failed to destroy network for sandbox \"f88d553ae5965497053774144d9d93c35ac5c0c4d994db9f8eb71589667838cb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.327006 containerd[1652]: time="2025-12-12T17:29:41.326960970Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-fvf46,Uid:afc05240-ed5c-4c99-8e0c-0cca61ebd35a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2e6d1b33e569309f54531e5cb2d5c9a2fa5070f4854e0d3648f67ece32f2942\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.327230 kubelet[3235]: E1212 17:29:41.327174 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2e6d1b33e569309f54531e5cb2d5c9a2fa5070f4854e0d3648f67ece32f2942\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.327278 kubelet[3235]: E1212 17:29:41.327244 3235 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2e6d1b33e569309f54531e5cb2d5c9a2fa5070f4854e0d3648f67ece32f2942\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-fvf46"
Dec 12 17:29:41.327278 kubelet[3235]: E1212 17:29:41.327263 3235 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2e6d1b33e569309f54531e5cb2d5c9a2fa5070f4854e0d3648f67ece32f2942\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-fvf46"
Dec 12 17:29:41.327332 kubelet[3235]: E1212 17:29:41.327310 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-fvf46_calico-system(afc05240-ed5c-4c99-8e0c-0cca61ebd35a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-fvf46_calico-system(afc05240-ed5c-4c99-8e0c-0cca61ebd35a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c2e6d1b33e569309f54531e5cb2d5c9a2fa5070f4854e0d3648f67ece32f2942\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a"
Dec 12 17:29:41.329567 containerd[1652]: time="2025-12-12T17:29:41.329363496Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68c844c6f6-fvk5j,Uid:f134713e-1f0f-4a16-9202-6adaa12db341,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f88d553ae5965497053774144d9d93c35ac5c0c4d994db9f8eb71589667838cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.330050 kubelet[3235]: E1212 17:29:41.330004 3235 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f88d553ae5965497053774144d9d93c35ac5c0c4d994db9f8eb71589667838cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 12 17:29:41.330108 kubelet[3235]: E1212 17:29:41.330066 3235 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f88d553ae5965497053774144d9d93c35ac5c0c4d994db9f8eb71589667838cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j"
Dec 12 17:29:41.330108 kubelet[3235]: E1212 17:29:41.330085 3235 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f88d553ae5965497053774144d9d93c35ac5c0c4d994db9f8eb71589667838cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j"
Dec 12 17:29:41.330307 kubelet[3235]: E1212 17:29:41.330137 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-68c844c6f6-fvk5j_calico-system(f134713e-1f0f-4a16-9202-6adaa12db341)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-68c844c6f6-fvk5j_calico-system(f134713e-1f0f-4a16-9202-6adaa12db341)\\\": rpc error: code = Unknown desc = failed
to setup network for sandbox \\\"f88d553ae5965497053774144d9d93c35ac5c0c4d994db9f8eb71589667838cb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:29:41.721870 containerd[1652]: time="2025-12-12T17:29:41.721724476Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 12 17:29:46.159099 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1858627331.mount: Deactivated successfully. Dec 12 17:29:46.178542 containerd[1652]: time="2025-12-12T17:29:46.178478532Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:46.179768 containerd[1652]: time="2025-12-12T17:29:46.179708816Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150934562" Dec 12 17:29:46.180975 containerd[1652]: time="2025-12-12T17:29:46.180921379Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:46.182995 containerd[1652]: time="2025-12-12T17:29:46.182964104Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:29:46.183526 containerd[1652]: time="2025-12-12T17:29:46.183492545Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest 
\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 4.461711589s" Dec 12 17:29:46.183564 containerd[1652]: time="2025-12-12T17:29:46.183529826Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 12 17:29:46.199665 containerd[1652]: time="2025-12-12T17:29:46.199597027Z" level=info msg="CreateContainer within sandbox \"f5549e7f7e7f4a92c6ef9bf6b444d6730b23980719b88ff7f459adee442f1381\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 12 17:29:46.212405 containerd[1652]: time="2025-12-12T17:29:46.212154540Z" level=info msg="Container 27248a4fe89d64ec358dc29eae295d66510f5a5b09b7b236bf0433d6696607cc: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:29:46.223494 containerd[1652]: time="2025-12-12T17:29:46.223399929Z" level=info msg="CreateContainer within sandbox \"f5549e7f7e7f4a92c6ef9bf6b444d6730b23980719b88ff7f459adee442f1381\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"27248a4fe89d64ec358dc29eae295d66510f5a5b09b7b236bf0433d6696607cc\"" Dec 12 17:29:46.224385 containerd[1652]: time="2025-12-12T17:29:46.224192411Z" level=info msg="StartContainer for \"27248a4fe89d64ec358dc29eae295d66510f5a5b09b7b236bf0433d6696607cc\"" Dec 12 17:29:46.225960 containerd[1652]: time="2025-12-12T17:29:46.225929256Z" level=info msg="connecting to shim 27248a4fe89d64ec358dc29eae295d66510f5a5b09b7b236bf0433d6696607cc" address="unix:///run/containerd/s/cedd41179a2cfd09185367434a52cb2eed496483a28c88ca533400bbe5bcf22f" protocol=ttrpc version=3 Dec 12 17:29:46.262613 systemd[1]: Started cri-containerd-27248a4fe89d64ec358dc29eae295d66510f5a5b09b7b236bf0433d6696607cc.scope - libcontainer container 27248a4fe89d64ec358dc29eae295d66510f5a5b09b7b236bf0433d6696607cc. 
Dec 12 17:29:46.341808 containerd[1652]: time="2025-12-12T17:29:46.341753997Z" level=info msg="StartContainer for \"27248a4fe89d64ec358dc29eae295d66510f5a5b09b7b236bf0433d6696607cc\" returns successfully" Dec 12 17:29:46.473325 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 12 17:29:46.473461 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Dec 12 17:29:46.673572 kubelet[3235]: I1212 17:29:46.673528 3235 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgnjm\" (UniqueName: \"kubernetes.io/projected/fc15d043-74d0-4be2-a3f9-2946b5a0ba7c-kube-api-access-mgnjm\") pod \"fc15d043-74d0-4be2-a3f9-2946b5a0ba7c\" (UID: \"fc15d043-74d0-4be2-a3f9-2946b5a0ba7c\") " Dec 12 17:29:46.673908 kubelet[3235]: I1212 17:29:46.673583 3235 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fc15d043-74d0-4be2-a3f9-2946b5a0ba7c-whisker-backend-key-pair\") pod \"fc15d043-74d0-4be2-a3f9-2946b5a0ba7c\" (UID: \"fc15d043-74d0-4be2-a3f9-2946b5a0ba7c\") " Dec 12 17:29:46.673908 kubelet[3235]: I1212 17:29:46.673616 3235 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc15d043-74d0-4be2-a3f9-2946b5a0ba7c-whisker-ca-bundle\") pod \"fc15d043-74d0-4be2-a3f9-2946b5a0ba7c\" (UID: \"fc15d043-74d0-4be2-a3f9-2946b5a0ba7c\") " Dec 12 17:29:46.674380 kubelet[3235]: I1212 17:29:46.673951 3235 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc15d043-74d0-4be2-a3f9-2946b5a0ba7c-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "fc15d043-74d0-4be2-a3f9-2946b5a0ba7c" (UID: "fc15d043-74d0-4be2-a3f9-2946b5a0ba7c"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 12 17:29:46.677012 kubelet[3235]: I1212 17:29:46.676968 3235 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc15d043-74d0-4be2-a3f9-2946b5a0ba7c-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "fc15d043-74d0-4be2-a3f9-2946b5a0ba7c" (UID: "fc15d043-74d0-4be2-a3f9-2946b5a0ba7c"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 12 17:29:46.678085 kubelet[3235]: I1212 17:29:46.678037 3235 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc15d043-74d0-4be2-a3f9-2946b5a0ba7c-kube-api-access-mgnjm" (OuterVolumeSpecName: "kube-api-access-mgnjm") pod "fc15d043-74d0-4be2-a3f9-2946b5a0ba7c" (UID: "fc15d043-74d0-4be2-a3f9-2946b5a0ba7c"). InnerVolumeSpecName "kube-api-access-mgnjm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 12 17:29:46.741142 systemd[1]: Removed slice kubepods-besteffort-podfc15d043_74d0_4be2_a3f9_2946b5a0ba7c.slice - libcontainer container kubepods-besteffort-podfc15d043_74d0_4be2_a3f9_2946b5a0ba7c.slice. 
Dec 12 17:29:46.753232 kubelet[3235]: I1212 17:29:46.753121 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-rsntg" podStartSLOduration=1.620612159 podStartE2EDuration="17.753105865s" podCreationTimestamp="2025-12-12 17:29:29 +0000 UTC" firstStartedPulling="2025-12-12 17:29:30.051722281 +0000 UTC m=+26.533961766" lastFinishedPulling="2025-12-12 17:29:46.184215987 +0000 UTC m=+42.666455472" observedRunningTime="2025-12-12 17:29:46.752252063 +0000 UTC m=+43.234491548" watchObservedRunningTime="2025-12-12 17:29:46.753105865 +0000 UTC m=+43.235345350" Dec 12 17:29:46.774268 kubelet[3235]: I1212 17:29:46.774220 3235 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc15d043-74d0-4be2-a3f9-2946b5a0ba7c-whisker-ca-bundle\") on node \"ci-4459-2-2-d-e796afb129\" DevicePath \"\"" Dec 12 17:29:46.774268 kubelet[3235]: I1212 17:29:46.774251 3235 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mgnjm\" (UniqueName: \"kubernetes.io/projected/fc15d043-74d0-4be2-a3f9-2946b5a0ba7c-kube-api-access-mgnjm\") on node \"ci-4459-2-2-d-e796afb129\" DevicePath \"\"" Dec 12 17:29:46.774268 kubelet[3235]: I1212 17:29:46.774262 3235 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fc15d043-74d0-4be2-a3f9-2946b5a0ba7c-whisker-backend-key-pair\") on node \"ci-4459-2-2-d-e796afb129\" DevicePath \"\"" Dec 12 17:29:46.815826 systemd[1]: Created slice kubepods-besteffort-pod1aa8e517_9efe_4339_b128_b444ff23b3fb.slice - libcontainer container kubepods-besteffort-pod1aa8e517_9efe_4339_b128_b444ff23b3fb.slice. 
Dec 12 17:29:46.874677 kubelet[3235]: I1212 17:29:46.874579 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1aa8e517-9efe-4339-b128-b444ff23b3fb-whisker-ca-bundle\") pod \"whisker-6646878486-nf54l\" (UID: \"1aa8e517-9efe-4339-b128-b444ff23b3fb\") " pod="calico-system/whisker-6646878486-nf54l" Dec 12 17:29:46.874677 kubelet[3235]: I1212 17:29:46.874625 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1aa8e517-9efe-4339-b128-b444ff23b3fb-whisker-backend-key-pair\") pod \"whisker-6646878486-nf54l\" (UID: \"1aa8e517-9efe-4339-b128-b444ff23b3fb\") " pod="calico-system/whisker-6646878486-nf54l" Dec 12 17:29:46.874677 kubelet[3235]: I1212 17:29:46.874643 3235 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nvqd\" (UniqueName: \"kubernetes.io/projected/1aa8e517-9efe-4339-b128-b444ff23b3fb-kube-api-access-8nvqd\") pod \"whisker-6646878486-nf54l\" (UID: \"1aa8e517-9efe-4339-b128-b444ff23b3fb\") " pod="calico-system/whisker-6646878486-nf54l" Dec 12 17:29:47.121514 containerd[1652]: time="2025-12-12T17:29:47.121452662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6646878486-nf54l,Uid:1aa8e517-9efe-4339-b128-b444ff23b3fb,Namespace:calico-system,Attempt:0,}" Dec 12 17:29:47.163937 systemd[1]: var-lib-kubelet-pods-fc15d043\x2d74d0\x2d4be2\x2da3f9\x2d2946b5a0ba7c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dmgnjm.mount: Deactivated successfully. Dec 12 17:29:47.164032 systemd[1]: var-lib-kubelet-pods-fc15d043\x2d74d0\x2d4be2\x2da3f9\x2d2946b5a0ba7c-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Dec 12 17:29:47.258284 systemd-networkd[1513]: cali22615902be6: Link UP Dec 12 17:29:47.258479 systemd-networkd[1513]: cali22615902be6: Gained carrier Dec 12 17:29:47.275797 containerd[1652]: 2025-12-12 17:29:47.141 [INFO][4496] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 17:29:47.275797 containerd[1652]: 2025-12-12 17:29:47.166 [INFO][4496] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-eth0 whisker-6646878486- calico-system 1aa8e517-9efe-4339-b128-b444ff23b3fb 934 0 2025-12-12 17:29:46 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6646878486 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-d-e796afb129 whisker-6646878486-nf54l eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali22615902be6 [] [] }} ContainerID="f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" Namespace="calico-system" Pod="whisker-6646878486-nf54l" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-" Dec 12 17:29:47.275797 containerd[1652]: 2025-12-12 17:29:47.167 [INFO][4496] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" Namespace="calico-system" Pod="whisker-6646878486-nf54l" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-eth0" Dec 12 17:29:47.275797 containerd[1652]: 2025-12-12 17:29:47.212 [INFO][4510] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" HandleID="k8s-pod-network.f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" Workload="ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-eth0" Dec 12 
17:29:47.276220 containerd[1652]: 2025-12-12 17:29:47.212 [INFO][4510] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" HandleID="k8s-pod-network.f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" Workload="ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004daa0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-d-e796afb129", "pod":"whisker-6646878486-nf54l", "timestamp":"2025-12-12 17:29:47.212525459 +0000 UTC"}, Hostname:"ci-4459-2-2-d-e796afb129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:29:47.276220 containerd[1652]: 2025-12-12 17:29:47.213 [INFO][4510] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:29:47.276220 containerd[1652]: 2025-12-12 17:29:47.213 [INFO][4510] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:29:47.276220 containerd[1652]: 2025-12-12 17:29:47.213 [INFO][4510] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-d-e796afb129' Dec 12 17:29:47.276220 containerd[1652]: 2025-12-12 17:29:47.223 [INFO][4510] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:47.276220 containerd[1652]: 2025-12-12 17:29:47.229 [INFO][4510] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:47.276220 containerd[1652]: 2025-12-12 17:29:47.234 [INFO][4510] ipam/ipam.go 511: Trying affinity for 192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:47.276220 containerd[1652]: 2025-12-12 17:29:47.236 [INFO][4510] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:47.276220 containerd[1652]: 2025-12-12 17:29:47.238 [INFO][4510] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:47.276761 containerd[1652]: 2025-12-12 17:29:47.238 [INFO][4510] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.8.0/26 handle="k8s-pod-network.f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:47.276761 containerd[1652]: 2025-12-12 17:29:47.240 [INFO][4510] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0 Dec 12 17:29:47.276761 containerd[1652]: 2025-12-12 17:29:47.244 [INFO][4510] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.8.0/26 handle="k8s-pod-network.f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:47.276761 containerd[1652]: 2025-12-12 17:29:47.249 [INFO][4510] ipam/ipam.go 1262: Successfully claimed 
IPs: [192.168.8.1/26] block=192.168.8.0/26 handle="k8s-pod-network.f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:47.276761 containerd[1652]: 2025-12-12 17:29:47.249 [INFO][4510] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.1/26] handle="k8s-pod-network.f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:47.276761 containerd[1652]: 2025-12-12 17:29:47.249 [INFO][4510] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:29:47.276761 containerd[1652]: 2025-12-12 17:29:47.249 [INFO][4510] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.8.1/26] IPv6=[] ContainerID="f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" HandleID="k8s-pod-network.f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" Workload="ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-eth0" Dec 12 17:29:47.277013 containerd[1652]: 2025-12-12 17:29:47.252 [INFO][4496] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" Namespace="calico-system" Pod="whisker-6646878486-nf54l" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-eth0", GenerateName:"whisker-6646878486-", Namespace:"calico-system", SelfLink:"", UID:"1aa8e517-9efe-4339-b128-b444ff23b3fb", ResourceVersion:"934", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6646878486", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"", Pod:"whisker-6646878486-nf54l", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.8.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali22615902be6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:47.277013 containerd[1652]: 2025-12-12 17:29:47.252 [INFO][4496] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.1/32] ContainerID="f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" Namespace="calico-system" Pod="whisker-6646878486-nf54l" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-eth0" Dec 12 17:29:47.277091 containerd[1652]: 2025-12-12 17:29:47.252 [INFO][4496] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali22615902be6 ContainerID="f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" Namespace="calico-system" Pod="whisker-6646878486-nf54l" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-eth0" Dec 12 17:29:47.277091 containerd[1652]: 2025-12-12 17:29:47.259 [INFO][4496] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" Namespace="calico-system" Pod="whisker-6646878486-nf54l" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-eth0" Dec 12 17:29:47.277131 containerd[1652]: 2025-12-12 17:29:47.259 [INFO][4496] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" Namespace="calico-system" Pod="whisker-6646878486-nf54l" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-eth0", GenerateName:"whisker-6646878486-", Namespace:"calico-system", SelfLink:"", UID:"1aa8e517-9efe-4339-b128-b444ff23b3fb", ResourceVersion:"934", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6646878486", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0", Pod:"whisker-6646878486-nf54l", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.8.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali22615902be6", MAC:"66:c0:35:36:63:eb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:47.277179 containerd[1652]: 2025-12-12 17:29:47.272 [INFO][4496] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" Namespace="calico-system" 
Pod="whisker-6646878486-nf54l" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-whisker--6646878486--nf54l-eth0" Dec 12 17:29:47.294399 containerd[1652]: time="2025-12-12T17:29:47.294330751Z" level=info msg="connecting to shim f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0" address="unix:///run/containerd/s/ce4a79cc902719e8f771f6f7a16c46ca0cf5560336d8631350f3490c589eab5a" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:29:47.313599 systemd[1]: Started cri-containerd-f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0.scope - libcontainer container f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0. Dec 12 17:29:47.344337 containerd[1652]: time="2025-12-12T17:29:47.344284361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6646878486-nf54l,Uid:1aa8e517-9efe-4339-b128-b444ff23b3fb,Namespace:calico-system,Attempt:0,} returns sandbox id \"f59d398eabed056f2bcc69c9b725e09785e96a0a97fc1081be921a8d222e34a0\"" Dec 12 17:29:47.346105 containerd[1652]: time="2025-12-12T17:29:47.346073285Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:29:47.622014 kubelet[3235]: I1212 17:29:47.621961 3235 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc15d043-74d0-4be2-a3f9-2946b5a0ba7c" path="/var/lib/kubelet/pods/fc15d043-74d0-4be2-a3f9-2946b5a0ba7c/volumes" Dec 12 17:29:47.676975 containerd[1652]: time="2025-12-12T17:29:47.676907145Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:47.678191 containerd[1652]: time="2025-12-12T17:29:47.678098628Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:29:47.678191 containerd[1652]: 
time="2025-12-12T17:29:47.678156428Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:29:47.678490 kubelet[3235]: E1212 17:29:47.678448 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:29:47.678758 kubelet[3235]: E1212 17:29:47.678501 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:29:47.678758 kubelet[3235]: E1212 17:29:47.678583 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-6646878486-nf54l_calico-system(1aa8e517-9efe-4339-b128-b444ff23b3fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:47.679708 containerd[1652]: time="2025-12-12T17:29:47.679513792Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:29:48.015731 containerd[1652]: time="2025-12-12T17:29:48.015656625Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:48.017049 containerd[1652]: time="2025-12-12T17:29:48.016976468Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:29:48.017049 containerd[1652]: time="2025-12-12T17:29:48.017020668Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:29:48.017267 kubelet[3235]: E1212 17:29:48.017219 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:29:48.017309 kubelet[3235]: E1212 17:29:48.017277 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:29:48.017428 kubelet[3235]: E1212 17:29:48.017405 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-6646878486-nf54l_calico-system(1aa8e517-9efe-4339-b128-b444ff23b3fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:48.017494 kubelet[3235]: E1212 17:29:48.017453 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound 
desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb" Dec 12 17:29:48.366622 systemd-networkd[1513]: cali22615902be6: Gained IPv6LL Dec 12 17:29:48.740813 kubelet[3235]: E1212 17:29:48.740767 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb" Dec 12 17:29:49.305058 kubelet[3235]: I1212 17:29:49.304994 3235 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 17:29:50.269030 systemd-networkd[1513]: vxlan.calico: Link UP Dec 12 
17:29:50.269045 systemd-networkd[1513]: vxlan.calico: Gained carrier Dec 12 17:29:51.623240 containerd[1652]: time="2025-12-12T17:29:51.623129556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-fvf46,Uid:afc05240-ed5c-4c99-8e0c-0cca61ebd35a,Namespace:calico-system,Attempt:0,}" Dec 12 17:29:51.733559 systemd-networkd[1513]: cali74652a43a2f: Link UP Dec 12 17:29:51.734872 systemd-networkd[1513]: cali74652a43a2f: Gained carrier Dec 12 17:29:51.755446 containerd[1652]: 2025-12-12 17:29:51.662 [INFO][4865] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-eth0 goldmane-7c778bb748- calico-system afc05240-ed5c-4c99-8e0c-0cca61ebd35a 871 0 2025-12-12 17:29:27 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-d-e796afb129 goldmane-7c778bb748-fvf46 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali74652a43a2f [] [] }} ContainerID="a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" Namespace="calico-system" Pod="goldmane-7c778bb748-fvf46" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-" Dec 12 17:29:51.755446 containerd[1652]: 2025-12-12 17:29:51.662 [INFO][4865] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" Namespace="calico-system" Pod="goldmane-7c778bb748-fvf46" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-eth0" Dec 12 17:29:51.755446 containerd[1652]: 2025-12-12 17:29:51.685 [INFO][4879] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" 
HandleID="k8s-pod-network.a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" Workload="ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-eth0" Dec 12 17:29:51.755654 containerd[1652]: 2025-12-12 17:29:51.685 [INFO][4879] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" HandleID="k8s-pod-network.a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" Workload="ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000592a60), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-d-e796afb129", "pod":"goldmane-7c778bb748-fvf46", "timestamp":"2025-12-12 17:29:51.685263117 +0000 UTC"}, Hostname:"ci-4459-2-2-d-e796afb129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:29:51.755654 containerd[1652]: 2025-12-12 17:29:51.685 [INFO][4879] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:29:51.755654 containerd[1652]: 2025-12-12 17:29:51.685 [INFO][4879] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:29:51.755654 containerd[1652]: 2025-12-12 17:29:51.685 [INFO][4879] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-d-e796afb129' Dec 12 17:29:51.755654 containerd[1652]: 2025-12-12 17:29:51.695 [INFO][4879] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:51.755654 containerd[1652]: 2025-12-12 17:29:51.700 [INFO][4879] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:51.755654 containerd[1652]: 2025-12-12 17:29:51.704 [INFO][4879] ipam/ipam.go 511: Trying affinity for 192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:51.755654 containerd[1652]: 2025-12-12 17:29:51.705 [INFO][4879] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:51.755654 containerd[1652]: 2025-12-12 17:29:51.708 [INFO][4879] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:51.755864 containerd[1652]: 2025-12-12 17:29:51.708 [INFO][4879] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.8.0/26 handle="k8s-pod-network.a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:51.755864 containerd[1652]: 2025-12-12 17:29:51.709 [INFO][4879] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2 Dec 12 17:29:51.755864 containerd[1652]: 2025-12-12 17:29:51.714 [INFO][4879] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.8.0/26 handle="k8s-pod-network.a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:51.755864 containerd[1652]: 2025-12-12 17:29:51.722 [INFO][4879] ipam/ipam.go 1262: Successfully claimed 
IPs: [192.168.8.2/26] block=192.168.8.0/26 handle="k8s-pod-network.a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:51.755864 containerd[1652]: 2025-12-12 17:29:51.723 [INFO][4879] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.2/26] handle="k8s-pod-network.a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:51.755864 containerd[1652]: 2025-12-12 17:29:51.724 [INFO][4879] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:29:51.755864 containerd[1652]: 2025-12-12 17:29:51.724 [INFO][4879] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.8.2/26] IPv6=[] ContainerID="a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" HandleID="k8s-pod-network.a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" Workload="ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-eth0" Dec 12 17:29:51.755994 containerd[1652]: 2025-12-12 17:29:51.729 [INFO][4865] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" Namespace="calico-system" Pod="goldmane-7c778bb748-fvf46" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"afc05240-ed5c-4c99-8e0c-0cca61ebd35a", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"", Pod:"goldmane-7c778bb748-fvf46", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.8.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali74652a43a2f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:51.756044 containerd[1652]: 2025-12-12 17:29:51.729 [INFO][4865] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.2/32] ContainerID="a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" Namespace="calico-system" Pod="goldmane-7c778bb748-fvf46" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-eth0" Dec 12 17:29:51.756044 containerd[1652]: 2025-12-12 17:29:51.729 [INFO][4865] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali74652a43a2f ContainerID="a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" Namespace="calico-system" Pod="goldmane-7c778bb748-fvf46" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-eth0" Dec 12 17:29:51.756044 containerd[1652]: 2025-12-12 17:29:51.734 [INFO][4865] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" Namespace="calico-system" Pod="goldmane-7c778bb748-fvf46" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-eth0" Dec 12 17:29:51.756104 containerd[1652]: 2025-12-12 17:29:51.735 [INFO][4865] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" Namespace="calico-system" Pod="goldmane-7c778bb748-fvf46" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"afc05240-ed5c-4c99-8e0c-0cca61ebd35a", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2", Pod:"goldmane-7c778bb748-fvf46", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.8.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali74652a43a2f", MAC:"d6:89:4d:78:69:41", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:51.756149 containerd[1652]: 2025-12-12 17:29:51.749 [INFO][4865] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" Namespace="calico-system" Pod="goldmane-7c778bb748-fvf46" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-goldmane--7c778bb748--fvf46-eth0" Dec 12 17:29:51.791607 containerd[1652]: time="2025-12-12T17:29:51.791554193Z" level=info msg="connecting to shim a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2" address="unix:///run/containerd/s/8f900e75dcbf7e5a10631ab496205edf786a5c40c084a5a2c012a10774b11aab" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:29:51.816670 systemd[1]: Started cri-containerd-a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2.scope - libcontainer container a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2. Dec 12 17:29:51.860130 containerd[1652]: time="2025-12-12T17:29:51.860090371Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-fvf46,Uid:afc05240-ed5c-4c99-8e0c-0cca61ebd35a,Namespace:calico-system,Attempt:0,} returns sandbox id \"a3fa7ea5dab00f056ef47e84e7958069b77e61dbdb230021d9c1a6a8c5c72ee2\"" Dec 12 17:29:51.862260 containerd[1652]: time="2025-12-12T17:29:51.862176577Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:29:52.197819 containerd[1652]: time="2025-12-12T17:29:52.197734528Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:52.199183 containerd[1652]: time="2025-12-12T17:29:52.199077252Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:29:52.199183 containerd[1652]: time="2025-12-12T17:29:52.199098692Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 
17:29:52.199437 kubelet[3235]: E1212 17:29:52.199394 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:29:52.199788 kubelet[3235]: E1212 17:29:52.199443 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:29:52.199788 kubelet[3235]: E1212 17:29:52.199520 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-fvf46_calico-system(afc05240-ed5c-4c99-8e0c-0cca61ebd35a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:52.199788 kubelet[3235]: E1212 17:29:52.199549 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a" Dec 12 17:29:52.206653 systemd-networkd[1513]: vxlan.calico: Gained IPv6LL Dec 12 17:29:52.622597 containerd[1652]: time="2025-12-12T17:29:52.622399831Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-zmb9q,Uid:2a50682a-2098-46b3-a84d-a1bc6d985fd9,Namespace:kube-system,Attempt:0,}" Dec 12 17:29:52.626405 containerd[1652]: time="2025-12-12T17:29:52.624467517Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d565b7c7c-954tf,Uid:419c68ca-a927-4109-af9a-f076c2eb6b23,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:29:52.751856 kubelet[3235]: E1212 17:29:52.751798 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a" Dec 12 17:29:52.762784 systemd-networkd[1513]: califc72c1de89a: Link UP Dec 12 17:29:52.762923 systemd-networkd[1513]: califc72c1de89a: Gained carrier Dec 12 17:29:52.779409 containerd[1652]: 2025-12-12 17:29:52.680 [INFO][4945] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-eth0 coredns-66bc5c9577- kube-system 2a50682a-2098-46b3-a84d-a1bc6d985fd9 865 0 2025-12-12 17:29:11 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-d-e796afb129 coredns-66bc5c9577-zmb9q eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] califc72c1de89a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} 
ContainerID="2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" Namespace="kube-system" Pod="coredns-66bc5c9577-zmb9q" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-" Dec 12 17:29:52.779409 containerd[1652]: 2025-12-12 17:29:52.681 [INFO][4945] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" Namespace="kube-system" Pod="coredns-66bc5c9577-zmb9q" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-eth0" Dec 12 17:29:52.779409 containerd[1652]: 2025-12-12 17:29:52.713 [INFO][4974] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" HandleID="k8s-pod-network.2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" Workload="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-eth0" Dec 12 17:29:52.779648 containerd[1652]: 2025-12-12 17:29:52.713 [INFO][4974] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" HandleID="k8s-pod-network.2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" Workload="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3390), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-d-e796afb129", "pod":"coredns-66bc5c9577-zmb9q", "timestamp":"2025-12-12 17:29:52.713585588 +0000 UTC"}, Hostname:"ci-4459-2-2-d-e796afb129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:29:52.779648 containerd[1652]: 2025-12-12 17:29:52.713 [INFO][4974] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Dec 12 17:29:52.779648 containerd[1652]: 2025-12-12 17:29:52.713 [INFO][4974] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:29:52.779648 containerd[1652]: 2025-12-12 17:29:52.713 [INFO][4974] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-d-e796afb129' Dec 12 17:29:52.779648 containerd[1652]: 2025-12-12 17:29:52.726 [INFO][4974] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.779648 containerd[1652]: 2025-12-12 17:29:52.731 [INFO][4974] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.779648 containerd[1652]: 2025-12-12 17:29:52.736 [INFO][4974] ipam/ipam.go 511: Trying affinity for 192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.779648 containerd[1652]: 2025-12-12 17:29:52.739 [INFO][4974] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.779648 containerd[1652]: 2025-12-12 17:29:52.741 [INFO][4974] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.779948 containerd[1652]: 2025-12-12 17:29:52.741 [INFO][4974] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.8.0/26 handle="k8s-pod-network.2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.779948 containerd[1652]: 2025-12-12 17:29:52.743 [INFO][4974] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba Dec 12 17:29:52.779948 containerd[1652]: 2025-12-12 17:29:52.747 [INFO][4974] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.8.0/26 handle="k8s-pod-network.2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" 
host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.779948 containerd[1652]: 2025-12-12 17:29:52.755 [INFO][4974] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.8.3/26] block=192.168.8.0/26 handle="k8s-pod-network.2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.779948 containerd[1652]: 2025-12-12 17:29:52.755 [INFO][4974] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.3/26] handle="k8s-pod-network.2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.779948 containerd[1652]: 2025-12-12 17:29:52.755 [INFO][4974] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:29:52.779948 containerd[1652]: 2025-12-12 17:29:52.755 [INFO][4974] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.8.3/26] IPv6=[] ContainerID="2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" HandleID="k8s-pod-network.2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" Workload="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-eth0" Dec 12 17:29:52.780158 containerd[1652]: 2025-12-12 17:29:52.759 [INFO][4945] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" Namespace="kube-system" Pod="coredns-66bc5c9577-zmb9q" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"2a50682a-2098-46b3-a84d-a1bc6d985fd9", ResourceVersion:"865", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"", Pod:"coredns-66bc5c9577-zmb9q", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califc72c1de89a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:52.780158 containerd[1652]: 2025-12-12 17:29:52.760 [INFO][4945] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.3/32] ContainerID="2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" Namespace="kube-system" Pod="coredns-66bc5c9577-zmb9q" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-eth0" Dec 12 17:29:52.780158 containerd[1652]: 
2025-12-12 17:29:52.760 [INFO][4945] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califc72c1de89a ContainerID="2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" Namespace="kube-system" Pod="coredns-66bc5c9577-zmb9q" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-eth0" Dec 12 17:29:52.780158 containerd[1652]: 2025-12-12 17:29:52.762 [INFO][4945] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" Namespace="kube-system" Pod="coredns-66bc5c9577-zmb9q" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-eth0" Dec 12 17:29:52.780158 containerd[1652]: 2025-12-12 17:29:52.762 [INFO][4945] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" Namespace="kube-system" Pod="coredns-66bc5c9577-zmb9q" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"2a50682a-2098-46b3-a84d-a1bc6d985fd9", ResourceVersion:"865", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, 
Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba", Pod:"coredns-66bc5c9577-zmb9q", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califc72c1de89a", MAC:"22:4e:b9:a3:ec:50", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:52.781117 containerd[1652]: 2025-12-12 17:29:52.777 [INFO][4945] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" Namespace="kube-system" Pod="coredns-66bc5c9577-zmb9q" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--zmb9q-eth0" Dec 12 17:29:52.782764 systemd-networkd[1513]: cali74652a43a2f: Gained IPv6LL Dec 12 17:29:52.802605 containerd[1652]: time="2025-12-12T17:29:52.802560139Z" level=info msg="connecting to shim 2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba" 
address="unix:///run/containerd/s/ac6beed990058380a3c02e061ca0498be7076cfdeeb9bd9c414b405146eecf44" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:29:52.828668 systemd[1]: Started cri-containerd-2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba.scope - libcontainer container 2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba. Dec 12 17:29:52.878634 systemd-networkd[1513]: cali1ed00bf3d68: Link UP Dec 12 17:29:52.880313 systemd-networkd[1513]: cali1ed00bf3d68: Gained carrier Dec 12 17:29:52.881607 containerd[1652]: time="2025-12-12T17:29:52.881568225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-zmb9q,Uid:2a50682a-2098-46b3-a84d-a1bc6d985fd9,Namespace:kube-system,Attempt:0,} returns sandbox id \"2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba\"" Dec 12 17:29:52.888666 containerd[1652]: time="2025-12-12T17:29:52.888602443Z" level=info msg="CreateContainer within sandbox \"2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.685 [INFO][4948] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-eth0 calico-apiserver-7d565b7c7c- calico-apiserver 419c68ca-a927-4109-af9a-f076c2eb6b23 867 0 2025-12-12 17:29:23 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d565b7c7c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-d-e796afb129 calico-apiserver-7d565b7c7c-954tf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1ed00bf3d68 [] [] }} ContainerID="244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" 
Namespace="calico-apiserver" Pod="calico-apiserver-7d565b7c7c-954tf" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-" Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.685 [INFO][4948] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" Namespace="calico-apiserver" Pod="calico-apiserver-7d565b7c7c-954tf" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-eth0" Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.722 [INFO][4980] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" HandleID="k8s-pod-network.244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" Workload="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-eth0" Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.723 [INFO][4980] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" HandleID="k8s-pod-network.244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" Workload="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137df0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-d-e796afb129", "pod":"calico-apiserver-7d565b7c7c-954tf", "timestamp":"2025-12-12 17:29:52.722805452 +0000 UTC"}, Hostname:"ci-4459-2-2-d-e796afb129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.723 [INFO][4980] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.755 [INFO][4980] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.756 [INFO][4980] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-d-e796afb129' Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.826 [INFO][4980] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.839 [INFO][4980] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.846 [INFO][4980] ipam/ipam.go 511: Trying affinity for 192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.850 [INFO][4980] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.852 [INFO][4980] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.852 [INFO][4980] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.8.0/26 handle="k8s-pod-network.244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.854 [INFO][4980] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536 Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.859 [INFO][4980] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.8.0/26 handle="k8s-pod-network.244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" 
host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.868 [INFO][4980] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.8.4/26] block=192.168.8.0/26 handle="k8s-pod-network.244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.869 [INFO][4980] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.4/26] handle="k8s-pod-network.244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.869 [INFO][4980] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:29:52.899141 containerd[1652]: 2025-12-12 17:29:52.869 [INFO][4980] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.8.4/26] IPv6=[] ContainerID="244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" HandleID="k8s-pod-network.244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" Workload="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-eth0" Dec 12 17:29:52.900224 containerd[1652]: 2025-12-12 17:29:52.875 [INFO][4948] cni-plugin/k8s.go 418: Populated endpoint ContainerID="244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" Namespace="calico-apiserver" Pod="calico-apiserver-7d565b7c7c-954tf" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-eth0", GenerateName:"calico-apiserver-7d565b7c7c-", Namespace:"calico-apiserver", SelfLink:"", UID:"419c68ca-a927-4109-af9a-f076c2eb6b23", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 23, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d565b7c7c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"", Pod:"calico-apiserver-7d565b7c7c-954tf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1ed00bf3d68", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:52.900224 containerd[1652]: 2025-12-12 17:29:52.875 [INFO][4948] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.4/32] ContainerID="244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" Namespace="calico-apiserver" Pod="calico-apiserver-7d565b7c7c-954tf" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-eth0" Dec 12 17:29:52.900224 containerd[1652]: 2025-12-12 17:29:52.875 [INFO][4948] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1ed00bf3d68 ContainerID="244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" Namespace="calico-apiserver" Pod="calico-apiserver-7d565b7c7c-954tf" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-eth0" Dec 12 17:29:52.900224 containerd[1652]: 2025-12-12 17:29:52.881 [INFO][4948] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" Namespace="calico-apiserver" Pod="calico-apiserver-7d565b7c7c-954tf" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-eth0" Dec 12 17:29:52.900224 containerd[1652]: 2025-12-12 17:29:52.881 [INFO][4948] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" Namespace="calico-apiserver" Pod="calico-apiserver-7d565b7c7c-954tf" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-eth0", GenerateName:"calico-apiserver-7d565b7c7c-", Namespace:"calico-apiserver", SelfLink:"", UID:"419c68ca-a927-4109-af9a-f076c2eb6b23", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d565b7c7c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536", Pod:"calico-apiserver-7d565b7c7c-954tf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1ed00bf3d68", MAC:"da:7b:d0:16:2a:46", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:52.900224 containerd[1652]: 2025-12-12 17:29:52.896 [INFO][4948] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" Namespace="calico-apiserver" Pod="calico-apiserver-7d565b7c7c-954tf" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7d565b7c7c--954tf-eth0" Dec 12 17:29:52.907196 containerd[1652]: time="2025-12-12T17:29:52.906583890Z" level=info msg="Container d27535bd70df89db900cea04e509f24255abc0b24e3f0baf90ce5acdd9cd3aad: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:29:52.917037 containerd[1652]: time="2025-12-12T17:29:52.916973877Z" level=info msg="CreateContainer within sandbox \"2e3d8db177cfefb2fa1b05702403fbb797a0d400b465ea0d5e6c961c3cc94bba\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d27535bd70df89db900cea04e509f24255abc0b24e3f0baf90ce5acdd9cd3aad\"" Dec 12 17:29:52.917571 containerd[1652]: time="2025-12-12T17:29:52.917543998Z" level=info msg="StartContainer for \"d27535bd70df89db900cea04e509f24255abc0b24e3f0baf90ce5acdd9cd3aad\"" Dec 12 17:29:52.919097 containerd[1652]: time="2025-12-12T17:29:52.919053922Z" level=info msg="connecting to shim d27535bd70df89db900cea04e509f24255abc0b24e3f0baf90ce5acdd9cd3aad" address="unix:///run/containerd/s/ac6beed990058380a3c02e061ca0498be7076cfdeeb9bd9c414b405146eecf44" protocol=ttrpc version=3 Dec 12 17:29:52.928061 containerd[1652]: time="2025-12-12T17:29:52.928017425Z" level=info msg="connecting to shim 244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536" address="unix:///run/containerd/s/d3b76d6b85d353d19679ec2ad4a4ec962304c3d764cef50d3459fc2d76c77dad" namespace=k8s.io protocol=ttrpc version=3 
Dec 12 17:29:52.937599 systemd[1]: Started cri-containerd-d27535bd70df89db900cea04e509f24255abc0b24e3f0baf90ce5acdd9cd3aad.scope - libcontainer container d27535bd70df89db900cea04e509f24255abc0b24e3f0baf90ce5acdd9cd3aad. Dec 12 17:29:52.961956 systemd[1]: Started cri-containerd-244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536.scope - libcontainer container 244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536. Dec 12 17:29:52.978976 containerd[1652]: time="2025-12-12T17:29:52.978911437Z" level=info msg="StartContainer for \"d27535bd70df89db900cea04e509f24255abc0b24e3f0baf90ce5acdd9cd3aad\" returns successfully" Dec 12 17:29:53.001202 containerd[1652]: time="2025-12-12T17:29:53.001144535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d565b7c7c-954tf,Uid:419c68ca-a927-4109-af9a-f076c2eb6b23,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"244a563b75f733503cb86d937a08775151b85c2d0c991db24f396ba2ea425536\"" Dec 12 17:29:53.004091 containerd[1652]: time="2025-12-12T17:29:53.004031903Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:29:53.334668 containerd[1652]: time="2025-12-12T17:29:53.334590961Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:53.335960 containerd[1652]: time="2025-12-12T17:29:53.335910325Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:29:53.336405 containerd[1652]: time="2025-12-12T17:29:53.335981365Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:29:53.336481 kubelet[3235]: E1212 17:29:53.336207 3235 log.go:32] "PullImage from image service failed" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:53.336481 kubelet[3235]: E1212 17:29:53.336247 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:53.336481 kubelet[3235]: E1212 17:29:53.336322 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7d565b7c7c-954tf_calico-apiserver(419c68ca-a927-4109-af9a-f076c2eb6b23): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:53.336481 kubelet[3235]: E1212 17:29:53.336353 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23" Dec 12 17:29:53.622327 containerd[1652]: time="2025-12-12T17:29:53.622231388Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-66bc5c9577-6r5pt,Uid:ef22260f-549b-4cab-ad32-7abdf107199f,Namespace:kube-system,Attempt:0,}" Dec 12 17:29:53.725653 systemd-networkd[1513]: calid2e780b3078: Link UP Dec 12 17:29:53.726075 systemd-networkd[1513]: calid2e780b3078: Gained carrier Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.659 [INFO][5137] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-eth0 coredns-66bc5c9577- kube-system ef22260f-549b-4cab-ad32-7abdf107199f 870 0 2025-12-12 17:29:11 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-d-e796afb129 coredns-66bc5c9577-6r5pt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid2e780b3078 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" Namespace="kube-system" Pod="coredns-66bc5c9577-6r5pt" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-" Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.659 [INFO][5137] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" Namespace="kube-system" Pod="coredns-66bc5c9577-6r5pt" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-eth0" Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.682 [INFO][5152] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" HandleID="k8s-pod-network.19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" 
Workload="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-eth0" Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.682 [INFO][5152] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" HandleID="k8s-pod-network.19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" Workload="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c31f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-d-e796afb129", "pod":"coredns-66bc5c9577-6r5pt", "timestamp":"2025-12-12 17:29:53.682555825 +0000 UTC"}, Hostname:"ci-4459-2-2-d-e796afb129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.682 [INFO][5152] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.682 [INFO][5152] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.682 [INFO][5152] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-d-e796afb129' Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.693 [INFO][5152] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.698 [INFO][5152] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.703 [INFO][5152] ipam/ipam.go 511: Trying affinity for 192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.705 [INFO][5152] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.708 [INFO][5152] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.708 [INFO][5152] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.8.0/26 handle="k8s-pod-network.19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.709 [INFO][5152] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898 Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.715 [INFO][5152] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.8.0/26 handle="k8s-pod-network.19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.721 [INFO][5152] ipam/ipam.go 1262: Successfully claimed 
IPs: [192.168.8.5/26] block=192.168.8.0/26 handle="k8s-pod-network.19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.721 [INFO][5152] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.5/26] handle="k8s-pod-network.19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.721 [INFO][5152] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:29:53.744612 containerd[1652]: 2025-12-12 17:29:53.721 [INFO][5152] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.8.5/26] IPv6=[] ContainerID="19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" HandleID="k8s-pod-network.19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" Workload="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-eth0" Dec 12 17:29:53.745341 containerd[1652]: 2025-12-12 17:29:53.723 [INFO][5137] cni-plugin/k8s.go 418: Populated endpoint ContainerID="19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" Namespace="kube-system" Pod="coredns-66bc5c9577-6r5pt" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"ef22260f-549b-4cab-ad32-7abdf107199f", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"", Pod:"coredns-66bc5c9577-6r5pt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid2e780b3078", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:53.745341 containerd[1652]: 2025-12-12 17:29:53.724 [INFO][5137] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.5/32] ContainerID="19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" Namespace="kube-system" Pod="coredns-66bc5c9577-6r5pt" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-eth0" Dec 12 17:29:53.745341 containerd[1652]: 2025-12-12 17:29:53.724 [INFO][5137] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid2e780b3078 
ContainerID="19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" Namespace="kube-system" Pod="coredns-66bc5c9577-6r5pt" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-eth0" Dec 12 17:29:53.745341 containerd[1652]: 2025-12-12 17:29:53.726 [INFO][5137] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" Namespace="kube-system" Pod="coredns-66bc5c9577-6r5pt" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-eth0" Dec 12 17:29:53.745341 containerd[1652]: 2025-12-12 17:29:53.726 [INFO][5137] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" Namespace="kube-system" Pod="coredns-66bc5c9577-6r5pt" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"ef22260f-549b-4cab-ad32-7abdf107199f", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", 
ContainerID:"19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898", Pod:"coredns-66bc5c9577-6r5pt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid2e780b3078", MAC:"d6:43:dc:20:c2:78", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:53.745716 containerd[1652]: 2025-12-12 17:29:53.741 [INFO][5137] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" Namespace="kube-system" Pod="coredns-66bc5c9577-6r5pt" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-coredns--66bc5c9577--6r5pt-eth0" Dec 12 17:29:53.754077 kubelet[3235]: E1212 17:29:53.754015 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve 
reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23" Dec 12 17:29:53.760060 kubelet[3235]: E1212 17:29:53.759930 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a" Dec 12 17:29:53.775362 containerd[1652]: time="2025-12-12T17:29:53.775170866Z" level=info msg="connecting to shim 19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898" address="unix:///run/containerd/s/361a1b1a55e945967caf053085908fb2071750b2a9c838ad4606f7825d6be094" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:29:53.795739 kubelet[3235]: I1212 17:29:53.795524 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-zmb9q" podStartSLOduration=42.795502079 podStartE2EDuration="42.795502079s" podCreationTimestamp="2025-12-12 17:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:29:53.791719509 +0000 UTC m=+50.273958994" watchObservedRunningTime="2025-12-12 17:29:53.795502079 +0000 UTC m=+50.277741524" Dec 12 17:29:53.817598 systemd[1]: Started cri-containerd-19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898.scope - libcontainer container 19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898. 
Dec 12 17:29:53.862874 containerd[1652]: time="2025-12-12T17:29:53.862800893Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-6r5pt,Uid:ef22260f-549b-4cab-ad32-7abdf107199f,Namespace:kube-system,Attempt:0,} returns sandbox id \"19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898\"" Dec 12 17:29:53.869960 containerd[1652]: time="2025-12-12T17:29:53.869920832Z" level=info msg="CreateContainer within sandbox \"19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 17:29:53.886194 containerd[1652]: time="2025-12-12T17:29:53.886092994Z" level=info msg="Container 016b55816fb255cc9f7a593d6d14ef682e1050b64bb361be5f488489e977bd80: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:29:53.897872 containerd[1652]: time="2025-12-12T17:29:53.897824264Z" level=info msg="CreateContainer within sandbox \"19536b66a4034bff1d38e97a15c12dfbda5ec1f17e29237ff162c25130fc3898\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"016b55816fb255cc9f7a593d6d14ef682e1050b64bb361be5f488489e977bd80\"" Dec 12 17:29:53.899941 containerd[1652]: time="2025-12-12T17:29:53.898677427Z" level=info msg="StartContainer for \"016b55816fb255cc9f7a593d6d14ef682e1050b64bb361be5f488489e977bd80\"" Dec 12 17:29:53.899941 containerd[1652]: time="2025-12-12T17:29:53.899626949Z" level=info msg="connecting to shim 016b55816fb255cc9f7a593d6d14ef682e1050b64bb361be5f488489e977bd80" address="unix:///run/containerd/s/361a1b1a55e945967caf053085908fb2071750b2a9c838ad4606f7825d6be094" protocol=ttrpc version=3 Dec 12 17:29:53.918569 systemd[1]: Started cri-containerd-016b55816fb255cc9f7a593d6d14ef682e1050b64bb361be5f488489e977bd80.scope - libcontainer container 016b55816fb255cc9f7a593d6d14ef682e1050b64bb361be5f488489e977bd80. 
Dec 12 17:29:53.952118 containerd[1652]: time="2025-12-12T17:29:53.952078885Z" level=info msg="StartContainer for \"016b55816fb255cc9f7a593d6d14ef682e1050b64bb361be5f488489e977bd80\" returns successfully" Dec 12 17:29:54.062690 systemd-networkd[1513]: califc72c1de89a: Gained IPv6LL Dec 12 17:29:54.622020 containerd[1652]: time="2025-12-12T17:29:54.621984905Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bccd8547-rp9k9,Uid:f10953b5-1654-43b7-bb1b-acb41105d201,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:29:54.623573 containerd[1652]: time="2025-12-12T17:29:54.623545670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bccd8547-ssn5z,Uid:7a593db9-1db8-4942-815e-ae24e8a457d5,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:29:54.628326 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3610174254.mount: Deactivated successfully. Dec 12 17:29:54.638796 systemd-networkd[1513]: cali1ed00bf3d68: Gained IPv6LL Dec 12 17:29:54.743751 systemd-networkd[1513]: calib178262964d: Link UP Dec 12 17:29:54.744209 systemd-networkd[1513]: calib178262964d: Gained carrier Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.671 [INFO][5253] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-eth0 calico-apiserver-7bccd8547- calico-apiserver f10953b5-1654-43b7-bb1b-acb41105d201 868 0 2025-12-12 17:29:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7bccd8547 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-d-e796afb129 calico-apiserver-7bccd8547-rp9k9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib178262964d [] [] }} 
ContainerID="51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-rp9k9" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-" Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.671 [INFO][5253] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-rp9k9" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-eth0" Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.698 [INFO][5283] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" HandleID="k8s-pod-network.51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" Workload="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-eth0" Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.698 [INFO][5283] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" HandleID="k8s-pod-network.51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" Workload="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000323390), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-d-e796afb129", "pod":"calico-apiserver-7bccd8547-rp9k9", "timestamp":"2025-12-12 17:29:54.698347024 +0000 UTC"}, Hostname:"ci-4459-2-2-d-e796afb129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.698 [INFO][5283] 
ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.698 [INFO][5283] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.698 [INFO][5283] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-d-e796afb129' Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.709 [INFO][5283] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.714 [INFO][5283] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.719 [INFO][5283] ipam/ipam.go 511: Trying affinity for 192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.722 [INFO][5283] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.725 [INFO][5283] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.725 [INFO][5283] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.8.0/26 handle="k8s-pod-network.51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.727 [INFO][5283] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.732 [INFO][5283] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.8.0/26 
handle="k8s-pod-network.51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.738 [INFO][5283] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.8.6/26] block=192.168.8.0/26 handle="k8s-pod-network.51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.738 [INFO][5283] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.6/26] handle="k8s-pod-network.51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.739 [INFO][5283] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:29:54.761850 containerd[1652]: 2025-12-12 17:29:54.739 [INFO][5283] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.8.6/26] IPv6=[] ContainerID="51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" HandleID="k8s-pod-network.51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" Workload="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-eth0" Dec 12 17:29:54.762622 containerd[1652]: 2025-12-12 17:29:54.741 [INFO][5253] cni-plugin/k8s.go 418: Populated endpoint ContainerID="51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-rp9k9" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-eth0", GenerateName:"calico-apiserver-7bccd8547-", Namespace:"calico-apiserver", SelfLink:"", UID:"f10953b5-1654-43b7-bb1b-acb41105d201", ResourceVersion:"868", Generation:0, 
CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7bccd8547", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"", Pod:"calico-apiserver-7bccd8547-rp9k9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib178262964d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:54.762622 containerd[1652]: 2025-12-12 17:29:54.741 [INFO][5253] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.6/32] ContainerID="51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-rp9k9" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-eth0" Dec 12 17:29:54.762622 containerd[1652]: 2025-12-12 17:29:54.741 [INFO][5253] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib178262964d ContainerID="51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-rp9k9" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-eth0" Dec 12 17:29:54.762622 containerd[1652]: 2025-12-12 17:29:54.744 [INFO][5253] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-rp9k9" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-eth0" Dec 12 17:29:54.762622 containerd[1652]: 2025-12-12 17:29:54.745 [INFO][5253] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-rp9k9" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-eth0", GenerateName:"calico-apiserver-7bccd8547-", Namespace:"calico-apiserver", SelfLink:"", UID:"f10953b5-1654-43b7-bb1b-acb41105d201", ResourceVersion:"868", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7bccd8547", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a", Pod:"calico-apiserver-7bccd8547-rp9k9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.6/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib178262964d", MAC:"36:da:75:85:7b:0f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:54.762622 containerd[1652]: 2025-12-12 17:29:54.759 [INFO][5253] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-rp9k9" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--rp9k9-eth0" Dec 12 17:29:54.769026 kubelet[3235]: E1212 17:29:54.768889 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23" Dec 12 17:29:54.789902 containerd[1652]: time="2025-12-12T17:29:54.789857782Z" level=info msg="connecting to shim 51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a" address="unix:///run/containerd/s/a4b0906374d26a070614fc93db34d587f5847f4ef4375e9479215c16acf59343" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:29:54.804980 kubelet[3235]: I1212 17:29:54.804905 3235 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-6r5pt" podStartSLOduration=43.804885901 podStartE2EDuration="43.804885901s" podCreationTimestamp="2025-12-12 17:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:29:54.801893853 +0000 UTC m=+51.284133338" watchObservedRunningTime="2025-12-12 17:29:54.804885901 +0000 UTC m=+51.287125386" Dec 12 17:29:54.820648 systemd[1]: Started cri-containerd-51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a.scope - libcontainer container 51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a. Dec 12 17:29:54.850284 systemd-networkd[1513]: cali73e6ac1f29a: Link UP Dec 12 17:29:54.850665 systemd-networkd[1513]: cali73e6ac1f29a: Gained carrier Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.677 [INFO][5260] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-eth0 calico-apiserver-7bccd8547- calico-apiserver 7a593db9-1db8-4942-815e-ae24e8a457d5 866 0 2025-12-12 17:29:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7bccd8547 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-d-e796afb129 calico-apiserver-7bccd8547-ssn5z eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali73e6ac1f29a [] [] }} ContainerID="55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-ssn5z" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-" Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.677 [INFO][5260] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-ssn5z" 
WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-eth0" Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.706 [INFO][5289] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" HandleID="k8s-pod-network.55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" Workload="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-eth0" Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.706 [INFO][5289] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" HandleID="k8s-pod-network.55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" Workload="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000323390), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-d-e796afb129", "pod":"calico-apiserver-7bccd8547-ssn5z", "timestamp":"2025-12-12 17:29:54.706278004 +0000 UTC"}, Hostname:"ci-4459-2-2-d-e796afb129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.706 [INFO][5289] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.739 [INFO][5289] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.739 [INFO][5289] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-d-e796afb129' Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.810 [INFO][5289] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.815 [INFO][5289] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.823 [INFO][5289] ipam/ipam.go 511: Trying affinity for 192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.826 [INFO][5289] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.829 [INFO][5289] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.829 [INFO][5289] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.8.0/26 handle="k8s-pod-network.55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.831 [INFO][5289] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286 Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.837 [INFO][5289] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.8.0/26 handle="k8s-pod-network.55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.843 [INFO][5289] ipam/ipam.go 1262: Successfully claimed 
IPs: [192.168.8.7/26] block=192.168.8.0/26 handle="k8s-pod-network.55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.843 [INFO][5289] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.7/26] handle="k8s-pod-network.55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.844 [INFO][5289] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:29:54.867882 containerd[1652]: 2025-12-12 17:29:54.844 [INFO][5289] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.8.7/26] IPv6=[] ContainerID="55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" HandleID="k8s-pod-network.55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" Workload="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-eth0" Dec 12 17:29:54.868645 containerd[1652]: 2025-12-12 17:29:54.848 [INFO][5260] cni-plugin/k8s.go 418: Populated endpoint ContainerID="55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-ssn5z" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-eth0", GenerateName:"calico-apiserver-7bccd8547-", Namespace:"calico-apiserver", SelfLink:"", UID:"7a593db9-1db8-4942-815e-ae24e8a457d5", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"7bccd8547", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"", Pod:"calico-apiserver-7bccd8547-ssn5z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali73e6ac1f29a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:54.868645 containerd[1652]: 2025-12-12 17:29:54.848 [INFO][5260] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.7/32] ContainerID="55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-ssn5z" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-eth0" Dec 12 17:29:54.868645 containerd[1652]: 2025-12-12 17:29:54.848 [INFO][5260] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali73e6ac1f29a ContainerID="55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-ssn5z" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-eth0" Dec 12 17:29:54.868645 containerd[1652]: 2025-12-12 17:29:54.850 [INFO][5260] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-ssn5z" 
WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-eth0" Dec 12 17:29:54.868645 containerd[1652]: 2025-12-12 17:29:54.851 [INFO][5260] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-ssn5z" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-eth0", GenerateName:"calico-apiserver-7bccd8547-", Namespace:"calico-apiserver", SelfLink:"", UID:"7a593db9-1db8-4942-815e-ae24e8a457d5", ResourceVersion:"866", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7bccd8547", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286", Pod:"calico-apiserver-7bccd8547-ssn5z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali73e6ac1f29a", MAC:"32:11:8a:51:c0:20", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:54.868645 containerd[1652]: 2025-12-12 17:29:54.864 [INFO][5260] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" Namespace="calico-apiserver" Pod="calico-apiserver-7bccd8547-ssn5z" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--apiserver--7bccd8547--ssn5z-eth0" Dec 12 17:29:54.883726 containerd[1652]: time="2025-12-12T17:29:54.883614545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bccd8547-rp9k9,Uid:f10953b5-1654-43b7-bb1b-acb41105d201,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"51ca818ac480a7337cf3f7c84c3db314d75238f539c91bcb705d90a373b9074a\"" Dec 12 17:29:54.887720 containerd[1652]: time="2025-12-12T17:29:54.886295072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:29:54.895890 containerd[1652]: time="2025-12-12T17:29:54.895847337Z" level=info msg="connecting to shim 55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286" address="unix:///run/containerd/s/bfd005f8d35310e4ef56d14bb753a955f061af46f83edb3aa155150600801f1d" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:29:54.919588 systemd[1]: Started cri-containerd-55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286.scope - libcontainer container 55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286. 
Dec 12 17:29:54.951118 containerd[1652]: time="2025-12-12T17:29:54.951068680Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bccd8547-ssn5z,Uid:7a593db9-1db8-4942-815e-ae24e8a457d5,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"55000cfbf9bf55cf6d8b906c6e2121007fd9ec7a572c2e0d6ce40f2203f9c286\"" Dec 12 17:29:55.224564 containerd[1652]: time="2025-12-12T17:29:55.224424150Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:55.225885 containerd[1652]: time="2025-12-12T17:29:55.225781114Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:29:55.225885 containerd[1652]: time="2025-12-12T17:29:55.225866714Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:29:55.226064 kubelet[3235]: E1212 17:29:55.226007 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:55.226128 kubelet[3235]: E1212 17:29:55.226066 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:55.226301 kubelet[3235]: E1212 17:29:55.226241 3235 kuberuntime_manager.go:1449] 
"Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7bccd8547-rp9k9_calico-apiserver(f10953b5-1654-43b7-bb1b-acb41105d201): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:55.226345 kubelet[3235]: E1212 17:29:55.226295 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201" Dec 12 17:29:55.226685 containerd[1652]: time="2025-12-12T17:29:55.226647916Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:29:55.558591 containerd[1652]: time="2025-12-12T17:29:55.558429458Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:55.559773 containerd[1652]: time="2025-12-12T17:29:55.559735661Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:29:55.559833 containerd[1652]: time="2025-12-12T17:29:55.559791982Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:29:55.560056 kubelet[3235]: E1212 17:29:55.560021 3235 log.go:32] "PullImage from image service failed" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:55.560115 kubelet[3235]: E1212 17:29:55.560075 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:55.560505 kubelet[3235]: E1212 17:29:55.560226 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7bccd8547-ssn5z_calico-apiserver(7a593db9-1db8-4942-815e-ae24e8a457d5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:55.560505 kubelet[3235]: E1212 17:29:55.560278 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:29:55.623158 containerd[1652]: time="2025-12-12T17:29:55.623115626Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-68c844c6f6-fvk5j,Uid:f134713e-1f0f-4a16-9202-6adaa12db341,Namespace:calico-system,Attempt:0,}" Dec 12 17:29:55.624233 containerd[1652]: time="2025-12-12T17:29:55.624206949Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ksbxj,Uid:5bf56c15-e8b7-4324-9dd0-89444eba43fb,Namespace:calico-system,Attempt:0,}" Dec 12 17:29:55.727035 systemd-networkd[1513]: calid2e780b3078: Gained IPv6LL Dec 12 17:29:55.757919 systemd-networkd[1513]: cali3eabb8f771b: Link UP Dec 12 17:29:55.758596 systemd-networkd[1513]: cali3eabb8f771b: Gained carrier Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.668 [INFO][5417] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-eth0 calico-kube-controllers-68c844c6f6- calico-system f134713e-1f0f-4a16-9202-6adaa12db341 869 0 2025-12-12 17:29:29 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:68c844c6f6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-d-e796afb129 calico-kube-controllers-68c844c6f6-fvk5j eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali3eabb8f771b [] [] }} ContainerID="6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" Namespace="calico-system" Pod="calico-kube-controllers-68c844c6f6-fvk5j" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-" Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.669 [INFO][5417] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" Namespace="calico-system" Pod="calico-kube-controllers-68c844c6f6-fvk5j" 
WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-eth0" Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.701 [INFO][5449] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" HandleID="k8s-pod-network.6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" Workload="ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-eth0" Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.701 [INFO][5449] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" HandleID="k8s-pod-network.6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" Workload="ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000136dd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-d-e796afb129", "pod":"calico-kube-controllers-68c844c6f6-fvk5j", "timestamp":"2025-12-12 17:29:55.701353189 +0000 UTC"}, Hostname:"ci-4459-2-2-d-e796afb129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.701 [INFO][5449] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.701 [INFO][5449] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.701 [INFO][5449] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-d-e796afb129' Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.711 [INFO][5449] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.718 [INFO][5449] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.722 [INFO][5449] ipam/ipam.go 511: Trying affinity for 192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.724 [INFO][5449] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.728 [INFO][5449] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.729 [INFO][5449] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.8.0/26 handle="k8s-pod-network.6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.731 [INFO][5449] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3 Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.745 [INFO][5449] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.8.0/26 handle="k8s-pod-network.6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.753 [INFO][5449] ipam/ipam.go 1262: Successfully claimed 
IPs: [192.168.8.8/26] block=192.168.8.0/26 handle="k8s-pod-network.6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.753 [INFO][5449] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.8/26] handle="k8s-pod-network.6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.753 [INFO][5449] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:29:55.771647 containerd[1652]: 2025-12-12 17:29:55.753 [INFO][5449] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.8.8/26] IPv6=[] ContainerID="6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" HandleID="k8s-pod-network.6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" Workload="ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-eth0" Dec 12 17:29:55.774769 containerd[1652]: 2025-12-12 17:29:55.755 [INFO][5417] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" Namespace="calico-system" Pod="calico-kube-controllers-68c844c6f6-fvk5j" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-eth0", GenerateName:"calico-kube-controllers-68c844c6f6-", Namespace:"calico-system", SelfLink:"", UID:"f134713e-1f0f-4a16-9202-6adaa12db341", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", 
"k8s-app":"calico-kube-controllers", "pod-template-hash":"68c844c6f6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"", Pod:"calico-kube-controllers-68c844c6f6-fvk5j", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.8.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3eabb8f771b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:55.774769 containerd[1652]: 2025-12-12 17:29:55.755 [INFO][5417] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.8/32] ContainerID="6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" Namespace="calico-system" Pod="calico-kube-controllers-68c844c6f6-fvk5j" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-eth0" Dec 12 17:29:55.774769 containerd[1652]: 2025-12-12 17:29:55.755 [INFO][5417] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3eabb8f771b ContainerID="6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" Namespace="calico-system" Pod="calico-kube-controllers-68c844c6f6-fvk5j" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-eth0" Dec 12 17:29:55.774769 containerd[1652]: 2025-12-12 17:29:55.759 [INFO][5417] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" Namespace="calico-system" 
Pod="calico-kube-controllers-68c844c6f6-fvk5j" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-eth0" Dec 12 17:29:55.774769 containerd[1652]: 2025-12-12 17:29:55.759 [INFO][5417] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" Namespace="calico-system" Pod="calico-kube-controllers-68c844c6f6-fvk5j" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-eth0", GenerateName:"calico-kube-controllers-68c844c6f6-", Namespace:"calico-system", SelfLink:"", UID:"f134713e-1f0f-4a16-9202-6adaa12db341", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"68c844c6f6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3", Pod:"calico-kube-controllers-68c844c6f6-fvk5j", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.8.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3eabb8f771b", MAC:"da:42:a0:b5:3d:93", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:55.774769 containerd[1652]: 2025-12-12 17:29:55.767 [INFO][5417] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" Namespace="calico-system" Pod="calico-kube-controllers-68c844c6f6-fvk5j" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-calico--kube--controllers--68c844c6f6--fvk5j-eth0" Dec 12 17:29:55.782513 kubelet[3235]: E1212 17:29:55.782232 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:29:55.788473 kubelet[3235]: E1212 17:29:55.788426 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201" Dec 12 17:29:55.824979 containerd[1652]: time="2025-12-12T17:29:55.824340909Z" level=info msg="connecting to 
shim 6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3" address="unix:///run/containerd/s/fc45049b36f707b1cb99ed2855ba4fca731e81a6a2c57bdd8ee05dd7435b00b9" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:29:55.857573 systemd[1]: Started cri-containerd-6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3.scope - libcontainer container 6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3. Dec 12 17:29:55.876766 systemd-networkd[1513]: calid6d5fd2598d: Link UP Dec 12 17:29:55.878674 systemd-networkd[1513]: calid6d5fd2598d: Gained carrier Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.674 [INFO][5431] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-eth0 csi-node-driver- calico-system 5bf56c15-e8b7-4324-9dd0-89444eba43fb 767 0 2025-12-12 17:29:29 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-2-d-e796afb129 csi-node-driver-ksbxj eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid6d5fd2598d [] [] }} ContainerID="aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" Namespace="calico-system" Pod="csi-node-driver-ksbxj" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-" Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.676 [INFO][5431] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" Namespace="calico-system" Pod="csi-node-driver-ksbxj" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-eth0" Dec 12 17:29:55.897549 containerd[1652]: 
2025-12-12 17:29:55.702 [INFO][5456] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" HandleID="k8s-pod-network.aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" Workload="ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-eth0" Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.703 [INFO][5456] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" HandleID="k8s-pod-network.aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" Workload="ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c2fd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-d-e796afb129", "pod":"csi-node-driver-ksbxj", "timestamp":"2025-12-12 17:29:55.702987233 +0000 UTC"}, Hostname:"ci-4459-2-2-d-e796afb129", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.703 [INFO][5456] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.753 [INFO][5456] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.754 [INFO][5456] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-d-e796afb129' Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.812 [INFO][5456] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.833 [INFO][5456] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.844 [INFO][5456] ipam/ipam.go 511: Trying affinity for 192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.847 [INFO][5456] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.852 [INFO][5456] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.0/26 host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.852 [INFO][5456] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.8.0/26 handle="k8s-pod-network.aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.854 [INFO][5456] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3 Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.861 [INFO][5456] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.8.0/26 handle="k8s-pod-network.aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.869 [INFO][5456] ipam/ipam.go 1262: Successfully claimed 
IPs: [192.168.8.9/26] block=192.168.8.0/26 handle="k8s-pod-network.aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.869 [INFO][5456] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.9/26] handle="k8s-pod-network.aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" host="ci-4459-2-2-d-e796afb129" Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.869 [INFO][5456] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:29:55.897549 containerd[1652]: 2025-12-12 17:29:55.869 [INFO][5456] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.8.9/26] IPv6=[] ContainerID="aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" HandleID="k8s-pod-network.aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" Workload="ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-eth0" Dec 12 17:29:55.898288 containerd[1652]: 2025-12-12 17:29:55.872 [INFO][5431] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" Namespace="calico-system" Pod="csi-node-driver-ksbxj" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"5bf56c15-e8b7-4324-9dd0-89444eba43fb", ResourceVersion:"767", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"", Pod:"csi-node-driver-ksbxj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.8.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid6d5fd2598d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:55.898288 containerd[1652]: 2025-12-12 17:29:55.873 [INFO][5431] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.9/32] ContainerID="aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" Namespace="calico-system" Pod="csi-node-driver-ksbxj" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-eth0" Dec 12 17:29:55.898288 containerd[1652]: 2025-12-12 17:29:55.873 [INFO][5431] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid6d5fd2598d ContainerID="aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" Namespace="calico-system" Pod="csi-node-driver-ksbxj" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-eth0" Dec 12 17:29:55.898288 containerd[1652]: 2025-12-12 17:29:55.878 [INFO][5431] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" Namespace="calico-system" Pod="csi-node-driver-ksbxj" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-eth0" Dec 12 17:29:55.898288 containerd[1652]: 2025-12-12 
17:29:55.879 [INFO][5431] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" Namespace="calico-system" Pod="csi-node-driver-ksbxj" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"5bf56c15-e8b7-4324-9dd0-89444eba43fb", ResourceVersion:"767", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 29, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-d-e796afb129", ContainerID:"aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3", Pod:"csi-node-driver-ksbxj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.8.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid6d5fd2598d", MAC:"5a:87:93:2a:47:3b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:29:55.898288 containerd[1652]: 2025-12-12 17:29:55.893 
[INFO][5431] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" Namespace="calico-system" Pod="csi-node-driver-ksbxj" WorkloadEndpoint="ci--4459--2--2--d--e796afb129-k8s-csi--node--driver--ksbxj-eth0" Dec 12 17:29:55.930663 containerd[1652]: time="2025-12-12T17:29:55.930528865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68c844c6f6-fvk5j,Uid:f134713e-1f0f-4a16-9202-6adaa12db341,Namespace:calico-system,Attempt:0,} returns sandbox id \"6974daa79e3fbcc4dbf8d7ea8367133196091f7a3a28489c8ef10592f8037aa3\"" Dec 12 17:29:55.934640 containerd[1652]: time="2025-12-12T17:29:55.934237434Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:29:55.937722 containerd[1652]: time="2025-12-12T17:29:55.937563043Z" level=info msg="connecting to shim aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3" address="unix:///run/containerd/s/602fb76f0f4bf2552b31c9342fd50c6a7f33ad049b451d3b131f5460d144d9a9" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:29:55.962594 systemd[1]: Started cri-containerd-aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3.scope - libcontainer container aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3. 
Dec 12 17:29:55.987223 containerd[1652]: time="2025-12-12T17:29:55.987181492Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ksbxj,Uid:5bf56c15-e8b7-4324-9dd0-89444eba43fb,Namespace:calico-system,Attempt:0,} returns sandbox id \"aad66d38329d06d58cd84ee0f472502809b380d0c4caa95306fdf193372681d3\"" Dec 12 17:29:56.260487 containerd[1652]: time="2025-12-12T17:29:56.260447762Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:56.262238 containerd[1652]: time="2025-12-12T17:29:56.262162366Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:29:56.262344 containerd[1652]: time="2025-12-12T17:29:56.262229846Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:29:56.263539 kubelet[3235]: E1212 17:29:56.262445 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:29:56.263539 kubelet[3235]: E1212 17:29:56.262490 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:29:56.263539 
kubelet[3235]: E1212 17:29:56.262628 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-68c844c6f6-fvk5j_calico-system(f134713e-1f0f-4a16-9202-6adaa12db341): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:56.263539 kubelet[3235]: E1212 17:29:56.262660 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:29:56.263802 containerd[1652]: time="2025-12-12T17:29:56.263274689Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:29:56.366934 systemd-networkd[1513]: cali73e6ac1f29a: Gained IPv6LL Dec 12 17:29:56.430630 systemd-networkd[1513]: calib178262964d: Gained IPv6LL Dec 12 17:29:56.584141 containerd[1652]: time="2025-12-12T17:29:56.584013162Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:56.585571 containerd[1652]: time="2025-12-12T17:29:56.585525246Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:29:56.585636 containerd[1652]: 
time="2025-12-12T17:29:56.585601086Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:29:56.585823 kubelet[3235]: E1212 17:29:56.585784 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:29:56.585909 kubelet[3235]: E1212 17:29:56.585892 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:29:56.586033 kubelet[3235]: E1212 17:29:56.586017 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-ksbxj_calico-system(5bf56c15-e8b7-4324-9dd0-89444eba43fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:56.589384 containerd[1652]: time="2025-12-12T17:29:56.589134975Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:29:56.789199 kubelet[3235]: E1212 17:29:56.789158 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201" Dec 12 17:29:56.790693 kubelet[3235]: E1212 17:29:56.789197 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:29:56.790693 kubelet[3235]: E1212 17:29:56.789273 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:29:56.928505 containerd[1652]: time="2025-12-12T17:29:56.928388057Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:56.929762 containerd[1652]: time="2025-12-12T17:29:56.929696540Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve 
reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:29:56.929858 containerd[1652]: time="2025-12-12T17:29:56.929797460Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:29:56.930081 kubelet[3235]: E1212 17:29:56.930039 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:29:56.930240 kubelet[3235]: E1212 17:29:56.930088 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:29:56.930240 kubelet[3235]: E1212 17:29:56.930167 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-ksbxj_calico-system(5bf56c15-e8b7-4324-9dd0-89444eba43fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:56.930240 kubelet[3235]: E1212 17:29:56.930214 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc 
error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:29:56.942909 systemd-networkd[1513]: calid6d5fd2598d: Gained IPv6LL Dec 12 17:29:57.006673 systemd-networkd[1513]: cali3eabb8f771b: Gained IPv6LL Dec 12 17:29:57.793139 kubelet[3235]: E1212 17:29:57.793063 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:29:57.794359 kubelet[3235]: E1212 17:29:57.793088 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:29:59.621556 containerd[1652]: time="2025-12-12T17:29:59.621522532Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:29:59.974237 containerd[1652]: time="2025-12-12T17:29:59.974071048Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:59.975501 containerd[1652]: time="2025-12-12T17:29:59.975457412Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:29:59.975597 containerd[1652]: time="2025-12-12T17:29:59.975540052Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:29:59.975706 kubelet[3235]: E1212 17:29:59.975668 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:29:59.975987 kubelet[3235]: E1212 17:29:59.975718 3235 kuberuntime_image.go:43] "Failed to pull 
image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:29:59.975987 kubelet[3235]: E1212 17:29:59.975793 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-6646878486-nf54l_calico-system(1aa8e517-9efe-4339-b128-b444ff23b3fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:59.976748 containerd[1652]: time="2025-12-12T17:29:59.976532815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:30:00.296821 containerd[1652]: time="2025-12-12T17:30:00.296710286Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:00.298305 containerd[1652]: time="2025-12-12T17:30:00.298267010Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:30:00.298399 containerd[1652]: time="2025-12-12T17:30:00.298314010Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:30:00.298735 kubelet[3235]: E1212 17:30:00.298520 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:30:00.298735 kubelet[3235]: E1212 17:30:00.298566 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:30:00.298735 kubelet[3235]: E1212 17:30:00.298640 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-6646878486-nf54l_calico-system(1aa8e517-9efe-4339-b128-b444ff23b3fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:00.298835 kubelet[3235]: E1212 17:30:00.298692 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb" Dec 
12 17:30:07.621603 containerd[1652]: time="2025-12-12T17:30:07.621499074Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:30:07.959984 containerd[1652]: time="2025-12-12T17:30:07.959790193Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:07.961547 containerd[1652]: time="2025-12-12T17:30:07.961502478Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:30:07.961638 containerd[1652]: time="2025-12-12T17:30:07.961539838Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:30:07.961776 kubelet[3235]: E1212 17:30:07.961720 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:07.961776 kubelet[3235]: E1212 17:30:07.961768 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:07.962082 kubelet[3235]: E1212 17:30:07.961901 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7d565b7c7c-954tf_calico-apiserver(419c68ca-a927-4109-af9a-f076c2eb6b23): ErrImagePull: rpc error: code = NotFound 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:07.962082 kubelet[3235]: E1212 17:30:07.961937 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23" Dec 12 17:30:07.962396 containerd[1652]: time="2025-12-12T17:30:07.962358840Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:30:08.301409 containerd[1652]: time="2025-12-12T17:30:08.301293320Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:08.302884 containerd[1652]: time="2025-12-12T17:30:08.302837844Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:30:08.302986 containerd[1652]: time="2025-12-12T17:30:08.302894724Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:30:08.303102 kubelet[3235]: E1212 17:30:08.303046 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:08.303102 kubelet[3235]: E1212 17:30:08.303090 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:08.303183 kubelet[3235]: E1212 17:30:08.303163 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7bccd8547-ssn5z_calico-apiserver(7a593db9-1db8-4942-815e-ae24e8a457d5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:08.303212 kubelet[3235]: E1212 17:30:08.303195 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:30:08.620829 containerd[1652]: time="2025-12-12T17:30:08.620726630Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:30:08.960671 containerd[1652]: time="2025-12-12T17:30:08.960491313Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:08.962573 containerd[1652]: time="2025-12-12T17:30:08.962539078Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:30:08.962656 containerd[1652]: time="2025-12-12T17:30:08.962614398Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:30:08.962815 kubelet[3235]: E1212 17:30:08.962754 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:30:08.963006 kubelet[3235]: E1212 17:30:08.962838 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:30:08.963006 kubelet[3235]: E1212 17:30:08.962979 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-fvf46_calico-system(afc05240-ed5c-4c99-8e0c-0cca61ebd35a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:08.963067 kubelet[3235]: E1212 17:30:08.963052 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: 
code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a" Dec 12 17:30:09.622829 containerd[1652]: time="2025-12-12T17:30:09.622790153Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:30:09.948635 containerd[1652]: time="2025-12-12T17:30:09.948501799Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:09.950164 containerd[1652]: time="2025-12-12T17:30:09.950130443Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:30:09.950232 containerd[1652]: time="2025-12-12T17:30:09.950157603Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:30:09.950466 kubelet[3235]: E1212 17:30:09.950416 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:09.950523 kubelet[3235]: E1212 17:30:09.950467 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:09.950566 kubelet[3235]: E1212 17:30:09.950541 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7bccd8547-rp9k9_calico-apiserver(f10953b5-1654-43b7-bb1b-acb41105d201): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:09.950600 kubelet[3235]: E1212 17:30:09.950575 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201" Dec 12 17:30:10.621427 containerd[1652]: time="2025-12-12T17:30:10.621121346Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:30:10.933332 containerd[1652]: time="2025-12-12T17:30:10.933185997Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:10.937946 containerd[1652]: time="2025-12-12T17:30:10.937892929Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:30:10.938056 containerd[1652]: time="2025-12-12T17:30:10.937968449Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:30:10.938200 kubelet[3235]: E1212 17:30:10.938140 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:30:10.938200 kubelet[3235]: E1212 17:30:10.938197 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:30:10.938626 kubelet[3235]: E1212 17:30:10.938271 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-68c844c6f6-fvk5j_calico-system(f134713e-1f0f-4a16-9202-6adaa12db341): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:10.938626 kubelet[3235]: E1212 17:30:10.938302 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:30:11.620765 containerd[1652]: time="2025-12-12T17:30:11.620700143Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:30:11.974479 containerd[1652]: time="2025-12-12T17:30:11.974297741Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:11.976606 containerd[1652]: time="2025-12-12T17:30:11.976558667Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:30:11.976678 containerd[1652]: time="2025-12-12T17:30:11.976662148Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:30:11.977039 kubelet[3235]: E1212 17:30:11.976801 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:30:11.977039 kubelet[3235]: E1212 17:30:11.976844 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:30:11.977039 kubelet[3235]: E1212 17:30:11.976918 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-ksbxj_calico-system(5bf56c15-e8b7-4324-9dd0-89444eba43fb): ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:11.978662 containerd[1652]: time="2025-12-12T17:30:11.978629593Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:30:12.504798 containerd[1652]: time="2025-12-12T17:30:12.504735599Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:12.509185 containerd[1652]: time="2025-12-12T17:30:12.509131451Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:30:12.509246 containerd[1652]: time="2025-12-12T17:30:12.509175651Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:30:12.509449 kubelet[3235]: E1212 17:30:12.509411 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:30:12.509521 kubelet[3235]: E1212 17:30:12.509460 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:30:12.509565 kubelet[3235]: E1212 17:30:12.509536 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-ksbxj_calico-system(5bf56c15-e8b7-4324-9dd0-89444eba43fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:12.509670 kubelet[3235]: E1212 17:30:12.509583 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:30:13.622021 kubelet[3235]: E1212 17:30:13.621960 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb" Dec 12 17:30:19.622431 kubelet[3235]: E1212 17:30:19.622303 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:30:21.622120 kubelet[3235]: E1212 17:30:21.621465 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:30:22.621250 kubelet[3235]: E1212 
17:30:22.621208 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23" Dec 12 17:30:22.623834 kubelet[3235]: E1212 17:30:22.623445 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:30:24.620708 kubelet[3235]: E1212 17:30:24.620644 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a" Dec 12 17:30:24.620708 kubelet[3235]: E1212 17:30:24.620686 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201" Dec 12 17:30:25.624076 containerd[1652]: time="2025-12-12T17:30:25.624037358Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:30:25.959407 containerd[1652]: time="2025-12-12T17:30:25.959151069Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:25.960952 containerd[1652]: time="2025-12-12T17:30:25.960830393Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:30:25.960952 containerd[1652]: time="2025-12-12T17:30:25.960918433Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:30:25.961108 kubelet[3235]: E1212 17:30:25.961070 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:30:25.961475 kubelet[3235]: E1212 17:30:25.961118 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:30:25.961475 kubelet[3235]: E1212 17:30:25.961195 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-6646878486-nf54l_calico-system(1aa8e517-9efe-4339-b128-b444ff23b3fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:25.963045 containerd[1652]: time="2025-12-12T17:30:25.962863358Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:30:26.303871 containerd[1652]: time="2025-12-12T17:30:26.303661164Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:26.306692 containerd[1652]: time="2025-12-12T17:30:26.306644611Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:30:26.306889 containerd[1652]: time="2025-12-12T17:30:26.306674452Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:30:26.307031 kubelet[3235]: E1212 17:30:26.306994 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:30:26.307102 kubelet[3235]: E1212 17:30:26.307041 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:30:26.307131 kubelet[3235]: E1212 17:30:26.307113 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-6646878486-nf54l_calico-system(1aa8e517-9efe-4339-b128-b444ff23b3fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:26.307184 kubelet[3235]: E1212 17:30:26.307153 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code 
= NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb" Dec 12 17:30:32.620569 containerd[1652]: time="2025-12-12T17:30:32.620526813Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:30:32.962835 containerd[1652]: time="2025-12-12T17:30:32.962680821Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:32.964941 containerd[1652]: time="2025-12-12T17:30:32.964876987Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:30:32.965135 containerd[1652]: time="2025-12-12T17:30:32.964884027Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:30:32.965221 kubelet[3235]: E1212 17:30:32.965155 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:30:32.965221 kubelet[3235]: E1212 17:30:32.965213 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:30:32.966050 kubelet[3235]: E1212 17:30:32.965293 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-68c844c6f6-fvk5j_calico-system(f134713e-1f0f-4a16-9202-6adaa12db341): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:32.966050 kubelet[3235]: E1212 17:30:32.965326 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:30:34.620294 containerd[1652]: time="2025-12-12T17:30:34.620241007Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:30:34.960103 containerd[1652]: time="2025-12-12T17:30:34.959866009Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:34.964219 containerd[1652]: time="2025-12-12T17:30:34.964172981Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:30:34.964291 containerd[1652]: time="2025-12-12T17:30:34.964259621Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:30:34.964462 kubelet[3235]: E1212 17:30:34.964425 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:34.964803 kubelet[3235]: E1212 17:30:34.964471 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:34.964803 kubelet[3235]: E1212 17:30:34.964542 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7bccd8547-ssn5z_calico-apiserver(7a593db9-1db8-4942-815e-ae24e8a457d5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:34.964803 kubelet[3235]: E1212 17:30:34.964573 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:30:35.621061 containerd[1652]: time="2025-12-12T17:30:35.620717086Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:30:35.959729 containerd[1652]: time="2025-12-12T17:30:35.959217965Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:35.962044 containerd[1652]: time="2025-12-12T17:30:35.961980613Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:30:35.962130 containerd[1652]: time="2025-12-12T17:30:35.962076213Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:30:35.962295 kubelet[3235]: E1212 17:30:35.962252 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:35.962355 kubelet[3235]: E1212 17:30:35.962303 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:35.962498 kubelet[3235]: E1212 17:30:35.962475 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container 
calico-apiserver start failed in pod calico-apiserver-7bccd8547-rp9k9_calico-apiserver(f10953b5-1654-43b7-bb1b-acb41105d201): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:35.962600 kubelet[3235]: E1212 17:30:35.962579 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201" Dec 12 17:30:35.962748 containerd[1652]: time="2025-12-12T17:30:35.962673454Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:30:36.300400 containerd[1652]: time="2025-12-12T17:30:36.300224371Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:36.303697 containerd[1652]: time="2025-12-12T17:30:36.303587340Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:30:36.303697 containerd[1652]: time="2025-12-12T17:30:36.303640020Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:30:36.303939 kubelet[3235]: E1212 17:30:36.303849 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed 
to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:30:36.304202 kubelet[3235]: E1212 17:30:36.303944 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:30:36.304202 kubelet[3235]: E1212 17:30:36.304135 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-fvf46_calico-system(afc05240-ed5c-4c99-8e0c-0cca61ebd35a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:36.304202 kubelet[3235]: E1212 17:30:36.304163 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a" Dec 12 17:30:37.620237 containerd[1652]: time="2025-12-12T17:30:37.620193720Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:30:37.950986 containerd[1652]: time="2025-12-12T17:30:37.950759579Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 
17:30:37.953702 containerd[1652]: time="2025-12-12T17:30:37.953476426Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:30:37.953702 containerd[1652]: time="2025-12-12T17:30:37.953579066Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:30:37.953847 kubelet[3235]: E1212 17:30:37.953741 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:37.953847 kubelet[3235]: E1212 17:30:37.953788 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:30:37.954135 kubelet[3235]: E1212 17:30:37.953945 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7d565b7c7c-954tf_calico-apiserver(419c68ca-a927-4109-af9a-f076c2eb6b23): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:37.954135 kubelet[3235]: E1212 17:30:37.953978 3235 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23" Dec 12 17:30:37.954640 containerd[1652]: time="2025-12-12T17:30:37.954483428Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:30:38.281651 containerd[1652]: time="2025-12-12T17:30:38.281565358Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:38.283585 containerd[1652]: time="2025-12-12T17:30:38.283478643Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:30:38.283666 containerd[1652]: time="2025-12-12T17:30:38.283525963Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:30:38.283806 kubelet[3235]: E1212 17:30:38.283763 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:30:38.283882 kubelet[3235]: E1212 17:30:38.283826 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:30:38.283910 kubelet[3235]: E1212 17:30:38.283891 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-ksbxj_calico-system(5bf56c15-e8b7-4324-9dd0-89444eba43fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:38.284952 containerd[1652]: time="2025-12-12T17:30:38.284927847Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:30:38.624659 containerd[1652]: time="2025-12-12T17:30:38.624533089Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:30:38.626436 containerd[1652]: time="2025-12-12T17:30:38.626355174Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:30:38.626588 containerd[1652]: time="2025-12-12T17:30:38.626395734Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:30:38.626670 kubelet[3235]: E1212 17:30:38.626616 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:30:38.626670 kubelet[3235]: E1212 17:30:38.626667 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:30:38.626760 kubelet[3235]: E1212 17:30:38.626741 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-ksbxj_calico-system(5bf56c15-e8b7-4324-9dd0-89444eba43fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:30:38.626820 kubelet[3235]: E1212 17:30:38.626788 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:30:39.623570 kubelet[3235]: E1212 
17:30:39.623524 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb" Dec 12 17:30:46.620419 kubelet[3235]: E1212 17:30:46.620165 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:30:46.620948 kubelet[3235]: E1212 17:30:46.620590 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:30:49.621706 kubelet[3235]: E1212 17:30:49.621584 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201" Dec 12 17:30:51.624248 kubelet[3235]: E1212 17:30:51.624184 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a" Dec 12 17:30:52.621901 kubelet[3235]: E1212 17:30:52.621792 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:30:52.622710 kubelet[3235]: E1212 17:30:52.622626 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23" Dec 12 17:30:52.623015 kubelet[3235]: E1212 17:30:52.622657 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code 
= NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb" Dec 12 17:30:57.624145 kubelet[3235]: E1212 17:30:57.624099 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:31:01.622232 kubelet[3235]: E1212 17:31:01.621726 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:31:03.621175 kubelet[3235]: E1212 17:31:03.621072 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201" Dec 12 17:31:05.621485 kubelet[3235]: E1212 17:31:05.621423 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a" Dec 12 17:31:06.620938 containerd[1652]: time="2025-12-12T17:31:06.620853453Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:31:06.621353 kubelet[3235]: E1212 17:31:06.621075 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:31:06.959441 containerd[1652]: time="2025-12-12T17:31:06.959321092Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:31:06.961255 containerd[1652]: time="2025-12-12T17:31:06.961183777Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:31:06.961255 containerd[1652]: time="2025-12-12T17:31:06.961266657Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:31:06.961514 kubelet[3235]: E1212 17:31:06.961477 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:31:06.961774 kubelet[3235]: E1212 17:31:06.961524 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:31:06.961774 kubelet[3235]: E1212 17:31:06.961597 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-6646878486-nf54l_calico-system(1aa8e517-9efe-4339-b128-b444ff23b3fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:31:06.963328 containerd[1652]: time="2025-12-12T17:31:06.963300983Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:31:07.302038 containerd[1652]: time="2025-12-12T17:31:07.301955542Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:31:07.303962 containerd[1652]: time="2025-12-12T17:31:07.303918348Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:31:07.304061 containerd[1652]: time="2025-12-12T17:31:07.303960708Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:31:07.304174 kubelet[3235]: E1212 17:31:07.304134 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:31:07.304225 kubelet[3235]: E1212 17:31:07.304183 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 
17:31:07.304447 kubelet[3235]: E1212 17:31:07.304255 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-6646878486-nf54l_calico-system(1aa8e517-9efe-4339-b128-b444ff23b3fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:31:07.304447 kubelet[3235]: E1212 17:31:07.304295 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb" Dec 12 17:31:07.621762 kubelet[3235]: E1212 17:31:07.621626 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" 
podUID="419c68ca-a927-4109-af9a-f076c2eb6b23" Dec 12 17:31:12.620694 kubelet[3235]: E1212 17:31:12.620598 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:31:14.621382 containerd[1652]: time="2025-12-12T17:31:14.620955674Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:31:14.955074 containerd[1652]: time="2025-12-12T17:31:14.954956302Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:31:14.958512 containerd[1652]: time="2025-12-12T17:31:14.958434711Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:31:14.958616 containerd[1652]: time="2025-12-12T17:31:14.958478471Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:31:14.958707 kubelet[3235]: E1212 17:31:14.958662 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:31:14.958957 kubelet[3235]: E1212 17:31:14.958711 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:31:14.958957 kubelet[3235]: E1212 17:31:14.958786 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-68c844c6f6-fvk5j_calico-system(f134713e-1f0f-4a16-9202-6adaa12db341): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:31:14.958957 kubelet[3235]: E1212 17:31:14.958830 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:31:16.620053 containerd[1652]: time="2025-12-12T17:31:16.620014907Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:31:16.964915 containerd[1652]: time="2025-12-12T17:31:16.964776083Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:31:16.966047 containerd[1652]: 
time="2025-12-12T17:31:16.966006886Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:31:16.966165 containerd[1652]: time="2025-12-12T17:31:16.966077286Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:31:16.966249 kubelet[3235]: E1212 17:31:16.966214 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:31:16.966907 kubelet[3235]: E1212 17:31:16.966258 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:31:16.966907 kubelet[3235]: E1212 17:31:16.966352 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7bccd8547-rp9k9_calico-apiserver(f10953b5-1654-43b7-bb1b-acb41105d201): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:31:16.966907 kubelet[3235]: E1212 17:31:16.966409 3235 pod_workers.go:1324] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201" Dec 12 17:31:17.620254 containerd[1652]: time="2025-12-12T17:31:17.620203865Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:31:17.973216 containerd[1652]: time="2025-12-12T17:31:17.973092982Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:31:17.974341 containerd[1652]: time="2025-12-12T17:31:17.974284665Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:31:17.974341 containerd[1652]: time="2025-12-12T17:31:17.974318985Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:31:17.974594 kubelet[3235]: E1212 17:31:17.974557 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:31:17.974876 kubelet[3235]: E1212 17:31:17.974604 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:31:17.974876 kubelet[3235]: E1212 17:31:17.974683 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-fvf46_calico-system(afc05240-ed5c-4c99-8e0c-0cca61ebd35a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:31:17.974876 kubelet[3235]: E1212 17:31:17.974712 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a" Dec 12 17:31:19.621163 containerd[1652]: time="2025-12-12T17:31:19.621081783Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:31:19.622111 kubelet[3235]: E1212 17:31:19.622041 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb" Dec 12 17:31:19.957109 containerd[1652]: time="2025-12-12T17:31:19.956895735Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:31:19.958125 containerd[1652]: time="2025-12-12T17:31:19.958087378Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:31:19.958200 containerd[1652]: time="2025-12-12T17:31:19.958098218Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:31:19.958376 kubelet[3235]: E1212 17:31:19.958337 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:31:19.958427 kubelet[3235]: E1212 17:31:19.958408 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 
17:31:19.958515 kubelet[3235]: E1212 17:31:19.958496 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7d565b7c7c-954tf_calico-apiserver(419c68ca-a927-4109-af9a-f076c2eb6b23): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:31:19.958556 kubelet[3235]: E1212 17:31:19.958530 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23" Dec 12 17:31:21.621351 containerd[1652]: time="2025-12-12T17:31:21.621214858Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:31:21.944376 containerd[1652]: time="2025-12-12T17:31:21.944241338Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:31:21.946290 containerd[1652]: time="2025-12-12T17:31:21.946248743Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:31:21.946355 containerd[1652]: time="2025-12-12T17:31:21.946332863Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:31:21.946536 kubelet[3235]: E1212 17:31:21.946499 3235 
log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:31:21.946930 kubelet[3235]: E1212 17:31:21.946548 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:31:21.946930 kubelet[3235]: E1212 17:31:21.946617 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-ksbxj_calico-system(5bf56c15-e8b7-4324-9dd0-89444eba43fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:31:21.948558 containerd[1652]: time="2025-12-12T17:31:21.948525789Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:31:22.282431 containerd[1652]: time="2025-12-12T17:31:22.282335176Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:31:22.283525 containerd[1652]: time="2025-12-12T17:31:22.283475419Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:31:22.283587 containerd[1652]: 
time="2025-12-12T17:31:22.283518659Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:31:22.283740 kubelet[3235]: E1212 17:31:22.283703 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:31:22.283788 kubelet[3235]: E1212 17:31:22.283750 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:31:22.283849 kubelet[3235]: E1212 17:31:22.283828 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-ksbxj_calico-system(5bf56c15-e8b7-4324-9dd0-89444eba43fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:31:22.283908 kubelet[3235]: E1212 17:31:22.283871 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:31:22.558474 systemd[1]: Started sshd@7-10.0.10.18:22-147.75.109.163:48340.service - OpenSSH per-connection server daemon (147.75.109.163:48340). Dec 12 17:31:23.537618 sshd[5715]: Accepted publickey for core from 147.75.109.163 port 48340 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:31:23.538968 sshd-session[5715]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:31:23.543427 systemd-logind[1632]: New session 8 of user core. Dec 12 17:31:23.547550 systemd[1]: Started session-8.scope - Session 8 of User core. Dec 12 17:31:24.308486 sshd[5718]: Connection closed by 147.75.109.163 port 48340 Dec 12 17:31:24.308857 sshd-session[5715]: pam_unix(sshd:session): session closed for user core Dec 12 17:31:24.312956 systemd[1]: sshd@7-10.0.10.18:22-147.75.109.163:48340.service: Deactivated successfully. Dec 12 17:31:24.315021 systemd[1]: session-8.scope: Deactivated successfully. Dec 12 17:31:24.316826 systemd-logind[1632]: Session 8 logged out. Waiting for processes to exit. Dec 12 17:31:24.318719 systemd-logind[1632]: Removed session 8. 
Dec 12 17:31:24.621063 containerd[1652]: time="2025-12-12T17:31:24.620491849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:31:24.953080 containerd[1652]: time="2025-12-12T17:31:24.952722872Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:31:24.954446 containerd[1652]: time="2025-12-12T17:31:24.954408597Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:31:24.954524 containerd[1652]: time="2025-12-12T17:31:24.954491037Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:31:24.954730 kubelet[3235]: E1212 17:31:24.954693 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:31:24.955011 kubelet[3235]: E1212 17:31:24.954742 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:31:24.955011 kubelet[3235]: E1212 17:31:24.954810 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7bccd8547-ssn5z_calico-apiserver(7a593db9-1db8-4942-815e-ae24e8a457d5): ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:31:24.955011 kubelet[3235]: E1212 17:31:24.954840 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5"
Dec 12 17:31:29.486456 systemd[1]: Started sshd@8-10.0.10.18:22-147.75.109.163:48344.service - OpenSSH per-connection server daemon (147.75.109.163:48344).
Dec 12 17:31:29.620300 kubelet[3235]: E1212 17:31:29.620239 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341"
Dec 12 17:31:30.444596 sshd[5747]: Accepted publickey for core from 147.75.109.163 port 48344 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:31:30.445886 sshd-session[5747]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:31:30.452047 systemd-logind[1632]: New session 9 of user core.
Dec 12 17:31:30.458567 systemd[1]: Started session-9.scope - Session 9 of User core.
Dec 12 17:31:30.620147 kubelet[3235]: E1212 17:31:30.620100 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201"
Dec 12 17:31:31.172485 sshd[5750]: Connection closed by 147.75.109.163 port 48344
Dec 12 17:31:31.172846 sshd-session[5747]: pam_unix(sshd:session): session closed for user core
Dec 12 17:31:31.175651 systemd[1]: sshd@8-10.0.10.18:22-147.75.109.163:48344.service: Deactivated successfully.
Dec 12 17:31:31.177588 systemd[1]: session-9.scope: Deactivated successfully.
Dec 12 17:31:31.179447 systemd-logind[1632]: Session 9 logged out. Waiting for processes to exit.
Dec 12 17:31:31.180280 systemd-logind[1632]: Removed session 9.
Dec 12 17:31:31.621301 kubelet[3235]: E1212 17:31:31.620996 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb"
Dec 12 17:31:32.620868 kubelet[3235]: E1212 17:31:32.620824 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a"
Dec 12 17:31:34.621323 kubelet[3235]: E1212 17:31:34.621272 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23"
Dec 12 17:31:36.372003 systemd[1]: Started sshd@9-10.0.10.18:22-147.75.109.163:34808.service - OpenSSH per-connection server daemon (147.75.109.163:34808).
Dec 12 17:31:36.621307 kubelet[3235]: E1212 17:31:36.621249 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb"
Dec 12 17:31:37.440999 sshd[5764]: Accepted publickey for core from 147.75.109.163 port 34808 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:31:37.442383 sshd-session[5764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:31:37.447742 systemd-logind[1632]: New session 10 of user core.
Dec 12 17:31:37.453540 systemd[1]: Started session-10.scope - Session 10 of User core.
Dec 12 17:31:38.228958 sshd[5767]: Connection closed by 147.75.109.163 port 34808
Dec 12 17:31:38.229400 sshd-session[5764]: pam_unix(sshd:session): session closed for user core
Dec 12 17:31:38.233629 systemd[1]: sshd@9-10.0.10.18:22-147.75.109.163:34808.service: Deactivated successfully.
Dec 12 17:31:38.235578 systemd[1]: session-10.scope: Deactivated successfully.
Dec 12 17:31:38.236505 systemd-logind[1632]: Session 10 logged out. Waiting for processes to exit.
Dec 12 17:31:38.237899 systemd-logind[1632]: Removed session 10.
Dec 12 17:31:38.621019 kubelet[3235]: E1212 17:31:38.620904 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5"
Dec 12 17:31:40.620441 kubelet[3235]: E1212 17:31:40.620381 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341"
Dec 12 17:31:43.383143 systemd[1]: Started sshd@10-10.0.10.18:22-147.75.109.163:39072.service - OpenSSH per-connection server daemon (147.75.109.163:39072).
Dec 12 17:31:44.360396 sshd[5783]: Accepted publickey for core from 147.75.109.163 port 39072 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:31:44.361589 sshd-session[5783]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:31:44.365794 systemd-logind[1632]: New session 11 of user core.
Dec 12 17:31:44.371496 systemd[1]: Started session-11.scope - Session 11 of User core.
Dec 12 17:31:44.620407 kubelet[3235]: E1212 17:31:44.620075 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201"
Dec 12 17:31:44.621132 kubelet[3235]: E1212 17:31:44.620850 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb"
Dec 12 17:31:45.087631 sshd[5786]: Connection closed by 147.75.109.163 port 39072
Dec 12 17:31:45.088161 sshd-session[5783]: pam_unix(sshd:session): session closed for user core
Dec 12 17:31:45.091619 systemd[1]: sshd@10-10.0.10.18:22-147.75.109.163:39072.service: Deactivated successfully.
Dec 12 17:31:45.094883 systemd[1]: session-11.scope: Deactivated successfully.
Dec 12 17:31:45.098700 systemd-logind[1632]: Session 11 logged out. Waiting for processes to exit.
Dec 12 17:31:45.100211 systemd-logind[1632]: Removed session 11.
Dec 12 17:31:45.270262 systemd[1]: Started sshd@11-10.0.10.18:22-147.75.109.163:39074.service - OpenSSH per-connection server daemon (147.75.109.163:39074).
Dec 12 17:31:45.620595 kubelet[3235]: E1212 17:31:45.620543 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a"
Dec 12 17:31:46.290829 sshd[5800]: Accepted publickey for core from 147.75.109.163 port 39074 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:31:46.292428 sshd-session[5800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:31:46.298562 systemd-logind[1632]: New session 12 of user core.
Dec 12 17:31:46.306533 systemd[1]: Started session-12.scope - Session 12 of User core.
Dec 12 17:31:47.103544 sshd[5803]: Connection closed by 147.75.109.163 port 39074
Dec 12 17:31:47.103928 sshd-session[5800]: pam_unix(sshd:session): session closed for user core
Dec 12 17:31:47.106797 systemd[1]: sshd@11-10.0.10.18:22-147.75.109.163:39074.service: Deactivated successfully.
Dec 12 17:31:47.108749 systemd[1]: session-12.scope: Deactivated successfully.
Dec 12 17:31:47.110632 systemd-logind[1632]: Session 12 logged out. Waiting for processes to exit.
Dec 12 17:31:47.111582 systemd-logind[1632]: Removed session 12.
Dec 12 17:31:47.271625 systemd[1]: Started sshd@12-10.0.10.18:22-147.75.109.163:39080.service - OpenSSH per-connection server daemon (147.75.109.163:39080).
Dec 12 17:31:48.247298 sshd[5814]: Accepted publickey for core from 147.75.109.163 port 39080 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:31:48.248853 sshd-session[5814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:31:48.252927 systemd-logind[1632]: New session 13 of user core.
Dec 12 17:31:48.267858 systemd[1]: Started session-13.scope - Session 13 of User core.
Dec 12 17:31:48.977327 sshd[5842]: Connection closed by 147.75.109.163 port 39080
Dec 12 17:31:48.977878 sshd-session[5814]: pam_unix(sshd:session): session closed for user core
Dec 12 17:31:48.981259 systemd[1]: sshd@12-10.0.10.18:22-147.75.109.163:39080.service: Deactivated successfully.
Dec 12 17:31:48.982942 systemd[1]: session-13.scope: Deactivated successfully.
Dec 12 17:31:48.983991 systemd-logind[1632]: Session 13 logged out. Waiting for processes to exit.
Dec 12 17:31:48.985134 systemd-logind[1632]: Removed session 13.
Dec 12 17:31:49.621191 kubelet[3235]: E1212 17:31:49.621145 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23"
Dec 12 17:31:49.623612 kubelet[3235]: E1212 17:31:49.623560 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb"
Dec 12 17:31:50.622308 kubelet[3235]: E1212 17:31:50.622257 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5"
Dec 12 17:31:53.621396 kubelet[3235]: E1212 17:31:53.621229 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341"
Dec 12 17:31:54.148865 systemd[1]: Started sshd@13-10.0.10.18:22-147.75.109.163:46272.service - OpenSSH per-connection server daemon (147.75.109.163:46272).
Dec 12 17:31:54.806350 systemd[1]: Started sshd@14-10.0.10.18:22-192.227.134.84:60928.service - OpenSSH per-connection server daemon (192.227.134.84:60928).
Dec 12 17:31:55.116860 sshd[5860]: Accepted publickey for core from 147.75.109.163 port 46272 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:31:55.118096 sshd-session[5860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:31:55.122220 systemd-logind[1632]: New session 14 of user core.
Dec 12 17:31:55.127505 systemd[1]: Started session-14.scope - Session 14 of User core.
Dec 12 17:31:55.856735 sshd[5867]: Connection closed by 147.75.109.163 port 46272
Dec 12 17:31:55.857244 sshd-session[5860]: pam_unix(sshd:session): session closed for user core
Dec 12 17:31:55.861380 systemd[1]: sshd@13-10.0.10.18:22-147.75.109.163:46272.service: Deactivated successfully.
Dec 12 17:31:55.864000 systemd[1]: session-14.scope: Deactivated successfully.
Dec 12 17:31:55.865042 systemd-logind[1632]: Session 14 logged out. Waiting for processes to exit.
Dec 12 17:31:55.866307 systemd-logind[1632]: Removed session 14.
Dec 12 17:31:56.113936 sshd[5864]: kex_exchange_identification: read: Connection reset by peer
Dec 12 17:31:56.113936 sshd[5864]: Connection reset by 192.227.134.84 port 60928
Dec 12 17:31:56.115287 systemd[1]: sshd@14-10.0.10.18:22-192.227.134.84:60928.service: Deactivated successfully.
Dec 12 17:31:57.619987 kubelet[3235]: E1212 17:31:57.619875 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201"
Dec 12 17:31:58.620667 kubelet[3235]: E1212 17:31:58.620615 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a"
Dec 12 17:31:59.621761 kubelet[3235]: E1212 17:31:59.621705 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb"
Dec 12 17:32:01.020967 systemd[1]: Started sshd@15-10.0.10.18:22-147.75.109.163:46280.service - OpenSSH per-connection server daemon (147.75.109.163:46280).
Dec 12 17:32:01.620445 kubelet[3235]: E1212 17:32:01.620338 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5"
Dec 12 17:32:01.996618 sshd[5886]: Accepted publickey for core from 147.75.109.163 port 46280 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:32:01.997893 sshd-session[5886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:32:02.003184 systemd-logind[1632]: New session 15 of user core.
Dec 12 17:32:02.011542 systemd[1]: Started session-15.scope - Session 15 of User core.
Dec 12 17:32:02.730393 sshd[5889]: Connection closed by 147.75.109.163 port 46280
Dec 12 17:32:02.730597 sshd-session[5886]: pam_unix(sshd:session): session closed for user core
Dec 12 17:32:02.734404 systemd-logind[1632]: Session 15 logged out. Waiting for processes to exit.
Dec 12 17:32:02.734687 systemd[1]: sshd@15-10.0.10.18:22-147.75.109.163:46280.service: Deactivated successfully.
Dec 12 17:32:02.736557 systemd[1]: session-15.scope: Deactivated successfully.
Dec 12 17:32:02.738969 systemd-logind[1632]: Removed session 15.
Dec 12 17:32:03.620705 kubelet[3235]: E1212 17:32:03.620662 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23"
Dec 12 17:32:04.620739 kubelet[3235]: E1212 17:32:04.620665 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb"
Dec 12 17:32:05.622252 kubelet[3235]: E1212 17:32:05.622192 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341"
Dec 12 17:32:07.895841 systemd[1]: Started sshd@16-10.0.10.18:22-147.75.109.163:35774.service - OpenSSH per-connection server daemon (147.75.109.163:35774).
Dec 12 17:32:08.867988 sshd[5905]: Accepted publickey for core from 147.75.109.163 port 35774 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:32:08.869103 sshd-session[5905]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:32:08.873243 systemd-logind[1632]: New session 16 of user core.
Dec 12 17:32:08.881674 systemd[1]: Started session-16.scope - Session 16 of User core.
Dec 12 17:32:09.603078 sshd[5908]: Connection closed by 147.75.109.163 port 35774
Dec 12 17:32:09.604750 sshd-session[5905]: pam_unix(sshd:session): session closed for user core
Dec 12 17:32:09.608172 systemd[1]: sshd@16-10.0.10.18:22-147.75.109.163:35774.service: Deactivated successfully.
Dec 12 17:32:09.609949 systemd[1]: session-16.scope: Deactivated successfully.
Dec 12 17:32:09.610973 systemd-logind[1632]: Session 16 logged out. Waiting for processes to exit.
Dec 12 17:32:09.612066 systemd-logind[1632]: Removed session 16.
Dec 12 17:32:09.768809 systemd[1]: Started sshd@17-10.0.10.18:22-147.75.109.163:35780.service - OpenSSH per-connection server daemon (147.75.109.163:35780).
Dec 12 17:32:10.749673 sshd[5921]: Accepted publickey for core from 147.75.109.163 port 35780 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:32:10.750949 sshd-session[5921]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:32:10.754706 systemd-logind[1632]: New session 17 of user core.
Dec 12 17:32:10.762533 systemd[1]: Started session-17.scope - Session 17 of User core.
Dec 12 17:32:11.575755 sshd[5924]: Connection closed by 147.75.109.163 port 35780
Dec 12 17:32:11.576132 sshd-session[5921]: pam_unix(sshd:session): session closed for user core
Dec 12 17:32:11.579640 systemd[1]: sshd@17-10.0.10.18:22-147.75.109.163:35780.service: Deactivated successfully.
Dec 12 17:32:11.581584 systemd[1]: session-17.scope: Deactivated successfully.
Dec 12 17:32:11.582456 systemd-logind[1632]: Session 17 logged out. Waiting for processes to exit.
Dec 12 17:32:11.583908 systemd-logind[1632]: Removed session 17.
Dec 12 17:32:11.745503 systemd[1]: Started sshd@18-10.0.10.18:22-147.75.109.163:35788.service - OpenSSH per-connection server daemon (147.75.109.163:35788).
Dec 12 17:32:12.620564 kubelet[3235]: E1212 17:32:12.620523 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201"
Dec 12 17:32:12.746871 sshd[5938]: Accepted publickey for core from 147.75.109.163 port 35788 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:32:12.748242 sshd-session[5938]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:32:12.752020 systemd-logind[1632]: New session 18 of user core.
Dec 12 17:32:12.763544 systemd[1]: Started session-18.scope - Session 18 of User core.
Dec 12 17:32:13.309317 systemd[1]: Started sshd@19-10.0.10.18:22-193.46.255.159:36588.service - OpenSSH per-connection server daemon (193.46.255.159:36588).
Dec 12 17:32:13.556540 sshd-session[5955]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159 user=root
Dec 12 17:32:13.622442 kubelet[3235]: E1212 17:32:13.622070 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a"
Dec 12 17:32:13.625581 kubelet[3235]: E1212 17:32:13.625418 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb"
Dec 12 17:32:13.876157 sshd[5941]: Connection closed by 147.75.109.163 port 35788
Dec 12 17:32:13.876507 sshd-session[5938]: pam_unix(sshd:session): session closed for user core
Dec 12 17:32:13.879589 systemd[1]: sshd@18-10.0.10.18:22-147.75.109.163:35788.service: Deactivated successfully.
Dec 12 17:32:13.882037 systemd[1]: session-18.scope: Deactivated successfully.
Dec 12 17:32:13.883682 systemd-logind[1632]: Session 18 logged out. Waiting for processes to exit.
Dec 12 17:32:13.885561 systemd-logind[1632]: Removed session 18.
Dec 12 17:32:14.041552 systemd[1]: Started sshd@20-10.0.10.18:22-147.75.109.163:39140.service - OpenSSH per-connection server daemon (147.75.109.163:39140).
Dec 12 17:32:15.017691 sshd[5962]: Accepted publickey for core from 147.75.109.163 port 39140 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:32:15.020437 sshd-session[5962]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:32:15.024423 systemd-logind[1632]: New session 19 of user core.
Dec 12 17:32:15.036713 systemd[1]: Started session-19.scope - Session 19 of User core.
Dec 12 17:32:15.623515 kubelet[3235]: E1212 17:32:15.623463 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23"
Dec 12 17:32:15.879793 sshd[5967]: Connection closed by 147.75.109.163 port 39140
Dec 12 17:32:15.880595 sshd-session[5962]: pam_unix(sshd:session): session closed for user core
Dec 12 17:32:15.885064 systemd[1]: sshd@20-10.0.10.18:22-147.75.109.163:39140.service: Deactivated successfully.
Dec 12 17:32:15.887113 systemd[1]: session-19.scope: Deactivated successfully.
Dec 12 17:32:15.888332 systemd-logind[1632]: Session 19 logged out. Waiting for processes to exit.
Dec 12 17:32:15.890934 systemd-logind[1632]: Removed session 19.
Dec 12 17:32:16.006956 sshd[5949]: PAM: Permission denied for root from 193.46.255.159
Dec 12 17:32:16.048629 systemd[1]: Started sshd@21-10.0.10.18:22-147.75.109.163:39146.service - OpenSSH per-connection server daemon (147.75.109.163:39146).
Dec 12 17:32:16.058620 sshd-session[5978]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159 user=root
Dec 12 17:32:16.599982 update_engine[1634]: I20251212 17:32:16.599920 1634 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Dec 12 17:32:16.599982 update_engine[1634]: I20251212 17:32:16.599969 1634 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Dec 12 17:32:16.600399 update_engine[1634]: I20251212 17:32:16.600193 1634 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Dec 12 17:32:16.600626 update_engine[1634]: I20251212 17:32:16.600584 1634 omaha_request_params.cc:62] Current group set to stable
Dec 12 17:32:16.600692 update_engine[1634]: I20251212 17:32:16.600672 1634 update_attempter.cc:499] Already updated boot flags. Skipping.
Dec 12 17:32:16.600692 update_engine[1634]: I20251212 17:32:16.600683 1634 update_attempter.cc:643] Scheduling an action processor start.
Dec 12 17:32:16.600750 update_engine[1634]: I20251212 17:32:16.600697 1634 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Dec 12 17:32:16.600750 update_engine[1634]: I20251212 17:32:16.600721 1634 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Dec 12 17:32:16.600786 update_engine[1634]: I20251212 17:32:16.600767 1634 omaha_request_action.cc:271] Posting an Omaha request to disabled
Dec 12 17:32:16.600786 update_engine[1634]: I20251212 17:32:16.600773 1634 omaha_request_action.cc:272] Request:
Dec 12 17:32:16.600786 update_engine[1634]:
Dec 12 17:32:16.600786 update_engine[1634]:
Dec 12 17:32:16.600786 update_engine[1634]:
Dec 12 17:32:16.600786 update_engine[1634]:
Dec 12 17:32:16.600786 update_engine[1634]:
Dec 12 17:32:16.600786 update_engine[1634]:
Dec 12 17:32:16.600786 update_engine[1634]:
Dec 12 17:32:16.600786 update_engine[1634]:
Dec 12 17:32:16.600786 update_engine[1634]: I20251212 17:32:16.600778 1634 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Dec 12 17:32:16.601578 locksmithd[1679]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Dec 12 17:32:16.602409 update_engine[1634]: I20251212 17:32:16.602356 1634 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Dec 12 17:32:16.603128 update_engine[1634]: I20251212 17:32:16.603088 1634 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Dec 12 17:32:16.609348 update_engine[1634]: E20251212 17:32:16.609313 1634 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Dec 12 17:32:16.609454 update_engine[1634]: I20251212 17:32:16.609400 1634 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Dec 12 17:32:16.620192 kubelet[3235]: E1212 17:32:16.620151 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5"
Dec 12 17:32:17.003789 sshd[5980]: Accepted publickey for core from 147.75.109.163 port 39146 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o
Dec 12 17:32:17.005151 sshd-session[5980]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:32:17.010085 systemd-logind[1632]: New session 20 of user core.
Dec 12 17:32:17.020536 systemd[1]: Started session-20.scope - Session 20 of User core.
Dec 12 17:32:17.585785 sshd[5949]: PAM: Permission denied for root from 193.46.255.159
Dec 12 17:32:17.638115 sshd-session[5992]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159 user=root
Dec 12 17:32:17.722566 sshd[5983]: Connection closed by 147.75.109.163 port 39146
Dec 12 17:32:17.722937 sshd-session[5980]: pam_unix(sshd:session): session closed for user core
Dec 12 17:32:17.726046 systemd[1]: sshd@21-10.0.10.18:22-147.75.109.163:39146.service: Deactivated successfully.
Dec 12 17:32:17.727913 systemd[1]: session-20.scope: Deactivated successfully.
Dec 12 17:32:17.729248 systemd-logind[1632]: Session 20 logged out. Waiting for processes to exit.
Dec 12 17:32:17.730693 systemd-logind[1632]: Removed session 20.
Dec 12 17:32:18.620731 kubelet[3235]: E1212 17:32:18.620678 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb"
Dec 12 17:32:19.301558 sshd[5949]: PAM: Permission denied for root from 193.46.255.159
Dec 12 17:32:19.327656 sshd[5949]: Received disconnect from 193.46.255.159 port 36588:11:  [preauth]
Dec 12 17:32:19.328277 sshd[5949]: Disconnected from authenticating user root 193.46.255.159 port 36588 [preauth]
Dec 12 17:32:19.330243 systemd[1]: sshd@19-10.0.10.18:22-193.46.255.159:36588.service: Deactivated successfully.
Dec 12 17:32:19.371729 systemd[1]: Started sshd@22-10.0.10.18:22-193.46.255.159:53116.service - OpenSSH per-connection server daemon (193.46.255.159:53116).
Dec 12 17:32:19.620652 kubelet[3235]: E1212 17:32:19.620548 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:32:19.639867 sshd-session[6030]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159 user=root Dec 12 17:32:21.579558 sshd[6027]: PAM: Permission denied for root from 193.46.255.159 Dec 12 17:32:21.636983 sshd-session[6031]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159 user=root Dec 12 17:32:22.892554 systemd[1]: Started sshd@23-10.0.10.18:22-147.75.109.163:59038.service - OpenSSH per-connection server daemon (147.75.109.163:59038). Dec 12 17:32:23.858383 sshd[6033]: Accepted publickey for core from 147.75.109.163 port 59038 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:32:23.859784 sshd-session[6033]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:32:23.863932 systemd-logind[1632]: New session 21 of user core. Dec 12 17:32:23.874560 systemd[1]: Started session-21.scope - Session 21 of User core. 
Dec 12 17:32:24.518492 sshd[6027]: PAM: Permission denied for root from 193.46.255.159 Dec 12 17:32:24.576879 sshd-session[6045]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159 user=root Dec 12 17:32:24.585531 sshd[6036]: Connection closed by 147.75.109.163 port 59038 Dec 12 17:32:24.586025 sshd-session[6033]: pam_unix(sshd:session): session closed for user core Dec 12 17:32:24.589641 systemd[1]: sshd@23-10.0.10.18:22-147.75.109.163:59038.service: Deactivated successfully. Dec 12 17:32:24.594180 systemd[1]: session-21.scope: Deactivated successfully. Dec 12 17:32:24.595550 systemd-logind[1632]: Session 21 logged out. Waiting for processes to exit. Dec 12 17:32:24.596861 systemd-logind[1632]: Removed session 21. Dec 12 17:32:26.535993 sshd[6027]: PAM: Permission denied for root from 193.46.255.159 Dec 12 17:32:26.563295 sshd[6027]: Received disconnect from 193.46.255.159 port 53116:11: [preauth] Dec 12 17:32:26.563295 sshd[6027]: Disconnected from authenticating user root 193.46.255.159 port 53116 [preauth] Dec 12 17:32:26.565870 systemd[1]: sshd@22-10.0.10.18:22-193.46.255.159:53116.service: Deactivated successfully. Dec 12 17:32:26.600895 systemd[1]: Started sshd@24-10.0.10.18:22-193.46.255.159:53128.service - OpenSSH per-connection server daemon (193.46.255.159:53128). Dec 12 17:32:26.601947 update_engine[1634]: I20251212 17:32:26.601392 1634 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 17:32:26.601947 update_engine[1634]: I20251212 17:32:26.601470 1634 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 17:32:26.602185 update_engine[1634]: I20251212 17:32:26.602097 1634 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 12 17:32:26.609773 update_engine[1634]: E20251212 17:32:26.609733 1634 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 17:32:26.609866 update_engine[1634]: I20251212 17:32:26.609807 1634 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Dec 12 17:32:26.620561 kubelet[3235]: E1212 17:32:26.620521 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201" Dec 12 17:32:26.842079 sshd-session[6055]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159 user=root Dec 12 17:32:27.620469 kubelet[3235]: E1212 17:32:27.620430 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:32:27.620926 containerd[1652]: time="2025-12-12T17:32:27.620513620Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:32:27.954097 containerd[1652]: time="2025-12-12T17:32:27.953893006Z" level=info msg="fetch failed after 
status: 404 Not Found" host=ghcr.io Dec 12 17:32:27.955732 containerd[1652]: time="2025-12-12T17:32:27.955644330Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:32:27.955732 containerd[1652]: time="2025-12-12T17:32:27.955704290Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:32:27.955934 kubelet[3235]: E1212 17:32:27.955893 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:32:27.956192 kubelet[3235]: E1212 17:32:27.955943 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:32:27.956192 kubelet[3235]: E1212 17:32:27.956014 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-6646878486-nf54l_calico-system(1aa8e517-9efe-4339-b128-b444ff23b3fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:32:27.956929 containerd[1652]: 
time="2025-12-12T17:32:27.956903613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:32:28.298196 containerd[1652]: time="2025-12-12T17:32:28.298142300Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:32:28.299758 containerd[1652]: time="2025-12-12T17:32:28.299653064Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:32:28.299758 containerd[1652]: time="2025-12-12T17:32:28.299735664Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:32:28.299923 kubelet[3235]: E1212 17:32:28.299885 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:32:28.299984 kubelet[3235]: E1212 17:32:28.299948 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:32:28.300042 kubelet[3235]: E1212 17:32:28.300024 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-6646878486-nf54l_calico-system(1aa8e517-9efe-4339-b128-b444ff23b3fb): 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:32:28.300093 kubelet[3235]: E1212 17:32:28.300067 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb" Dec 12 17:32:28.620979 kubelet[3235]: E1212 17:32:28.620871 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23" Dec 12 17:32:28.620979 kubelet[3235]: E1212 17:32:28.620895 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a" Dec 12 17:32:29.408589 sshd[6052]: PAM: Permission denied for root from 193.46.255.159 Dec 12 17:32:29.456355 sshd-session[6056]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159 user=root Dec 12 17:32:29.627394 kubelet[3235]: E1212 17:32:29.624816 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:32:29.768352 systemd[1]: Started sshd@25-10.0.10.18:22-147.75.109.163:59044.service - OpenSSH per-connection server daemon (147.75.109.163:59044). 
Dec 12 17:32:30.736843 sshd[6058]: Accepted publickey for core from 147.75.109.163 port 59044 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:32:30.738504 sshd-session[6058]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:32:30.742397 systemd-logind[1632]: New session 22 of user core. Dec 12 17:32:30.749616 systemd[1]: Started session-22.scope - Session 22 of User core. Dec 12 17:32:31.099645 sshd[6052]: PAM: Permission denied for root from 193.46.255.159 Dec 12 17:32:31.148394 sshd-session[6068]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159 user=root Dec 12 17:32:31.490763 sshd[6067]: Connection closed by 147.75.109.163 port 59044 Dec 12 17:32:31.489708 sshd-session[6058]: pam_unix(sshd:session): session closed for user core Dec 12 17:32:31.493135 systemd-logind[1632]: Session 22 logged out. Waiting for processes to exit. Dec 12 17:32:31.493421 systemd[1]: sshd@25-10.0.10.18:22-147.75.109.163:59044.service: Deactivated successfully. Dec 12 17:32:31.495718 systemd[1]: session-22.scope: Deactivated successfully. Dec 12 17:32:31.497145 systemd-logind[1632]: Removed session 22. Dec 12 17:32:32.735509 sshd[6052]: PAM: Permission denied for root from 193.46.255.159 Dec 12 17:32:32.759219 sshd[6052]: Received disconnect from 193.46.255.159 port 53128:11: [preauth] Dec 12 17:32:32.759219 sshd[6052]: Disconnected from authenticating user root 193.46.255.159 port 53128 [preauth] Dec 12 17:32:32.761904 systemd[1]: sshd@24-10.0.10.18:22-193.46.255.159:53128.service: Deactivated successfully. 
Dec 12 17:32:34.621424 kubelet[3235]: E1212 17:32:34.621301 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:32:36.604089 update_engine[1634]: I20251212 17:32:36.604000 1634 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 17:32:36.604458 update_engine[1634]: I20251212 17:32:36.604137 1634 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 17:32:36.604501 update_engine[1634]: I20251212 17:32:36.604475 1634 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 12 17:32:36.610542 update_engine[1634]: E20251212 17:32:36.610491 1634 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 17:32:36.610618 update_engine[1634]: I20251212 17:32:36.610574 1634 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Dec 12 17:32:36.663472 systemd[1]: Started sshd@26-10.0.10.18:22-147.75.109.163:44114.service - OpenSSH per-connection server daemon (147.75.109.163:44114). Dec 12 17:32:37.655164 sshd[6085]: Accepted publickey for core from 147.75.109.163 port 44114 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:32:37.656875 sshd-session[6085]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:32:37.660746 systemd-logind[1632]: New session 23 of user core. Dec 12 17:32:37.669529 systemd[1]: Started session-23.scope - Session 23 of User core. 
Dec 12 17:32:38.396202 sshd[6088]: Connection closed by 147.75.109.163 port 44114 Dec 12 17:32:38.397273 sshd-session[6085]: pam_unix(sshd:session): session closed for user core Dec 12 17:32:38.401034 systemd[1]: sshd@26-10.0.10.18:22-147.75.109.163:44114.service: Deactivated successfully. Dec 12 17:32:38.403979 systemd[1]: session-23.scope: Deactivated successfully. Dec 12 17:32:38.404717 systemd-logind[1632]: Session 23 logged out. Waiting for processes to exit. Dec 12 17:32:38.405859 systemd-logind[1632]: Removed session 23. Dec 12 17:32:38.620426 kubelet[3235]: E1212 17:32:38.620349 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:32:40.620197 containerd[1652]: time="2025-12-12T17:32:40.620155908Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:32:40.973049 containerd[1652]: time="2025-12-12T17:32:40.972838624Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:32:40.974395 containerd[1652]: time="2025-12-12T17:32:40.974329988Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:32:40.974482 containerd[1652]: time="2025-12-12T17:32:40.974393708Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:32:40.975491 kubelet[3235]: E1212 17:32:40.975394 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:32:40.975491 kubelet[3235]: E1212 17:32:40.975449 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:32:40.975876 kubelet[3235]: E1212 17:32:40.975786 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7d565b7c7c-954tf_calico-apiserver(419c68ca-a927-4109-af9a-f076c2eb6b23): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:32:40.975876 kubelet[3235]: E1212 17:32:40.975846 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23" Dec 12 17:32:40.976105 
containerd[1652]: time="2025-12-12T17:32:40.975753952Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:32:41.370245 containerd[1652]: time="2025-12-12T17:32:41.370194056Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:32:41.372402 containerd[1652]: time="2025-12-12T17:32:41.372335982Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:32:41.372449 containerd[1652]: time="2025-12-12T17:32:41.372392222Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:32:41.372619 kubelet[3235]: E1212 17:32:41.372574 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:32:41.372619 kubelet[3235]: E1212 17:32:41.372620 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:32:41.373158 kubelet[3235]: E1212 17:32:41.372805 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7bccd8547-rp9k9_calico-apiserver(f10953b5-1654-43b7-bb1b-acb41105d201): ErrImagePull: rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:32:41.373158 kubelet[3235]: E1212 17:32:41.372895 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201" Dec 12 17:32:41.373263 containerd[1652]: time="2025-12-12T17:32:41.372931423Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:32:41.694153 containerd[1652]: time="2025-12-12T17:32:41.693658777Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:32:41.695083 containerd[1652]: time="2025-12-12T17:32:41.695040420Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:32:41.695308 containerd[1652]: time="2025-12-12T17:32:41.695110980Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:32:41.696456 kubelet[3235]: E1212 17:32:41.696416 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: 
not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:32:41.696525 kubelet[3235]: E1212 17:32:41.696468 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:32:41.696549 kubelet[3235]: E1212 17:32:41.696536 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-fvf46_calico-system(afc05240-ed5c-4c99-8e0c-0cca61ebd35a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:32:41.696581 kubelet[3235]: E1212 17:32:41.696564 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a" Dec 12 17:32:42.621034 kubelet[3235]: E1212 17:32:42.620824 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb" Dec 12 17:32:43.565887 systemd[1]: Started sshd@27-10.0.10.18:22-147.75.109.163:40918.service - OpenSSH per-connection server daemon (147.75.109.163:40918). Dec 12 17:32:44.537580 sshd[6105]: Accepted publickey for core from 147.75.109.163 port 40918 ssh2: RSA SHA256:34ENl0n5rhbzQOwlR2OROI4GqBuc4StAeVZUxGG9k6o Dec 12 17:32:44.538842 sshd-session[6105]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:32:44.544726 systemd-logind[1632]: New session 24 of user core. Dec 12 17:32:44.552577 systemd[1]: Started session-24.scope - Session 24 of User core. 
Dec 12 17:32:44.621467 containerd[1652]: time="2025-12-12T17:32:44.621430623Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:32:44.966783 containerd[1652]: time="2025-12-12T17:32:44.966580279Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:32:44.968137 containerd[1652]: time="2025-12-12T17:32:44.968072683Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:32:44.968207 containerd[1652]: time="2025-12-12T17:32:44.968130363Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:32:44.968287 kubelet[3235]: E1212 17:32:44.968252 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:32:44.968749 kubelet[3235]: E1212 17:32:44.968297 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:32:44.968749 kubelet[3235]: E1212 17:32:44.968361 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-ksbxj_calico-system(5bf56c15-e8b7-4324-9dd0-89444eba43fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve 
reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:32:44.969584 containerd[1652]: time="2025-12-12T17:32:44.969311086Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:32:45.268044 sshd[6108]: Connection closed by 147.75.109.163 port 40918 Dec 12 17:32:45.268339 sshd-session[6105]: pam_unix(sshd:session): session closed for user core Dec 12 17:32:45.272518 systemd[1]: sshd@27-10.0.10.18:22-147.75.109.163:40918.service: Deactivated successfully. Dec 12 17:32:45.275146 systemd[1]: session-24.scope: Deactivated successfully. Dec 12 17:32:45.276781 systemd-logind[1632]: Session 24 logged out. Waiting for processes to exit. Dec 12 17:32:45.278422 systemd-logind[1632]: Removed session 24. Dec 12 17:32:45.296095 containerd[1652]: time="2025-12-12T17:32:45.295790135Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:32:45.297265 containerd[1652]: time="2025-12-12T17:32:45.297222138Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:32:45.297518 containerd[1652]: time="2025-12-12T17:32:45.297298058Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:32:45.297648 kubelet[3235]: E1212 17:32:45.297593 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:32:45.297698 kubelet[3235]: E1212 17:32:45.297658 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:32:45.297748 kubelet[3235]: E1212 17:32:45.297730 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-ksbxj_calico-system(5bf56c15-e8b7-4324-9dd0-89444eba43fb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:32:45.297801 kubelet[3235]: E1212 17:32:45.297773 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" 
podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:32:45.620695 containerd[1652]: time="2025-12-12T17:32:45.620572338Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:32:45.966383 containerd[1652]: time="2025-12-12T17:32:45.966254556Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:32:45.967475 containerd[1652]: time="2025-12-12T17:32:45.967419839Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:32:45.967528 containerd[1652]: time="2025-12-12T17:32:45.967502679Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:32:45.967878 kubelet[3235]: E1212 17:32:45.967672 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:32:45.967878 kubelet[3235]: E1212 17:32:45.967720 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:32:45.967878 kubelet[3235]: E1212 17:32:45.967799 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container 
calico-kube-controllers start failed in pod calico-kube-controllers-68c844c6f6-fvk5j_calico-system(f134713e-1f0f-4a16-9202-6adaa12db341): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:32:45.967878 kubelet[3235]: E1212 17:32:45.967843 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:32:46.600480 update_engine[1634]: I20251212 17:32:46.600411 1634 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 17:32:46.600816 update_engine[1634]: I20251212 17:32:46.600495 1634 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 17:32:46.600843 update_engine[1634]: I20251212 17:32:46.600822 1634 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 12 17:32:46.608737 update_engine[1634]: E20251212 17:32:46.608688 1634 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 17:32:46.608892 update_engine[1634]: I20251212 17:32:46.608793 1634 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 12 17:32:46.608892 update_engine[1634]: I20251212 17:32:46.608819 1634 omaha_request_action.cc:617] Omaha request response: Dec 12 17:32:46.609028 update_engine[1634]: E20251212 17:32:46.608966 1634 omaha_request_action.cc:636] Omaha request network transfer failed. Dec 12 17:32:46.609028 update_engine[1634]: I20251212 17:32:46.609000 1634 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Dec 12 17:32:46.609028 update_engine[1634]: I20251212 17:32:46.609014 1634 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 12 17:32:46.609098 update_engine[1634]: I20251212 17:32:46.609027 1634 update_attempter.cc:306] Processing Done. Dec 12 17:32:46.609098 update_engine[1634]: E20251212 17:32:46.609050 1634 update_attempter.cc:619] Update failed. Dec 12 17:32:46.609098 update_engine[1634]: I20251212 17:32:46.609055 1634 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Dec 12 17:32:46.609098 update_engine[1634]: I20251212 17:32:46.609059 1634 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Dec 12 17:32:46.609098 update_engine[1634]: I20251212 17:32:46.609063 1634 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Dec 12 17:32:46.609202 update_engine[1634]: I20251212 17:32:46.609126 1634 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 12 17:32:46.609202 update_engine[1634]: I20251212 17:32:46.609146 1634 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 12 17:32:46.609202 update_engine[1634]: I20251212 17:32:46.609151 1634 omaha_request_action.cc:272] Request: Dec 12 17:32:46.609202 update_engine[1634]: Dec 12 17:32:46.609202 update_engine[1634]: Dec 12 17:32:46.609202 update_engine[1634]: Dec 12 17:32:46.609202 update_engine[1634]: Dec 12 17:32:46.609202 update_engine[1634]: Dec 12 17:32:46.609202 update_engine[1634]: Dec 12 17:32:46.609202 update_engine[1634]: I20251212 17:32:46.609156 1634 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 12 17:32:46.609202 update_engine[1634]: I20251212 17:32:46.609174 1634 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 12 17:32:46.609482 update_engine[1634]: I20251212 17:32:46.609453 1634 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 12 17:32:46.609781 locksmithd[1679]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Dec 12 17:32:46.615698 update_engine[1634]: E20251212 17:32:46.615665 1634 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Dec 12 17:32:46.615752 update_engine[1634]: I20251212 17:32:46.615729 1634 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Dec 12 17:32:46.615752 update_engine[1634]: I20251212 17:32:46.615736 1634 omaha_request_action.cc:617] Omaha request response: Dec 12 17:32:46.615752 update_engine[1634]: I20251212 17:32:46.615742 1634 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 12 17:32:46.615752 update_engine[1634]: I20251212 17:32:46.615746 1634 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Dec 12 17:32:46.615752 update_engine[1634]: I20251212 17:32:46.615751 1634 update_attempter.cc:306] Processing Done. Dec 12 17:32:46.615878 update_engine[1634]: I20251212 17:32:46.615756 1634 update_attempter.cc:310] Error event sent. 
Dec 12 17:32:46.615878 update_engine[1634]: I20251212 17:32:46.615762 1634 update_check_scheduler.cc:74] Next update check in 45m31s Dec 12 17:32:46.616227 locksmithd[1679]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Dec 12 17:32:51.621526 containerd[1652]: time="2025-12-12T17:32:51.621380606Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:32:51.966634 containerd[1652]: time="2025-12-12T17:32:51.966509663Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:32:51.968051 containerd[1652]: time="2025-12-12T17:32:51.968003227Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:32:51.968133 containerd[1652]: time="2025-12-12T17:32:51.968071827Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:32:51.970551 kubelet[3235]: E1212 17:32:51.970504 3235 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:32:51.970865 kubelet[3235]: E1212 17:32:51.970555 3235 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:32:51.970865 
kubelet[3235]: E1212 17:32:51.970632 3235 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-7bccd8547-ssn5z_calico-apiserver(7a593db9-1db8-4942-815e-ae24e8a457d5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:32:51.970865 kubelet[3235]: E1212 17:32:51.970665 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:32:53.622908 kubelet[3235]: E1212 17:32:53.622852 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: 
not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb" Dec 12 17:32:55.622133 kubelet[3235]: E1212 17:32:55.622078 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23" Dec 12 17:32:55.622133 kubelet[3235]: E1212 17:32:55.622106 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201" Dec 12 17:32:56.620312 kubelet[3235]: E1212 17:32:56.620271 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a" 
Dec 12 17:32:58.621249 kubelet[3235]: E1212 17:32:58.621187 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:32:59.620739 kubelet[3235]: E1212 17:32:59.620678 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:33:05.621514 kubelet[3235]: E1212 17:33:05.621447 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:33:07.621132 kubelet[3235]: E1212 17:33:07.621089 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23" Dec 12 17:33:08.620717 kubelet[3235]: E1212 17:33:08.620663 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: 
not found\"]" pod="calico-system/whisker-6646878486-nf54l" podUID="1aa8e517-9efe-4339-b128-b444ff23b3fb" Dec 12 17:33:09.620827 kubelet[3235]: E1212 17:33:09.620768 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-rp9k9" podUID="f10953b5-1654-43b7-bb1b-acb41105d201" Dec 12 17:33:10.322771 systemd[1]: cri-containerd-65590692c14e385ea9a2d636338f510af9688b7c152c7f3173d71a4d32405a66.scope: Deactivated successfully. Dec 12 17:33:10.323138 systemd[1]: cri-containerd-65590692c14e385ea9a2d636338f510af9688b7c152c7f3173d71a4d32405a66.scope: Consumed 46.431s CPU time, 106.3M memory peak. Dec 12 17:33:10.324761 containerd[1652]: time="2025-12-12T17:33:10.324718791Z" level=info msg="received container exit event container_id:\"65590692c14e385ea9a2d636338f510af9688b7c152c7f3173d71a4d32405a66\" id:\"65590692c14e385ea9a2d636338f510af9688b7c152c7f3173d71a4d32405a66\" pid:3564 exit_status:1 exited_at:{seconds:1765560790 nanos:324349990}" Dec 12 17:33:10.343573 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-65590692c14e385ea9a2d636338f510af9688b7c152c7f3173d71a4d32405a66-rootfs.mount: Deactivated successfully. 
Dec 12 17:33:10.620924 kubelet[3235]: E1212 17:33:10.620749 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-68c844c6f6-fvk5j" podUID="f134713e-1f0f-4a16-9202-6adaa12db341" Dec 12 17:33:10.621407 kubelet[3235]: E1212 17:33:10.621335 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-fvf46" podUID="afc05240-ed5c-4c99-8e0c-0cca61ebd35a" Dec 12 17:33:10.801293 kubelet[3235]: E1212 17:33:10.801257 3235 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.10.18:34108->10.0.10.111:2379: read: connection timed out" Dec 12 17:33:11.233373 kubelet[3235]: I1212 17:33:11.233344 3235 scope.go:117] "RemoveContainer" containerID="65590692c14e385ea9a2d636338f510af9688b7c152c7f3173d71a4d32405a66" Dec 12 17:33:11.235050 containerd[1652]: time="2025-12-12T17:33:11.235009075Z" level=info msg="CreateContainer within sandbox \"b32ff23a73fccd1dbaf065ea0e6420ed59971fc99f16f4f18b58844487fc14e8\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Dec 12 
17:33:11.243584 containerd[1652]: time="2025-12-12T17:33:11.242176054Z" level=info msg="Container 463327ad0819679d089235504c741a3a57443849b4ae826b791270430cf15f44: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:11.250543 containerd[1652]: time="2025-12-12T17:33:11.250508556Z" level=info msg="CreateContainer within sandbox \"b32ff23a73fccd1dbaf065ea0e6420ed59971fc99f16f4f18b58844487fc14e8\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"463327ad0819679d089235504c741a3a57443849b4ae826b791270430cf15f44\"" Dec 12 17:33:11.251192 containerd[1652]: time="2025-12-12T17:33:11.251160117Z" level=info msg="StartContainer for \"463327ad0819679d089235504c741a3a57443849b4ae826b791270430cf15f44\"" Dec 12 17:33:11.252328 containerd[1652]: time="2025-12-12T17:33:11.252302440Z" level=info msg="connecting to shim 463327ad0819679d089235504c741a3a57443849b4ae826b791270430cf15f44" address="unix:///run/containerd/s/1237a6a65f87052829d8eb509a63708872b2794376d7c3d5b49bfc339ec2bc4c" protocol=ttrpc version=3 Dec 12 17:33:11.272561 systemd[1]: Started cri-containerd-463327ad0819679d089235504c741a3a57443849b4ae826b791270430cf15f44.scope - libcontainer container 463327ad0819679d089235504c741a3a57443849b4ae826b791270430cf15f44. Dec 12 17:33:11.296682 containerd[1652]: time="2025-12-12T17:33:11.296645555Z" level=info msg="StartContainer for \"463327ad0819679d089235504c741a3a57443849b4ae826b791270430cf15f44\" returns successfully" Dec 12 17:33:11.660166 systemd[1]: cri-containerd-a2421665c5754c36b73daf1574b84757f864cb4efa1c01d254a1613418902429.scope: Deactivated successfully. Dec 12 17:33:11.660514 systemd[1]: cri-containerd-a2421665c5754c36b73daf1574b84757f864cb4efa1c01d254a1613418902429.scope: Consumed 4.269s CPU time, 65.9M memory peak. 
Dec 12 17:33:11.661952 containerd[1652]: time="2025-12-12T17:33:11.661917384Z" level=info msg="received container exit event container_id:\"a2421665c5754c36b73daf1574b84757f864cb4efa1c01d254a1613418902429\" id:\"a2421665c5754c36b73daf1574b84757f864cb4efa1c01d254a1613418902429\" pid:3087 exit_status:1 exited_at:{seconds:1765560791 nanos:661683704}" Dec 12 17:33:11.681285 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a2421665c5754c36b73daf1574b84757f864cb4efa1c01d254a1613418902429-rootfs.mount: Deactivated successfully. Dec 12 17:33:12.079701 kubelet[3235]: E1212 17:33:12.079578 3235 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.10.18:33730->10.0.10.111:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-2-d-e796afb129.1880882e0b76efe8 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-2-d-e796afb129,UID:2d21f3c68ade6c5e71955b06908d1830,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-d-e796afb129,},FirstTimestamp:2025-12-12 17:33:05.271451624 +0000 UTC m=+241.753691149,LastTimestamp:2025-12-12 17:33:05.271451624 +0000 UTC m=+241.753691149,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-d-e796afb129,}" Dec 12 17:33:12.238865 kubelet[3235]: I1212 17:33:12.238840 3235 scope.go:117] "RemoveContainer" containerID="a2421665c5754c36b73daf1574b84757f864cb4efa1c01d254a1613418902429" Dec 12 17:33:12.240468 containerd[1652]: time="2025-12-12T17:33:12.240429247Z" level=info msg="CreateContainer within sandbox 
\"103bebf08f432d6627be308c0ef121d0f5220314a42c263e9fccf3682dafe7a7\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Dec 12 17:33:12.248254 containerd[1652]: time="2025-12-12T17:33:12.248211147Z" level=info msg="Container f26a5394010c6bf57dfaae7d6f03282df2919322a8e08ded387eb4967c43e905: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:12.255598 containerd[1652]: time="2025-12-12T17:33:12.255561926Z" level=info msg="CreateContainer within sandbox \"103bebf08f432d6627be308c0ef121d0f5220314a42c263e9fccf3682dafe7a7\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"f26a5394010c6bf57dfaae7d6f03282df2919322a8e08ded387eb4967c43e905\"" Dec 12 17:33:12.256436 containerd[1652]: time="2025-12-12T17:33:12.256058528Z" level=info msg="StartContainer for \"f26a5394010c6bf57dfaae7d6f03282df2919322a8e08ded387eb4967c43e905\"" Dec 12 17:33:12.257123 containerd[1652]: time="2025-12-12T17:33:12.257098490Z" level=info msg="connecting to shim f26a5394010c6bf57dfaae7d6f03282df2919322a8e08ded387eb4967c43e905" address="unix:///run/containerd/s/0f84d482106c4e36d1f4118ca3114299293212e0ccbf992f367f961a5e80e8f2" protocol=ttrpc version=3 Dec 12 17:33:12.274531 systemd[1]: Started cri-containerd-f26a5394010c6bf57dfaae7d6f03282df2919322a8e08ded387eb4967c43e905.scope - libcontainer container f26a5394010c6bf57dfaae7d6f03282df2919322a8e08ded387eb4967c43e905. 
Dec 12 17:33:12.309306 containerd[1652]: time="2025-12-12T17:33:12.309260466Z" level=info msg="StartContainer for \"f26a5394010c6bf57dfaae7d6f03282df2919322a8e08ded387eb4967c43e905\" returns successfully" Dec 12 17:33:13.621324 kubelet[3235]: E1212 17:33:13.621259 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ksbxj" podUID="5bf56c15-e8b7-4324-9dd0-89444eba43fb" Dec 12 17:33:16.772034 systemd[1]: cri-containerd-3631e6b2bbee2601d8f3e02bf8652523cd17bdfffb189050845d13a704131559.scope: Deactivated successfully. Dec 12 17:33:16.772677 systemd[1]: cri-containerd-3631e6b2bbee2601d8f3e02bf8652523cd17bdfffb189050845d13a704131559.scope: Consumed 4.631s CPU time, 24.8M memory peak. 
Dec 12 17:33:16.773627 containerd[1652]: time="2025-12-12T17:33:16.773589862Z" level=info msg="received container exit event container_id:\"3631e6b2bbee2601d8f3e02bf8652523cd17bdfffb189050845d13a704131559\" id:\"3631e6b2bbee2601d8f3e02bf8652523cd17bdfffb189050845d13a704131559\" pid:3098 exit_status:1 exited_at:{seconds:1765560796 nanos:773337862}" Dec 12 17:33:16.792214 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3631e6b2bbee2601d8f3e02bf8652523cd17bdfffb189050845d13a704131559-rootfs.mount: Deactivated successfully. Dec 12 17:33:17.253771 kubelet[3235]: I1212 17:33:17.253730 3235 scope.go:117] "RemoveContainer" containerID="3631e6b2bbee2601d8f3e02bf8652523cd17bdfffb189050845d13a704131559" Dec 12 17:33:17.255237 containerd[1652]: time="2025-12-12T17:33:17.255204874Z" level=info msg="CreateContainer within sandbox \"5d78bc2dc87b504e3456e91acc999ad532d01e462b5f7ef85eac864d6d85c5cc\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Dec 12 17:33:17.263399 containerd[1652]: time="2025-12-12T17:33:17.262966654Z" level=info msg="Container b324b760d9751c958f4ea4301d0556868151401adbfce4729f248fc4d1ff36f2: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:33:17.270015 containerd[1652]: time="2025-12-12T17:33:17.269974752Z" level=info msg="CreateContainer within sandbox \"5d78bc2dc87b504e3456e91acc999ad532d01e462b5f7ef85eac864d6d85c5cc\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"b324b760d9751c958f4ea4301d0556868151401adbfce4729f248fc4d1ff36f2\"" Dec 12 17:33:17.270463 containerd[1652]: time="2025-12-12T17:33:17.270437713Z" level=info msg="StartContainer for \"b324b760d9751c958f4ea4301d0556868151401adbfce4729f248fc4d1ff36f2\"" Dec 12 17:33:17.271955 containerd[1652]: time="2025-12-12T17:33:17.271822917Z" level=info msg="connecting to shim b324b760d9751c958f4ea4301d0556868151401adbfce4729f248fc4d1ff36f2" address="unix:///run/containerd/s/0169a36c4a241f7144c72ebbb56ef9e59ec856dd6444e7820bef07abdf706c52" 
protocol=ttrpc version=3 Dec 12 17:33:17.300566 systemd[1]: Started cri-containerd-b324b760d9751c958f4ea4301d0556868151401adbfce4729f248fc4d1ff36f2.scope - libcontainer container b324b760d9751c958f4ea4301d0556868151401adbfce4729f248fc4d1ff36f2. Dec 12 17:33:17.333321 containerd[1652]: time="2025-12-12T17:33:17.333283996Z" level=info msg="StartContainer for \"b324b760d9751c958f4ea4301d0556868151401adbfce4729f248fc4d1ff36f2\" returns successfully" Dec 12 17:33:17.620917 kubelet[3235]: E1212 17:33:17.620811 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7bccd8547-ssn5z" podUID="7a593db9-1db8-4942-815e-ae24e8a457d5" Dec 12 17:33:19.621067 kubelet[3235]: E1212 17:33:19.621015 3235 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7d565b7c7c-954tf" podUID="419c68ca-a927-4109-af9a-f076c2eb6b23"