Dec 12 17:24:29.806508 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Dec 12 17:24:29.806535 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Dec 12 15:20:48 -00 2025
Dec 12 17:24:29.806545 kernel: KASLR enabled
Dec 12 17:24:29.806551 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Dec 12 17:24:29.806557 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Dec 12 17:24:29.806562 kernel: random: crng init done
Dec 12 17:24:29.806569 kernel: secureboot: Secure boot disabled
Dec 12 17:24:29.806574 kernel: ACPI: Early table checksum verification disabled
Dec 12 17:24:29.806581 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Dec 12 17:24:29.806586 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Dec 12 17:24:29.806594 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:29.806600 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:29.806606 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:29.806612 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:29.806633 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:29.806641 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:29.806647 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:29.806654 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:29.806660 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 17:24:29.806666 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Dec 12 17:24:29.806672 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Dec 12 17:24:29.806678 kernel: ACPI: Use ACPI SPCR as default console: Yes
Dec 12 17:24:29.806684 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Dec 12 17:24:29.806690 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff]
Dec 12 17:24:29.806696 kernel: Zone ranges:
Dec 12 17:24:29.806704 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Dec 12 17:24:29.806712 kernel: DMA32 empty
Dec 12 17:24:29.806719 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Dec 12 17:24:29.806726 kernel: Device empty
Dec 12 17:24:29.806733 kernel: Movable zone start for each node
Dec 12 17:24:29.806739 kernel: Early memory node ranges
Dec 12 17:24:29.806746 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
Dec 12 17:24:29.806753 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
Dec 12 17:24:29.806760 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
Dec 12 17:24:29.806766 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Dec 12 17:24:29.806772 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Dec 12 17:24:29.806779 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Dec 12 17:24:29.806785 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Dec 12 17:24:29.806792 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Dec 12 17:24:29.806798 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Dec 12 17:24:29.806807 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Dec 12 17:24:29.806814 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Dec 12 17:24:29.806820 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Dec 12 17:24:29.806828 kernel: psci: probing for conduit method from ACPI.
Dec 12 17:24:29.806835 kernel: psci: PSCIv1.1 detected in firmware.
Dec 12 17:24:29.806841 kernel: psci: Using standard PSCI v0.2 function IDs
Dec 12 17:24:29.806847 kernel: psci: Trusted OS migration not required
Dec 12 17:24:29.806854 kernel: psci: SMC Calling Convention v1.1
Dec 12 17:24:29.806861 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Dec 12 17:24:29.806867 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Dec 12 17:24:29.806873 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Dec 12 17:24:29.806880 kernel: pcpu-alloc: [0] 0 [0] 1
Dec 12 17:24:29.806886 kernel: Detected PIPT I-cache on CPU0
Dec 12 17:24:29.806893 kernel: CPU features: detected: GIC system register CPU interface
Dec 12 17:24:29.806900 kernel: CPU features: detected: Spectre-v4
Dec 12 17:24:29.806907 kernel: CPU features: detected: Spectre-BHB
Dec 12 17:24:29.806914 kernel: CPU features: kernel page table isolation forced ON by KASLR
Dec 12 17:24:29.806920 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Dec 12 17:24:29.806926 kernel: CPU features: detected: ARM erratum 1418040
Dec 12 17:24:29.806933 kernel: CPU features: detected: SSBS not fully self-synchronizing
Dec 12 17:24:29.806939 kernel: alternatives: applying boot alternatives
Dec 12 17:24:29.806947 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52
Dec 12 17:24:29.806954 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 12 17:24:29.806960 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 12 17:24:29.806967 kernel: Fallback order for Node 0: 0
Dec 12 17:24:29.806974 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000
Dec 12 17:24:29.806981 kernel: Policy zone: Normal
Dec 12 17:24:29.806988 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 12 17:24:29.806994 kernel: software IO TLB: area num 2.
Dec 12 17:24:29.807001 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Dec 12 17:24:29.807007 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Dec 12 17:24:29.807014 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 12 17:24:29.807021 kernel: rcu: RCU event tracing is enabled.
Dec 12 17:24:29.807028 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Dec 12 17:24:29.807034 kernel: Trampoline variant of Tasks RCU enabled.
Dec 12 17:24:29.807044 kernel: Tracing variant of Tasks RCU enabled.
Dec 12 17:24:29.807051 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 12 17:24:29.807060 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Dec 12 17:24:29.807069 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 12 17:24:29.807076 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 12 17:24:29.807083 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Dec 12 17:24:29.807090 kernel: GICv3: 256 SPIs implemented
Dec 12 17:24:29.807096 kernel: GICv3: 0 Extended SPIs implemented
Dec 12 17:24:29.807103 kernel: Root IRQ handler: gic_handle_irq
Dec 12 17:24:29.807110 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Dec 12 17:24:29.807116 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Dec 12 17:24:29.807122 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Dec 12 17:24:29.807129 kernel: ITS [mem 0x08080000-0x0809ffff]
Dec 12 17:24:29.807136 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1)
Dec 12 17:24:29.807143 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1)
Dec 12 17:24:29.807150 kernel: GICv3: using LPI property table @0x0000000100120000
Dec 12 17:24:29.807156 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000
Dec 12 17:24:29.807163 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 12 17:24:29.807169 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 12 17:24:29.807176 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Dec 12 17:24:29.807182 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Dec 12 17:24:29.807189 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Dec 12 17:24:29.807195 kernel: Console: colour dummy device 80x25
Dec 12 17:24:29.807202 kernel: ACPI: Core revision 20240827
Dec 12 17:24:29.807211 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Dec 12 17:24:29.807218 kernel: pid_max: default: 32768 minimum: 301
Dec 12 17:24:29.807225 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 12 17:24:29.807231 kernel: landlock: Up and running.
Dec 12 17:24:29.807238 kernel: SELinux: Initializing.
Dec 12 17:24:29.807257 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Dec 12 17:24:29.807264 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Dec 12 17:24:29.807271 kernel: rcu: Hierarchical SRCU implementation.
Dec 12 17:24:29.807278 kernel: rcu: Max phase no-delay instances is 400.
Dec 12 17:24:29.807287 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 12 17:24:29.807294 kernel: Remapping and enabling EFI services.
Dec 12 17:24:29.807300 kernel: smp: Bringing up secondary CPUs ...
Dec 12 17:24:29.807307 kernel: Detected PIPT I-cache on CPU1
Dec 12 17:24:29.807313 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Dec 12 17:24:29.807320 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000
Dec 12 17:24:29.807326 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 12 17:24:29.807333 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Dec 12 17:24:29.807339 kernel: smp: Brought up 1 node, 2 CPUs
Dec 12 17:24:29.807346 kernel: SMP: Total of 2 processors activated.
Dec 12 17:24:29.807359 kernel: CPU: All CPU(s) started at EL1
Dec 12 17:24:29.807371 kernel: CPU features: detected: 32-bit EL0 Support
Dec 12 17:24:29.807381 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Dec 12 17:24:29.807388 kernel: CPU features: detected: Common not Private translations
Dec 12 17:24:29.807395 kernel: CPU features: detected: CRC32 instructions
Dec 12 17:24:29.807402 kernel: CPU features: detected: Enhanced Virtualization Traps
Dec 12 17:24:29.807409 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Dec 12 17:24:29.807417 kernel: CPU features: detected: LSE atomic instructions
Dec 12 17:24:29.807424 kernel: CPU features: detected: Privileged Access Never
Dec 12 17:24:29.807431 kernel: CPU features: detected: RAS Extension Support
Dec 12 17:24:29.807438 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Dec 12 17:24:29.807445 kernel: alternatives: applying system-wide alternatives
Dec 12 17:24:29.807452 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Dec 12 17:24:29.807459 kernel: Memory: 3858852K/4096000K available (11200K kernel code, 2456K rwdata, 9084K rodata, 39552K init, 1038K bss, 215668K reserved, 16384K cma-reserved)
Dec 12 17:24:29.807466 kernel: devtmpfs: initialized
Dec 12 17:24:29.807473 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 12 17:24:29.807481 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Dec 12 17:24:29.807489 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Dec 12 17:24:29.807496 kernel: 0 pages in range for non-PLT usage
Dec 12 17:24:29.807503 kernel: 508400 pages in range for PLT usage
Dec 12 17:24:29.807510 kernel: pinctrl core: initialized pinctrl subsystem
Dec 12 17:24:29.807516 kernel: SMBIOS 3.0.0 present.
Dec 12 17:24:29.807523 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Dec 12 17:24:29.807530 kernel: DMI: Memory slots populated: 1/1
Dec 12 17:24:29.807537 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 12 17:24:29.807544 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Dec 12 17:24:29.807553 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 12 17:24:29.807560 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 12 17:24:29.807567 kernel: audit: initializing netlink subsys (disabled)
Dec 12 17:24:29.807574 kernel: audit: type=2000 audit(0.011:1): state=initialized audit_enabled=0 res=1
Dec 12 17:24:29.807580 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 12 17:24:29.807588 kernel: cpuidle: using governor menu
Dec 12 17:24:29.807594 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Dec 12 17:24:29.807601 kernel: ASID allocator initialised with 32768 entries
Dec 12 17:24:29.807608 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 12 17:24:29.807820 kernel: Serial: AMBA PL011 UART driver
Dec 12 17:24:29.807834 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 12 17:24:29.807842 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Dec 12 17:24:29.807849 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Dec 12 17:24:29.807856 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Dec 12 17:24:29.807863 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 12 17:24:29.807870 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Dec 12 17:24:29.807877 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Dec 12 17:24:29.807884 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Dec 12 17:24:29.807895 kernel: ACPI: Added _OSI(Module Device)
Dec 12 17:24:29.807902 kernel: ACPI: Added _OSI(Processor Device)
Dec 12 17:24:29.807909 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 12 17:24:29.807916 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 12 17:24:29.807923 kernel: ACPI: Interpreter enabled
Dec 12 17:24:29.807930 kernel: ACPI: Using GIC for interrupt routing
Dec 12 17:24:29.807937 kernel: ACPI: MCFG table detected, 1 entries
Dec 12 17:24:29.807944 kernel: ACPI: CPU0 has been hot-added
Dec 12 17:24:29.807951 kernel: ACPI: CPU1 has been hot-added
Dec 12 17:24:29.807959 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Dec 12 17:24:29.807966 kernel: printk: legacy console [ttyAMA0] enabled
Dec 12 17:24:29.807973 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 12 17:24:29.808119 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 12 17:24:29.808184 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Dec 12 17:24:29.808280 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Dec 12 17:24:29.808354 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Dec 12 17:24:29.808416 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Dec 12 17:24:29.808425 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Dec 12 17:24:29.808433 kernel: PCI host bridge to bus 0000:00
Dec 12 17:24:29.808499 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Dec 12 17:24:29.808552 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Dec 12 17:24:29.808603 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Dec 12 17:24:29.808760 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 12 17:24:29.808850 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Dec 12 17:24:29.808920 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint
Dec 12 17:24:29.808980 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff]
Dec 12 17:24:29.809038 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]
Dec 12 17:24:29.809109 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:29.809168 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff]
Dec 12 17:24:29.809232 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Dec 12 17:24:29.809312 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]
Dec 12 17:24:29.809373 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Dec 12 17:24:29.809440 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:29.809500 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff]
Dec 12 17:24:29.809558 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Dec 12 17:24:29.809629 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff]
Dec 12 17:24:29.809702 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:29.809769 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff]
Dec 12 17:24:29.809828 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Dec 12 17:24:29.809887 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff]
Dec 12 17:24:29.809947 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Dec 12 17:24:29.811463 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:29.811533 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff]
Dec 12 17:24:29.811599 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Dec 12 17:24:29.811696 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff]
Dec 12 17:24:29.811759 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Dec 12 17:24:29.811848 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:29.811910 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff]
Dec 12 17:24:29.811967 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Dec 12 17:24:29.812025 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Dec 12 17:24:29.812083 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Dec 12 17:24:29.812153 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:29.812211 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff]
Dec 12 17:24:29.812286 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Dec 12 17:24:29.812345 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff]
Dec 12 17:24:29.812403 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Dec 12 17:24:29.812468 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:29.812529 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff]
Dec 12 17:24:29.812590 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Dec 12 17:24:29.812667 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff]
Dec 12 17:24:29.812726 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref]
Dec 12 17:24:29.812793 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:29.812852 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff]
Dec 12 17:24:29.812909 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Dec 12 17:24:29.812968 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff]
Dec 12 17:24:29.813043 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 17:24:29.813103 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff]
Dec 12 17:24:29.813163 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Dec 12 17:24:29.813223 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff]
Dec 12 17:24:29.813306 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint
Dec 12 17:24:29.813367 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007]
Dec 12 17:24:29.813441 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 12 17:24:29.813502 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff]
Dec 12 17:24:29.813562 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Dec 12 17:24:29.814720 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Dec 12 17:24:29.814844 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Dec 12 17:24:29.814907 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit]
Dec 12 17:24:29.814988 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Dec 12 17:24:29.815049 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff]
Dec 12 17:24:29.815109 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Dec 12 17:24:29.815177 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 12 17:24:29.815237 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Dec 12 17:24:29.815328 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 12 17:24:29.815391 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]
Dec 12 17:24:29.815454 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Dec 12 17:24:29.815522 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Dec 12 17:24:29.815583 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff]
Dec 12 17:24:29.815672 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Dec 12 17:24:29.815745 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 12 17:24:29.815806 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff]
Dec 12 17:24:29.815875 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref]
Dec 12 17:24:29.815937 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Dec 12 17:24:29.816000 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Dec 12 17:24:29.816059 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Dec 12 17:24:29.816126 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Dec 12 17:24:29.816201 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Dec 12 17:24:29.816288 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Dec 12 17:24:29.816358 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Dec 12 17:24:29.816420 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 12 17:24:29.816482 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Dec 12 17:24:29.816539 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Dec 12 17:24:29.816599 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 12 17:24:29.816703 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Dec 12 17:24:29.816765 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Dec 12 17:24:29.816832 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 12 17:24:29.816891 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Dec 12 17:24:29.816949 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Dec 12 17:24:29.817011 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 12 17:24:29.817070 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Dec 12 17:24:29.817126 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Dec 12 17:24:29.817187 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 12 17:24:29.817287 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Dec 12 17:24:29.817354 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Dec 12 17:24:29.817417 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 12 17:24:29.817475 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Dec 12 17:24:29.817532 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Dec 12 17:24:29.817594 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 12 17:24:29.819732 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Dec 12 17:24:29.819836 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Dec 12 17:24:29.819910 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Dec 12 17:24:29.819979 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Dec 12 17:24:29.820040 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Dec 12 17:24:29.820098 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Dec 12 17:24:29.820159 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Dec 12 17:24:29.820217 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Dec 12 17:24:29.820335 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Dec 12 17:24:29.820400 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Dec 12 17:24:29.820461 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Dec 12 17:24:29.820519 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Dec 12 17:24:29.820581 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Dec 12 17:24:29.820660 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Dec 12 17:24:29.820725 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Dec 12 17:24:29.820784 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Dec 12 17:24:29.820850 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Dec 12 17:24:29.820908 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Dec 12 17:24:29.820966 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Dec 12 17:24:29.821025 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Dec 12 17:24:29.821088 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned
Dec 12 17:24:29.821145 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned
Dec 12 17:24:29.821202 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned
Dec 12 17:24:29.821308 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Dec 12 17:24:29.821387 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned
Dec 12 17:24:29.821450 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Dec 12 17:24:29.821510 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned
Dec 12 17:24:29.821567 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Dec 12 17:24:29.821649 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned
Dec 12 17:24:29.821710 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Dec 12 17:24:29.821769 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned
Dec 12 17:24:29.821826 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Dec 12 17:24:29.821886 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned
Dec 12 17:24:29.821944 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Dec 12 17:24:29.822002 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned
Dec 12 17:24:29.822059 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Dec 12 17:24:29.822120 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned
Dec 12 17:24:29.822177 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Dec 12 17:24:29.822235 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned
Dec 12 17:24:29.822311 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned
Dec 12 17:24:29.822375 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned
Dec 12 17:24:29.822440 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Dec 12 17:24:29.822500 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Dec 12 17:24:29.822559 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Dec 12 17:24:29.822634 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Dec 12 17:24:29.822704 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Dec 12 17:24:29.822763 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Dec 12 17:24:29.822840 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Dec 12 17:24:29.822939 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Dec 12 17:24:29.823008 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Dec 12 17:24:29.823067 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Dec 12 17:24:29.823130 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Dec 12 17:24:29.823187 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Dec 12 17:24:29.823295 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Dec 12 17:24:29.823375 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Dec 12 17:24:29.823436 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Dec 12 17:24:29.823494 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Dec 12 17:24:29.823551 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Dec 12 17:24:29.823612 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Dec 12 17:24:29.823707 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Dec 12 17:24:29.823769 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Dec 12 17:24:29.823842 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Dec 12 17:24:29.823901 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Dec 12 17:24:29.823959 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Dec 12 17:24:29.824028 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Dec 12 17:24:29.824088 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned
Dec 12 17:24:29.824147 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Dec 12 17:24:29.824207 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Dec 12 17:24:29.824279 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Dec 12 17:24:29.824339 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Dec 12 17:24:29.824405 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Dec 12 17:24:29.824465 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Dec 12 17:24:29.824528 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Dec 12 17:24:29.824601 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Dec 12 17:24:29.824908 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Dec 12 17:24:29.824980 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Dec 12 17:24:29.825048 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned
Dec 12 17:24:29.825117 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned
Dec 12 17:24:29.825184 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned
Dec 12 17:24:29.825289 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Dec 12 17:24:29.825365 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Dec 12 17:24:29.825425 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Dec 12 17:24:29.825484 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Dec 12 17:24:29.825548 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Dec 12 17:24:29.825607 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Dec 12 17:24:29.827048 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Dec 12 17:24:29.827142 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Dec 12 17:24:29.827207 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Dec 12 17:24:29.827319 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Dec 12
17:24:29.827383 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] Dec 12 17:24:29.827451 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Dec 12 17:24:29.827513 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Dec 12 17:24:29.827566 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Dec 12 17:24:29.828725 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Dec 12 17:24:29.828835 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Dec 12 17:24:29.828893 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Dec 12 17:24:29.828954 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Dec 12 17:24:29.829017 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] Dec 12 17:24:29.829072 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Dec 12 17:24:29.829126 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Dec 12 17:24:29.829188 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] Dec 12 17:24:29.829259 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Dec 12 17:24:29.829322 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Dec 12 17:24:29.829390 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Dec 12 17:24:29.829450 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Dec 12 17:24:29.829511 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Dec 12 17:24:29.829580 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] Dec 12 17:24:29.829753 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Dec 12 17:24:29.829828 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Dec 12 17:24:29.829898 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] Dec 12 17:24:29.829954 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Dec 12 17:24:29.830008 kernel: 
pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 12 17:24:29.830069 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] Dec 12 17:24:29.830123 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Dec 12 17:24:29.830179 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 12 17:24:29.830281 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Dec 12 17:24:29.831770 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Dec 12 17:24:29.831848 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 12 17:24:29.831924 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Dec 12 17:24:29.831980 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Dec 12 17:24:29.832033 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Dec 12 17:24:29.832043 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Dec 12 17:24:29.832051 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Dec 12 17:24:29.832066 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Dec 12 17:24:29.832073 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Dec 12 17:24:29.832081 kernel: iommu: Default domain type: Translated Dec 12 17:24:29.832089 kernel: iommu: DMA domain TLB invalidation policy: strict mode Dec 12 17:24:29.832096 kernel: efivars: Registered efivars operations Dec 12 17:24:29.832103 kernel: vgaarb: loaded Dec 12 17:24:29.832111 kernel: clocksource: Switched to clocksource arch_sys_counter Dec 12 17:24:29.832118 kernel: VFS: Disk quotas dquot_6.6.0 Dec 12 17:24:29.832126 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 12 17:24:29.832133 kernel: pnp: PnP ACPI init Dec 12 17:24:29.832204 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Dec 12 17:24:29.832215 kernel: pnp: PnP ACPI: found 1 devices Dec 12 17:24:29.832222 kernel: NET: Registered PF_INET 
protocol family Dec 12 17:24:29.832230 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Dec 12 17:24:29.832238 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Dec 12 17:24:29.832287 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 12 17:24:29.832295 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Dec 12 17:24:29.832303 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Dec 12 17:24:29.832314 kernel: TCP: Hash tables configured (established 32768 bind 32768) Dec 12 17:24:29.832321 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Dec 12 17:24:29.832329 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Dec 12 17:24:29.832336 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 12 17:24:29.832423 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Dec 12 17:24:29.832435 kernel: PCI: CLS 0 bytes, default 64 Dec 12 17:24:29.832443 kernel: kvm [1]: HYP mode not available Dec 12 17:24:29.832451 kernel: Initialise system trusted keyrings Dec 12 17:24:29.832459 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Dec 12 17:24:29.832469 kernel: Key type asymmetric registered Dec 12 17:24:29.832476 kernel: Asymmetric key parser 'x509' registered Dec 12 17:24:29.832485 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Dec 12 17:24:29.832493 kernel: io scheduler mq-deadline registered Dec 12 17:24:29.832500 kernel: io scheduler kyber registered Dec 12 17:24:29.832507 kernel: io scheduler bfq registered Dec 12 17:24:29.832515 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Dec 12 17:24:29.832582 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Dec 12 17:24:29.832690 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Dec 12 17:24:29.832798 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ 
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:29.832871 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Dec 12 17:24:29.832932 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Dec 12 17:24:29.832991 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:29.833054 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Dec 12 17:24:29.833113 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Dec 12 17:24:29.833172 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:29.833252 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Dec 12 17:24:29.833323 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Dec 12 17:24:29.833382 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:29.833500 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Dec 12 17:24:29.833563 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Dec 12 17:24:29.833638 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:29.833706 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Dec 12 17:24:29.833766 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Dec 12 17:24:29.833830 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:29.833894 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Dec 12 17:24:29.833952 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Dec 12 17:24:29.834009 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:29.834071 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Dec 12 17:24:29.834129 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Dec 12 17:24:29.834189 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:29.834200 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Dec 12 17:24:29.834277 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Dec 12 17:24:29.834340 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Dec 12 17:24:29.834399 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 12 17:24:29.834410 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Dec 12 17:24:29.834418 kernel: ACPI: button: Power Button [PWRB] Dec 12 17:24:29.834425 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Dec 12 17:24:29.834488 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Dec 12 17:24:29.834554 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Dec 12 17:24:29.834567 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 12 17:24:29.834575 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Dec 12 17:24:29.836999 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Dec 12 17:24:29.837025 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Dec 12 17:24:29.837033 kernel: thunder_xcv, ver 1.0 Dec 12 17:24:29.837041 kernel: thunder_bgx, ver 1.0 Dec 12 17:24:29.837049 kernel: nicpf, ver 1.0 Dec 12 17:24:29.837056 kernel: nicvf, ver 1.0 Dec 12 17:24:29.837138 kernel: rtc-efi rtc-efi.0: registered as rtc0 Dec 12 17:24:29.837701 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-12T17:24:29 UTC (1765560269) Dec 12 17:24:29.837720 kernel: hid: raw HID events 
driver (C) Jiri Kosina Dec 12 17:24:29.837728 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Dec 12 17:24:29.837736 kernel: watchdog: NMI not fully supported Dec 12 17:24:29.837744 kernel: watchdog: Hard watchdog permanently disabled Dec 12 17:24:29.837751 kernel: NET: Registered PF_INET6 protocol family Dec 12 17:24:29.837758 kernel: Segment Routing with IPv6 Dec 12 17:24:29.837766 kernel: In-situ OAM (IOAM) with IPv6 Dec 12 17:24:29.837779 kernel: NET: Registered PF_PACKET protocol family Dec 12 17:24:29.837786 kernel: Key type dns_resolver registered Dec 12 17:24:29.837794 kernel: registered taskstats version 1 Dec 12 17:24:29.837801 kernel: Loading compiled-in X.509 certificates Dec 12 17:24:29.837809 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 92f3a94fb747a7ba7cbcfde1535be91b86f9429a' Dec 12 17:24:29.837816 kernel: Demotion targets for Node 0: null Dec 12 17:24:29.837823 kernel: Key type .fscrypt registered Dec 12 17:24:29.837830 kernel: Key type fscrypt-provisioning registered Dec 12 17:24:29.837838 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 12 17:24:29.837846 kernel: ima: Allocated hash algorithm: sha1 Dec 12 17:24:29.837870 kernel: ima: No architecture policies found Dec 12 17:24:29.837878 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Dec 12 17:24:29.837888 kernel: clk: Disabling unused clocks Dec 12 17:24:29.837896 kernel: PM: genpd: Disabling unused power domains Dec 12 17:24:29.837903 kernel: Warning: unable to open an initial console. Dec 12 17:24:29.837911 kernel: Freeing unused kernel memory: 39552K Dec 12 17:24:29.837918 kernel: Run /init as init process Dec 12 17:24:29.837925 kernel: with arguments: Dec 12 17:24:29.837934 kernel: /init Dec 12 17:24:29.837941 kernel: with environment: Dec 12 17:24:29.837948 kernel: HOME=/ Dec 12 17:24:29.837956 kernel: TERM=linux Dec 12 17:24:29.837964 systemd[1]: Successfully made /usr/ read-only. 
Dec 12 17:24:29.837975 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 12 17:24:29.837983 systemd[1]: Detected virtualization kvm.
Dec 12 17:24:29.837991 systemd[1]: Detected architecture arm64.
Dec 12 17:24:29.838000 systemd[1]: Running in initrd.
Dec 12 17:24:29.838008 systemd[1]: No hostname configured, using default hostname.
Dec 12 17:24:29.838016 systemd[1]: Hostname set to .
Dec 12 17:24:29.838023 systemd[1]: Initializing machine ID from VM UUID.
Dec 12 17:24:29.838031 systemd[1]: Queued start job for default target initrd.target.
Dec 12 17:24:29.838039 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 12 17:24:29.838047 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 12 17:24:29.838055 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Dec 12 17:24:29.838065 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 12 17:24:29.838073 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Dec 12 17:24:29.838082 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Dec 12 17:24:29.838090 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Dec 12 17:24:29.838099 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Dec 12 17:24:29.838107 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 12 17:24:29.838115 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 12 17:24:29.838124 systemd[1]: Reached target paths.target - Path Units.
Dec 12 17:24:29.838131 systemd[1]: Reached target slices.target - Slice Units.
Dec 12 17:24:29.838139 systemd[1]: Reached target swap.target - Swaps.
Dec 12 17:24:29.838147 systemd[1]: Reached target timers.target - Timer Units.
Dec 12 17:24:29.838155 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Dec 12 17:24:29.838163 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 12 17:24:29.838171 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Dec 12 17:24:29.838178 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Dec 12 17:24:29.838188 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 12 17:24:29.838196 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 12 17:24:29.838204 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 12 17:24:29.838212 systemd[1]: Reached target sockets.target - Socket Units.
Dec 12 17:24:29.838220 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Dec 12 17:24:29.838227 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 12 17:24:29.838235 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Dec 12 17:24:29.838292 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Dec 12 17:24:29.838301 systemd[1]: Starting systemd-fsck-usr.service...
Dec 12 17:24:29.838311 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 12 17:24:29.838319 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 12 17:24:29.838327 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:24:29.838335 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Dec 12 17:24:29.838344 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 12 17:24:29.838353 systemd[1]: Finished systemd-fsck-usr.service.
Dec 12 17:24:29.838390 systemd-journald[245]: Collecting audit messages is disabled.
Dec 12 17:24:29.838411 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 12 17:24:29.838421 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 12 17:24:29.838429 kernel: Bridge firewalling registered
Dec 12 17:24:29.838437 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 12 17:24:29.838445 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 12 17:24:29.838453 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:24:29.838461 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 12 17:24:29.838470 systemd-journald[245]: Journal started
Dec 12 17:24:29.838492 systemd-journald[245]: Runtime Journal (/run/log/journal/a1de3e33d28f4b72bb43ab9e55a2f8d3) is 8M, max 76.5M, 68.5M free.
Dec 12 17:24:29.800349 systemd-modules-load[247]: Inserted module 'overlay'
Dec 12 17:24:29.823275 systemd-modules-load[247]: Inserted module 'br_netfilter'
Dec 12 17:24:29.842205 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 12 17:24:29.846712 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Dec 12 17:24:29.850141 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 12 17:24:29.853839 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 12 17:24:29.856673 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 12 17:24:29.870214 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 12 17:24:29.874665 systemd-tmpfiles[270]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Dec 12 17:24:29.879284 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 17:24:29.883766 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 12 17:24:29.885658 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 12 17:24:29.887366 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Dec 12 17:24:29.913099 dracut-cmdline[286]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52
Dec 12 17:24:29.933038 systemd-resolved[285]: Positive Trust Anchors:
Dec 12 17:24:29.933054 systemd-resolved[285]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 12 17:24:29.933085 systemd-resolved[285]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 12 17:24:29.947549 systemd-resolved[285]: Defaulting to hostname 'linux'.
Dec 12 17:24:29.948680 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 12 17:24:29.949351 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:24:30.014692 kernel: SCSI subsystem initialized
Dec 12 17:24:30.018688 kernel: Loading iSCSI transport class v2.0-870.
Dec 12 17:24:30.026799 kernel: iscsi: registered transport (tcp)
Dec 12 17:24:30.040840 kernel: iscsi: registered transport (qla4xxx)
Dec 12 17:24:30.040917 kernel: QLogic iSCSI HBA Driver
Dec 12 17:24:30.067869 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 12 17:24:30.102742 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 12 17:24:30.106703 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 12 17:24:30.158147 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Dec 12 17:24:30.160400 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Dec 12 17:24:30.224700 kernel: raid6: neonx8 gen() 15467 MB/s
Dec 12 17:24:30.241711 kernel: raid6: neonx4 gen() 15591 MB/s
Dec 12 17:24:30.258779 kernel: raid6: neonx2 gen() 12960 MB/s
Dec 12 17:24:30.275681 kernel: raid6: neonx1 gen() 10338 MB/s
Dec 12 17:24:30.292699 kernel: raid6: int64x8 gen() 6851 MB/s
Dec 12 17:24:30.309766 kernel: raid6: int64x4 gen() 7296 MB/s
Dec 12 17:24:30.326690 kernel: raid6: int64x2 gen() 6041 MB/s
Dec 12 17:24:30.343706 kernel: raid6: int64x1 gen() 4993 MB/s
Dec 12 17:24:30.343798 kernel: raid6: using algorithm neonx4 gen() 15591 MB/s
Dec 12 17:24:30.360711 kernel: raid6: .... xor() 12205 MB/s, rmw enabled
Dec 12 17:24:30.360795 kernel: raid6: using neon recovery algorithm
Dec 12 17:24:30.365655 kernel: xor: measuring software checksum speed
Dec 12 17:24:30.365727 kernel: 8regs : 20541 MB/sec
Dec 12 17:24:30.365749 kernel: 32regs : 18743 MB/sec
Dec 12 17:24:30.366828 kernel: arm64_neon : 28070 MB/sec
Dec 12 17:24:30.366865 kernel: xor: using function: arm64_neon (28070 MB/sec)
Dec 12 17:24:30.422669 kernel: Btrfs loaded, zoned=no, fsverity=no
Dec 12 17:24:30.431745 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Dec 12 17:24:30.437382 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 17:24:30.472356 systemd-udevd[494]: Using default interface naming scheme 'v255'.
Dec 12 17:24:30.477769 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 17:24:30.482354 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Dec 12 17:24:30.515560 dracut-pre-trigger[503]: rd.md=0: removing MD RAID activation
Dec 12 17:24:30.545500 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 12 17:24:30.549327 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 12 17:24:30.617338 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 17:24:30.621197 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Dec 12 17:24:30.729654 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues
Dec 12 17:24:30.731710 kernel: scsi host0: Virtio SCSI HBA
Dec 12 17:24:30.736083 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
Dec 12 17:24:30.736153 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Dec 12 17:24:30.745929 kernel: ACPI: bus type USB registered
Dec 12 17:24:30.746171 kernel: usbcore: registered new interface driver usbfs
Dec 12 17:24:30.746184 kernel: usbcore: registered new interface driver hub
Dec 12 17:24:30.746925 kernel: usbcore: registered new device driver usb
Dec 12 17:24:30.776645 kernel: sr 0:0:0:0: Power-on or device reset occurred
Dec 12 17:24:30.778854 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray
Dec 12 17:24:30.779054 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 12 17:24:30.781403 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 12 17:24:30.781526 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:24:30.785645 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 12 17:24:30.785814 kernel: sd 0:0:0:1: Power-on or device reset occurred
Dec 12 17:24:30.786143 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Dec 12 17:24:30.786812 kernel: sd 0:0:0:1: [sda] Write Protect is off
Dec 12 17:24:30.786940 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
Dec 12 17:24:30.787013 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Dec 12 17:24:30.788529 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:24:30.791770 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:24:30.793154 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Dec 12 17:24:30.802650 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 12 17:24:30.802708 kernel: GPT:17805311 != 80003071
Dec 12 17:24:30.802718 kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 12 17:24:30.802727 kernel: GPT:17805311 != 80003071
Dec 12 17:24:30.802735 kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 12 17:24:30.803651 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Dec 12 17:24:30.805667 kernel: sd 0:0:0:1: [sda] Attached SCSI disk
Dec 12 17:24:30.813237 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 12 17:24:30.813440 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Dec 12 17:24:30.816663 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Dec 12 17:24:30.816945 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 12 17:24:30.817030 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Dec 12 17:24:30.817107 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Dec 12 17:24:30.817778 kernel: hub 1-0:1.0: USB hub found
Dec 12 17:24:30.818672 kernel: hub 1-0:1.0: 4 ports detected
Dec 12 17:24:30.822658 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Dec 12 17:24:30.824823 kernel: hub 2-0:1.0: USB hub found
Dec 12 17:24:30.825008 kernel: hub 2-0:1.0: 4 ports detected
Dec 12 17:24:30.825166 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:24:30.896123 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Dec 12 17:24:30.904794 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Dec 12 17:24:30.914281 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Dec 12 17:24:30.915052 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Dec 12 17:24:30.927127 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Dec 12 17:24:30.929444 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Dec 12 17:24:30.934043 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 12 17:24:30.934846 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 12 17:24:30.936072 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 12 17:24:30.938102 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Dec 12 17:24:30.939790 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Dec 12 17:24:30.953532 disk-uuid[600]: Primary Header is updated.
Dec 12 17:24:30.953532 disk-uuid[600]: Secondary Entries is updated.
Dec 12 17:24:30.953532 disk-uuid[600]: Secondary Header is updated.
Dec 12 17:24:30.964923 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Dec 12 17:24:30.968711 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Dec 12 17:24:31.060673 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Dec 12 17:24:31.193458 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Dec 12 17:24:31.193527 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Dec 12 17:24:31.193819 kernel: usbcore: registered new interface driver usbhid
Dec 12 17:24:31.194640 kernel: usbhid: USB HID core driver
Dec 12 17:24:31.298666 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Dec 12 17:24:31.432090 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Dec 12 17:24:31.485709 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Dec 12 17:24:31.990038 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Dec 12 17:24:31.993691 disk-uuid[602]: The operation has completed successfully.
Dec 12 17:24:32.052124 systemd[1]: disk-uuid.service: Deactivated successfully.
Dec 12 17:24:32.052287 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Dec 12 17:24:32.076576 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Dec 12 17:24:32.098222 sh[625]: Success
Dec 12 17:24:32.117764 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 12 17:24:32.117838 kernel: device-mapper: uevent: version 1.0.3
Dec 12 17:24:32.117862 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Dec 12 17:24:32.126663 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Dec 12 17:24:32.181776 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Dec 12 17:24:32.186156 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Dec 12 17:24:32.205896 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Dec 12 17:24:32.218803 kernel: BTRFS: device fsid 6d6d314d-b8a1-4727-8a34-8525e276a248 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (637)
Dec 12 17:24:32.220991 kernel: BTRFS info (device dm-0): first mount of filesystem 6d6d314d-b8a1-4727-8a34-8525e276a248
Dec 12 17:24:32.221042 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:24:32.228063 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Dec 12 17:24:32.228133 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Dec 12 17:24:32.228146 kernel: BTRFS info (device dm-0): enabling free space tree
Dec 12 17:24:32.231368 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Dec 12 17:24:32.232752 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Dec 12 17:24:32.233488 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Dec 12 17:24:32.234263 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Dec 12 17:24:32.236831 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Dec 12 17:24:32.276647 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (671) Dec 12 17:24:32.278682 kernel: BTRFS info (device sda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:24:32.278754 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 12 17:24:32.282998 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 12 17:24:32.283063 kernel: BTRFS info (device sda6): turning on async discard Dec 12 17:24:32.283075 kernel: BTRFS info (device sda6): enabling free space tree Dec 12 17:24:32.289702 kernel: BTRFS info (device sda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:24:32.291553 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 12 17:24:32.296812 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Dec 12 17:24:32.396266 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 12 17:24:32.404182 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 12 17:24:32.451189 systemd-networkd[807]: lo: Link UP Dec 12 17:24:32.451202 systemd-networkd[807]: lo: Gained carrier Dec 12 17:24:32.452953 systemd-networkd[807]: Enumeration completed Dec 12 17:24:32.453734 systemd-networkd[807]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:24:32.453737 systemd-networkd[807]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 17:24:32.453757 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 12 17:24:32.454136 systemd-networkd[807]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:24:32.454140 systemd-networkd[807]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Dec 12 17:24:32.454487 systemd-networkd[807]: eth0: Link UP Dec 12 17:24:32.454711 systemd-networkd[807]: eth1: Link UP Dec 12 17:24:32.467967 ignition[723]: Ignition 2.22.0 Dec 12 17:24:32.455016 systemd[1]: Reached target network.target - Network. Dec 12 17:24:32.467974 ignition[723]: Stage: fetch-offline Dec 12 17:24:32.455882 systemd-networkd[807]: eth0: Gained carrier Dec 12 17:24:32.468006 ignition[723]: no configs at "/usr/lib/ignition/base.d" Dec 12 17:24:32.455895 systemd-networkd[807]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:24:32.468013 ignition[723]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 12 17:24:32.466489 systemd-networkd[807]: eth1: Gained carrier Dec 12 17:24:32.468103 ignition[723]: parsed url from cmdline: "" Dec 12 17:24:32.466511 systemd-networkd[807]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:24:32.468106 ignition[723]: no config URL provided Dec 12 17:24:32.470503 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 12 17:24:32.468111 ignition[723]: reading system config file "/usr/lib/ignition/user.ign" Dec 12 17:24:32.472974 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Dec 12 17:24:32.468117 ignition[723]: no config at "/usr/lib/ignition/user.ign" Dec 12 17:24:32.468122 ignition[723]: failed to fetch config: resource requires networking Dec 12 17:24:32.468306 ignition[723]: Ignition finished successfully Dec 12 17:24:32.503724 systemd-networkd[807]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Dec 12 17:24:32.510826 ignition[816]: Ignition 2.22.0 Dec 12 17:24:32.510843 ignition[816]: Stage: fetch Dec 12 17:24:32.511019 ignition[816]: no configs at "/usr/lib/ignition/base.d" Dec 12 17:24:32.511031 ignition[816]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 12 17:24:32.511129 ignition[816]: parsed url from cmdline: "" Dec 12 17:24:32.511134 ignition[816]: no config URL provided Dec 12 17:24:32.511139 ignition[816]: reading system config file "/usr/lib/ignition/user.ign" Dec 12 17:24:32.511147 ignition[816]: no config at "/usr/lib/ignition/user.ign" Dec 12 17:24:32.511181 ignition[816]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Dec 12 17:24:32.511703 ignition[816]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Dec 12 17:24:32.535738 systemd-networkd[807]: eth0: DHCPv4 address 23.88.120.93/32, gateway 172.31.1.1 acquired from 172.31.1.1 Dec 12 17:24:32.712570 ignition[816]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Dec 12 17:24:32.721641 ignition[816]: GET result: OK Dec 12 17:24:32.721834 ignition[816]: parsing config with SHA512: 96214d1f7cc66561635b9afa4da8448895942183b046249454ff5ba3df0743c6dad9d64cfa70205024b8aeb340ea976171b2216e2ff0af496aeeec5e21bd8d64 Dec 12 17:24:32.729481 unknown[816]: fetched base config from "system" Dec 12 17:24:32.729492 unknown[816]: fetched base config from "system" Dec 12 17:24:32.729497 unknown[816]: fetched user config from "hetzner" Dec 12 17:24:32.731941 ignition[816]: fetch: fetch complete Dec 12 17:24:32.731948 ignition[816]: fetch: fetch passed Dec 12 
17:24:32.732018 ignition[816]: Ignition finished successfully Dec 12 17:24:32.734956 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 12 17:24:32.736796 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 12 17:24:32.773041 ignition[824]: Ignition 2.22.0 Dec 12 17:24:32.773058 ignition[824]: Stage: kargs Dec 12 17:24:32.773199 ignition[824]: no configs at "/usr/lib/ignition/base.d" Dec 12 17:24:32.773244 ignition[824]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 12 17:24:32.777165 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Dec 12 17:24:32.774219 ignition[824]: kargs: kargs passed Dec 12 17:24:32.774273 ignition[824]: Ignition finished successfully Dec 12 17:24:32.781579 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 12 17:24:32.814697 ignition[831]: Ignition 2.22.0 Dec 12 17:24:32.814714 ignition[831]: Stage: disks Dec 12 17:24:32.814872 ignition[831]: no configs at "/usr/lib/ignition/base.d" Dec 12 17:24:32.814885 ignition[831]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 12 17:24:32.817905 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 12 17:24:32.815949 ignition[831]: disks: disks passed Dec 12 17:24:32.821419 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 12 17:24:32.816013 ignition[831]: Ignition finished successfully Dec 12 17:24:32.822571 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 12 17:24:32.824097 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 12 17:24:32.825435 systemd[1]: Reached target sysinit.target - System Initialization. Dec 12 17:24:32.826459 systemd[1]: Reached target basic.target - Basic System. Dec 12 17:24:32.829678 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Dec 12 17:24:32.880227 systemd-fsck[839]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Dec 12 17:24:32.884964 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 12 17:24:32.887699 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 12 17:24:32.974654 kernel: EXT4-fs (sda9): mounted filesystem 895d7845-d0e8-43ae-a778-7804b473b868 r/w with ordered data mode. Quota mode: none. Dec 12 17:24:32.975895 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 12 17:24:32.977923 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 12 17:24:32.982411 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 12 17:24:32.984903 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 12 17:24:33.000053 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Dec 12 17:24:33.005032 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 12 17:24:33.005070 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 12 17:24:33.008333 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 12 17:24:33.012126 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Dec 12 17:24:33.019968 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (847) Dec 12 17:24:33.023080 kernel: BTRFS info (device sda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:24:33.023142 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 12 17:24:33.031723 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 12 17:24:33.031802 kernel: BTRFS info (device sda6): turning on async discard Dec 12 17:24:33.031820 kernel: BTRFS info (device sda6): enabling free space tree Dec 12 17:24:33.035232 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 12 17:24:33.072406 initrd-setup-root[874]: cut: /sysroot/etc/passwd: No such file or directory Dec 12 17:24:33.078737 initrd-setup-root[881]: cut: /sysroot/etc/group: No such file or directory Dec 12 17:24:33.080358 coreos-metadata[849]: Dec 12 17:24:33.079 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Dec 12 17:24:33.082836 coreos-metadata[849]: Dec 12 17:24:33.080 INFO Fetch successful Dec 12 17:24:33.082836 coreos-metadata[849]: Dec 12 17:24:33.081 INFO wrote hostname ci-4459-2-2-1-a1e622265d to /sysroot/etc/hostname Dec 12 17:24:33.085363 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Dec 12 17:24:33.089321 initrd-setup-root[888]: cut: /sysroot/etc/shadow: No such file or directory Dec 12 17:24:33.095294 initrd-setup-root[896]: cut: /sysroot/etc/gshadow: No such file or directory Dec 12 17:24:33.200295 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 12 17:24:33.203704 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 12 17:24:33.205668 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 12 17:24:33.220847 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Dec 12 17:24:33.222677 kernel: BTRFS info (device sda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:24:33.246042 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 12 17:24:33.258678 ignition[963]: INFO : Ignition 2.22.0 Dec 12 17:24:33.260552 ignition[963]: INFO : Stage: mount Dec 12 17:24:33.260552 ignition[963]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 17:24:33.260552 ignition[963]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 12 17:24:33.260552 ignition[963]: INFO : mount: mount passed Dec 12 17:24:33.260552 ignition[963]: INFO : Ignition finished successfully Dec 12 17:24:33.264113 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 12 17:24:33.267445 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 12 17:24:33.291351 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 12 17:24:33.320449 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (976) Dec 12 17:24:33.320523 kernel: BTRFS info (device sda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:24:33.320547 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 12 17:24:33.326158 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 12 17:24:33.326270 kernel: BTRFS info (device sda6): turning on async discard Dec 12 17:24:33.326289 kernel: BTRFS info (device sda6): enabling free space tree Dec 12 17:24:33.328821 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Dec 12 17:24:33.370509 ignition[993]: INFO : Ignition 2.22.0 Dec 12 17:24:33.372522 ignition[993]: INFO : Stage: files Dec 12 17:24:33.372522 ignition[993]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 17:24:33.372522 ignition[993]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 12 17:24:33.372522 ignition[993]: DEBUG : files: compiled without relabeling support, skipping Dec 12 17:24:33.375435 ignition[993]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 12 17:24:33.376545 ignition[993]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 12 17:24:33.381283 ignition[993]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 12 17:24:33.383236 ignition[993]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 12 17:24:33.385146 unknown[993]: wrote ssh authorized keys file for user: core Dec 12 17:24:33.386455 ignition[993]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 12 17:24:33.391454 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 12 17:24:33.391454 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Dec 12 17:24:33.436909 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 12 17:24:33.514443 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 12 17:24:33.516472 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 12 17:24:33.516472 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 12 
17:24:33.516472 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 12 17:24:33.516472 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 12 17:24:33.516472 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 12 17:24:33.516472 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 12 17:24:33.516472 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 12 17:24:33.516472 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 12 17:24:33.527928 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 12 17:24:33.527928 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 12 17:24:33.527928 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 12 17:24:33.527928 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 12 17:24:33.527928 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 12 17:24:33.527928 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-arm64.raw: attempt #1 Dec 12 17:24:33.654956 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 12 17:24:34.178996 systemd-networkd[807]: eth0: Gained IPv6LL Dec 12 17:24:34.179427 systemd-networkd[807]: eth1: Gained IPv6LL Dec 12 17:24:34.237582 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 12 17:24:34.239965 ignition[993]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 12 17:24:34.239965 ignition[993]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 12 17:24:34.242579 ignition[993]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 12 17:24:34.242579 ignition[993]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 12 17:24:34.242579 ignition[993]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Dec 12 17:24:34.242579 ignition[993]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Dec 12 17:24:34.242579 ignition[993]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Dec 12 17:24:34.242579 ignition[993]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Dec 12 17:24:34.242579 ignition[993]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Dec 12 17:24:34.242579 ignition[993]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Dec 12 17:24:34.242579 ignition[993]: INFO : 
files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 12 17:24:34.242579 ignition[993]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 12 17:24:34.242579 ignition[993]: INFO : files: files passed Dec 12 17:24:34.242579 ignition[993]: INFO : Ignition finished successfully Dec 12 17:24:34.244712 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 12 17:24:34.249364 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 12 17:24:34.252878 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 12 17:24:34.272713 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 12 17:24:34.272824 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 12 17:24:34.283764 initrd-setup-root-after-ignition[1022]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 12 17:24:34.285147 initrd-setup-root-after-ignition[1022]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 12 17:24:34.286615 initrd-setup-root-after-ignition[1026]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 12 17:24:34.289245 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 12 17:24:34.290493 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 12 17:24:34.294439 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 12 17:24:34.355461 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 12 17:24:34.355740 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Dec 12 17:24:34.358286 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Dec 12 17:24:34.359583 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 12 17:24:34.360942 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 12 17:24:34.361827 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 12 17:24:34.404485 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 12 17:24:34.407974 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 12 17:24:34.431959 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 12 17:24:34.433758 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 17:24:34.435697 systemd[1]: Stopped target timers.target - Timer Units. Dec 12 17:24:34.437012 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 12 17:24:34.437798 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 12 17:24:34.439785 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 12 17:24:34.441149 systemd[1]: Stopped target basic.target - Basic System. Dec 12 17:24:34.441758 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 12 17:24:34.442921 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 12 17:24:34.444304 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 12 17:24:34.445987 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 12 17:24:34.447373 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 12 17:24:34.448518 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 12 17:24:34.449851 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 12 17:24:34.451144 systemd[1]: Stopped target local-fs.target - Local File Systems. 
Dec 12 17:24:34.452456 systemd[1]: Stopped target swap.target - Swaps. Dec 12 17:24:34.453462 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 12 17:24:34.453609 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 12 17:24:34.455135 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 12 17:24:34.455881 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 17:24:34.457151 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 12 17:24:34.460768 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 17:24:34.461766 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 12 17:24:34.461921 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 12 17:24:34.464012 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 12 17:24:34.464168 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 12 17:24:34.466644 systemd[1]: ignition-files.service: Deactivated successfully. Dec 12 17:24:34.466789 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 12 17:24:34.468906 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Dec 12 17:24:34.469127 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Dec 12 17:24:34.472748 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 12 17:24:34.473538 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Dec 12 17:24:34.473727 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 17:24:34.478902 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 12 17:24:34.480201 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Dec 12 17:24:34.481110 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. 
Dec 12 17:24:34.482792 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 12 17:24:34.483585 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 12 17:24:34.491284 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 12 17:24:34.494472 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 12 17:24:34.510676 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 12 17:24:34.518237 ignition[1046]: INFO : Ignition 2.22.0 Dec 12 17:24:34.519597 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 12 17:24:34.521119 ignition[1046]: INFO : Stage: umount Dec 12 17:24:34.521119 ignition[1046]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 17:24:34.521119 ignition[1046]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 12 17:24:34.519718 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 12 17:24:34.524429 ignition[1046]: INFO : umount: umount passed Dec 12 17:24:34.524429 ignition[1046]: INFO : Ignition finished successfully Dec 12 17:24:34.524950 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 12 17:24:34.526658 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 12 17:24:34.527962 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 12 17:24:34.528076 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 12 17:24:34.529721 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 12 17:24:34.529782 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 12 17:24:34.530834 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 12 17:24:34.530886 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 12 17:24:34.531939 systemd[1]: Stopped target network.target - Network. Dec 12 17:24:34.532931 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. 
Dec 12 17:24:34.532997 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 12 17:24:34.534087 systemd[1]: Stopped target paths.target - Path Units. Dec 12 17:24:34.535034 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 12 17:24:34.540801 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 17:24:34.542461 systemd[1]: Stopped target slices.target - Slice Units. Dec 12 17:24:34.544668 systemd[1]: Stopped target sockets.target - Socket Units. Dec 12 17:24:34.545679 systemd[1]: iscsid.socket: Deactivated successfully. Dec 12 17:24:34.545730 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 12 17:24:34.546942 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 12 17:24:34.546984 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 12 17:24:34.547942 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 12 17:24:34.548007 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 12 17:24:34.549237 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 12 17:24:34.549282 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 12 17:24:34.550281 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 12 17:24:34.550335 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 12 17:24:34.551463 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Dec 12 17:24:34.553337 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 12 17:24:34.561985 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 12 17:24:34.562119 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 12 17:24:34.566414 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. 
Dec 12 17:24:34.566799 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 12 17:24:34.566846 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 17:24:34.569940 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Dec 12 17:24:34.571978 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 12 17:24:34.572135 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 12 17:24:34.575691 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Dec 12 17:24:34.575904 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 12 17:24:34.577396 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 12 17:24:34.577440 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 12 17:24:34.579473 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 12 17:24:34.580083 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 12 17:24:34.580147 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 12 17:24:34.580911 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 12 17:24:34.580967 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 12 17:24:34.581722 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 12 17:24:34.581764 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 12 17:24:34.582999 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 17:24:34.588317 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 12 17:24:34.605711 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 12 17:24:34.606037 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Dec 12 17:24:34.609711 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 12 17:24:34.609771 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 12 17:24:34.612038 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 12 17:24:34.612078 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 17:24:34.614065 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 12 17:24:34.614137 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 12 17:24:34.617206 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 12 17:24:34.617274 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 12 17:24:34.618535 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 12 17:24:34.618600 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 12 17:24:34.621117 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Dec 12 17:24:34.622086 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 12 17:24:34.622159 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 17:24:34.623103 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 12 17:24:34.623151 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 17:24:34.625463 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 17:24:34.625533 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:24:34.640119 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 12 17:24:34.643683 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 12 17:24:34.646017 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. 
Dec 12 17:24:34.647085 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 12 17:24:34.648814 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 12 17:24:34.650760 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 12 17:24:34.684254 systemd[1]: Switching root. Dec 12 17:24:34.725587 systemd-journald[245]: Journal stopped Dec 12 17:24:35.741048 systemd-journald[245]: Received SIGTERM from PID 1 (systemd). Dec 12 17:24:35.741134 kernel: SELinux: policy capability network_peer_controls=1 Dec 12 17:24:35.741148 kernel: SELinux: policy capability open_perms=1 Dec 12 17:24:35.741158 kernel: SELinux: policy capability extended_socket_class=1 Dec 12 17:24:35.741202 kernel: SELinux: policy capability always_check_network=0 Dec 12 17:24:35.741214 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 12 17:24:35.741224 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 12 17:24:35.741233 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 12 17:24:35.741242 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 12 17:24:35.741250 kernel: SELinux: policy capability userspace_initial_context=0 Dec 12 17:24:35.741262 kernel: audit: type=1403 audit(1765560274.909:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Dec 12 17:24:35.741273 systemd[1]: Successfully loaded SELinux policy in 63.279ms. Dec 12 17:24:35.741293 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.143ms. Dec 12 17:24:35.741304 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 12 17:24:35.741315 systemd[1]: Detected virtualization kvm. Dec 12 17:24:35.741324 systemd[1]: Detected architecture arm64. 
Dec 12 17:24:35.741334 systemd[1]: Detected first boot.
Dec 12 17:24:35.741343 systemd[1]: Hostname set to .
Dec 12 17:24:35.741353 systemd[1]: Initializing machine ID from VM UUID.
Dec 12 17:24:35.741363 kernel: NET: Registered PF_VSOCK protocol family
Dec 12 17:24:35.741373 zram_generator::config[1090]: No configuration found.
Dec 12 17:24:35.741383 systemd[1]: Populated /etc with preset unit settings.
Dec 12 17:24:35.741395 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Dec 12 17:24:35.741409 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 12 17:24:35.741419 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Dec 12 17:24:35.741432 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 12 17:24:35.741444 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Dec 12 17:24:35.741454 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Dec 12 17:24:35.741463 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Dec 12 17:24:35.741473 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Dec 12 17:24:35.741483 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Dec 12 17:24:35.741494 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Dec 12 17:24:35.741505 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Dec 12 17:24:35.741514 systemd[1]: Created slice user.slice - User and Session Slice.
Dec 12 17:24:35.741525 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 12 17:24:35.741540 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 12 17:24:35.741552 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Dec 12 17:24:35.741562 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Dec 12 17:24:35.741572 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Dec 12 17:24:35.741582 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 12 17:24:35.741594 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Dec 12 17:24:35.741609 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 12 17:24:35.741631 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 12 17:24:35.741655 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Dec 12 17:24:35.741666 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Dec 12 17:24:35.741680 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Dec 12 17:24:35.741690 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Dec 12 17:24:35.741702 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 12 17:24:35.741717 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 12 17:24:35.741727 systemd[1]: Reached target slices.target - Slice Units.
Dec 12 17:24:35.741737 systemd[1]: Reached target swap.target - Swaps.
Dec 12 17:24:35.741747 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Dec 12 17:24:35.741756 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Dec 12 17:24:35.741766 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Dec 12 17:24:35.741777 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 12 17:24:35.741787 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 12 17:24:35.741796 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 12 17:24:35.741811 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Dec 12 17:24:35.741821 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Dec 12 17:24:35.741831 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Dec 12 17:24:35.741841 systemd[1]: Mounting media.mount - External Media Directory...
Dec 12 17:24:35.741851 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Dec 12 17:24:35.741861 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Dec 12 17:24:35.741871 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Dec 12 17:24:35.741881 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 12 17:24:35.741892 systemd[1]: Reached target machines.target - Containers.
Dec 12 17:24:35.741902 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Dec 12 17:24:35.741913 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 17:24:35.741923 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 12 17:24:35.741933 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Dec 12 17:24:35.741942 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 12 17:24:35.741952 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 12 17:24:35.741962 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 12 17:24:35.741972 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Dec 12 17:24:35.741984 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 12 17:24:35.741994 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Dec 12 17:24:35.742004 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 12 17:24:35.742015 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Dec 12 17:24:35.742026 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Dec 12 17:24:35.742036 systemd[1]: Stopped systemd-fsck-usr.service.
Dec 12 17:24:35.742046 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 17:24:35.742058 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 12 17:24:35.742068 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 12 17:24:35.742078 kernel: loop: module loaded
Dec 12 17:24:35.742088 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 12 17:24:35.742098 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Dec 12 17:24:35.742109 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Dec 12 17:24:35.742120 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 12 17:24:35.742130 systemd[1]: verity-setup.service: Deactivated successfully.
Dec 12 17:24:35.742140 systemd[1]: Stopped verity-setup.service.
Dec 12 17:24:35.742150 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Dec 12 17:24:35.742161 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Dec 12 17:24:35.742185 systemd[1]: Mounted media.mount - External Media Directory.
Dec 12 17:24:35.742196 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Dec 12 17:24:35.742206 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Dec 12 17:24:35.742217 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Dec 12 17:24:35.742227 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 12 17:24:35.742237 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 12 17:24:35.742247 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Dec 12 17:24:35.742257 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 12 17:24:35.742268 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 12 17:24:35.742282 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 12 17:24:35.742293 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 12 17:24:35.742303 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 12 17:24:35.742313 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 12 17:24:35.742323 kernel: ACPI: bus type drm_connector registered
Dec 12 17:24:35.742333 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 12 17:24:35.742343 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 12 17:24:35.742353 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Dec 12 17:24:35.742364 kernel: fuse: init (API version 7.41)
Dec 12 17:24:35.742375 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Dec 12 17:24:35.742385 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 12 17:24:35.742395 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Dec 12 17:24:35.742405 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Dec 12 17:24:35.742416 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Dec 12 17:24:35.742426 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Dec 12 17:24:35.742437 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 12 17:24:35.742447 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Dec 12 17:24:35.742460 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Dec 12 17:24:35.742472 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 17:24:35.742519 systemd-journald[1154]: Collecting audit messages is disabled.
Dec 12 17:24:35.742543 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Dec 12 17:24:35.742555 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 12 17:24:35.742566 systemd-journald[1154]: Journal started
Dec 12 17:24:35.742589 systemd-journald[1154]: Runtime Journal (/run/log/journal/a1de3e33d28f4b72bb43ab9e55a2f8d3) is 8M, max 76.5M, 68.5M free.
Dec 12 17:24:35.422862 systemd[1]: Queued start job for default target multi-user.target.
Dec 12 17:24:35.449926 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Dec 12 17:24:35.450477 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 12 17:24:35.750474 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Dec 12 17:24:35.750544 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 12 17:24:35.753854 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Dec 12 17:24:35.753923 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 12 17:24:35.756564 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Dec 12 17:24:35.757739 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 12 17:24:35.758829 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 12 17:24:35.760450 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Dec 12 17:24:35.761493 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Dec 12 17:24:35.776915 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 12 17:24:35.781910 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Dec 12 17:24:35.788064 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 12 17:24:35.793736 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Dec 12 17:24:35.806694 kernel: loop0: detected capacity change from 0 to 100632
Dec 12 17:24:35.808043 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Dec 12 17:24:35.808993 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Dec 12 17:24:35.813309 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Dec 12 17:24:35.834027 systemd-journald[1154]: Time spent on flushing to /var/log/journal/a1de3e33d28f4b72bb43ab9e55a2f8d3 is 27.873ms for 1174 entries.
Dec 12 17:24:35.834027 systemd-journald[1154]: System Journal (/var/log/journal/a1de3e33d28f4b72bb43ab9e55a2f8d3) is 8M, max 584.8M, 576.8M free.
Dec 12 17:24:35.869805 systemd-journald[1154]: Received client request to flush runtime journal.
Dec 12 17:24:35.869847 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Dec 12 17:24:35.873421 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Dec 12 17:24:35.891650 kernel: loop1: detected capacity change from 0 to 200800
Dec 12 17:24:35.891803 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 12 17:24:35.897309 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Dec 12 17:24:35.916701 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 17:24:35.921078 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Dec 12 17:24:35.923798 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 12 17:24:35.942652 kernel: loop2: detected capacity change from 0 to 119840
Dec 12 17:24:35.961132 systemd-tmpfiles[1228]: ACLs are not supported, ignoring.
Dec 12 17:24:35.961152 systemd-tmpfiles[1228]: ACLs are not supported, ignoring.
Dec 12 17:24:35.965290 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 12 17:24:35.972670 kernel: loop3: detected capacity change from 0 to 8
Dec 12 17:24:35.990755 kernel: loop4: detected capacity change from 0 to 100632
Dec 12 17:24:36.006932 kernel: loop5: detected capacity change from 0 to 200800
Dec 12 17:24:36.023684 kernel: loop6: detected capacity change from 0 to 119840
Dec 12 17:24:36.039672 kernel: loop7: detected capacity change from 0 to 8
Dec 12 17:24:36.040838 (sd-merge)[1233]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Dec 12 17:24:36.041292 (sd-merge)[1233]: Merged extensions into '/usr'.
Dec 12 17:24:36.047309 systemd[1]: Reload requested from client PID 1190 ('systemd-sysext') (unit systemd-sysext.service)...
Dec 12 17:24:36.047469 systemd[1]: Reloading...
Dec 12 17:24:36.182655 zram_generator::config[1262]: No configuration found.
Dec 12 17:24:36.264970 ldconfig[1187]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Dec 12 17:24:36.391743 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Dec 12 17:24:36.392270 systemd[1]: Reloading finished in 343 ms.
Dec 12 17:24:36.408574 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Dec 12 17:24:36.415949 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Dec 12 17:24:36.426833 systemd[1]: Starting ensure-sysext.service...
Dec 12 17:24:36.430771 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 12 17:24:36.456784 systemd[1]: Reload requested from client PID 1296 ('systemctl') (unit ensure-sysext.service)...
Dec 12 17:24:36.456802 systemd[1]: Reloading...
Dec 12 17:24:36.460918 systemd-tmpfiles[1297]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Dec 12 17:24:36.460964 systemd-tmpfiles[1297]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Dec 12 17:24:36.461354 systemd-tmpfiles[1297]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Dec 12 17:24:36.461711 systemd-tmpfiles[1297]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Dec 12 17:24:36.464021 systemd-tmpfiles[1297]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Dec 12 17:24:36.464311 systemd-tmpfiles[1297]: ACLs are not supported, ignoring.
Dec 12 17:24:36.464354 systemd-tmpfiles[1297]: ACLs are not supported, ignoring.
Dec 12 17:24:36.470531 systemd-tmpfiles[1297]: Detected autofs mount point /boot during canonicalization of boot.
Dec 12 17:24:36.470548 systemd-tmpfiles[1297]: Skipping /boot
Dec 12 17:24:36.483057 systemd-tmpfiles[1297]: Detected autofs mount point /boot during canonicalization of boot.
Dec 12 17:24:36.483072 systemd-tmpfiles[1297]: Skipping /boot
Dec 12 17:24:36.524649 zram_generator::config[1324]: No configuration found.
Dec 12 17:24:36.696586 systemd[1]: Reloading finished in 239 ms.
Dec 12 17:24:36.714083 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Dec 12 17:24:36.720656 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 17:24:36.728797 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Dec 12 17:24:36.732892 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Dec 12 17:24:36.736965 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Dec 12 17:24:36.745702 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 12 17:24:36.750431 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 17:24:36.753881 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Dec 12 17:24:36.763392 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 17:24:36.767009 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 12 17:24:36.775549 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 12 17:24:36.778607 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 12 17:24:36.779645 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 17:24:36.780013 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 17:24:36.785012 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 17:24:36.785362 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 17:24:36.785502 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 17:24:36.790842 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Dec 12 17:24:36.794493 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 17:24:36.805913 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 12 17:24:36.806899 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 17:24:36.807050 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 17:24:36.807904 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Dec 12 17:24:36.818358 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Dec 12 17:24:36.827756 systemd[1]: Finished ensure-sysext.service.
Dec 12 17:24:36.831605 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Dec 12 17:24:36.833103 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 12 17:24:36.833515 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 12 17:24:36.839110 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 12 17:24:36.841326 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 12 17:24:36.842544 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 12 17:24:36.850701 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Dec 12 17:24:36.852033 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 12 17:24:36.852390 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 12 17:24:36.856059 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 12 17:24:36.856267 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 12 17:24:36.863505 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 12 17:24:36.877244 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Dec 12 17:24:36.879895 systemd-udevd[1367]: Using default interface naming scheme 'v255'.
Dec 12 17:24:36.891668 augenrules[1403]: No rules
Dec 12 17:24:36.893374 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 12 17:24:36.894377 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 12 17:24:36.910216 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Dec 12 17:24:36.916776 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Dec 12 17:24:36.917908 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Dec 12 17:24:36.924439 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 17:24:36.929449 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 12 17:24:37.088482 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Dec 12 17:24:37.145759 systemd-networkd[1418]: lo: Link UP
Dec 12 17:24:37.145769 systemd-networkd[1418]: lo: Gained carrier
Dec 12 17:24:37.146612 systemd-networkd[1418]: Enumeration completed
Dec 12 17:24:37.147706 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 12 17:24:37.150500 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Dec 12 17:24:37.153967 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Dec 12 17:24:37.160216 systemd-resolved[1366]: Positive Trust Anchors:
Dec 12 17:24:37.160736 systemd-resolved[1366]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 12 17:24:37.160832 systemd-resolved[1366]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 12 17:24:37.168123 systemd-resolved[1366]: Using system hostname 'ci-4459-2-2-1-a1e622265d'.
Dec 12 17:24:37.171517 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 12 17:24:37.173095 systemd[1]: Reached target network.target - Network.
Dec 12 17:24:37.174707 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:24:37.188307 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Dec 12 17:24:37.199579 systemd-networkd[1418]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:24:37.199592 systemd-networkd[1418]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 12 17:24:37.201372 systemd-networkd[1418]: eth0: Link UP
Dec 12 17:24:37.201478 systemd-networkd[1418]: eth0: Gained carrier
Dec 12 17:24:37.201500 systemd-networkd[1418]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:24:37.230697 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Dec 12 17:24:37.231656 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 12 17:24:37.232801 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Dec 12 17:24:37.234135 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Dec 12 17:24:37.235502 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Dec 12 17:24:37.237758 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Dec 12 17:24:37.237795 systemd[1]: Reached target paths.target - Path Units.
Dec 12 17:24:37.238345 systemd[1]: Reached target time-set.target - System Time Set.
Dec 12 17:24:37.239195 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Dec 12 17:24:37.240286 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Dec 12 17:24:37.241042 systemd[1]: Reached target timers.target - Timer Units.
Dec 12 17:24:37.243863 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Dec 12 17:24:37.246229 systemd-networkd[1418]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:24:37.246241 systemd-networkd[1418]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 12 17:24:37.246983 systemd[1]: Starting docker.socket - Docker Socket for the API...
Dec 12 17:24:37.250180 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Dec 12 17:24:37.252394 systemd-networkd[1418]: eth1: Link UP
Dec 12 17:24:37.254046 systemd-networkd[1418]: eth1: Gained carrier
Dec 12 17:24:37.254074 systemd-networkd[1418]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:24:37.255525 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Dec 12 17:24:37.257919 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Dec 12 17:24:37.262953 systemd-networkd[1418]: eth0: DHCPv4 address 23.88.120.93/32, gateway 172.31.1.1 acquired from 172.31.1.1
Dec 12 17:24:37.265391 systemd-timesyncd[1393]: Network configuration changed, trying to establish connection.
Dec 12 17:24:37.267782 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Dec 12 17:24:37.269255 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Dec 12 17:24:37.273709 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Dec 12 17:24:37.275973 systemd[1]: Reached target sockets.target - Socket Units.
Dec 12 17:24:37.277326 systemd[1]: Reached target basic.target - Basic System.
Dec 12 17:24:37.278412 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Dec 12 17:24:37.278446 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Dec 12 17:24:37.281764 systemd[1]: Starting containerd.service - containerd container runtime...
Dec 12 17:24:37.283766 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Dec 12 17:24:37.289366 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Dec 12 17:24:37.295986 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Dec 12 17:24:37.297870 systemd-networkd[1418]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Dec 12 17:24:37.311872 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Dec 12 17:24:37.318871 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Dec 12 17:24:37.319485 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Dec 12 17:24:37.320636 kernel: mousedev: PS/2 mouse device common for all mice
Dec 12 17:24:37.323136 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Dec 12 17:24:37.332281 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Dec 12 17:24:37.338324 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Dec 12 17:24:37.357254 extend-filesystems[1474]: Found /dev/sda6
Dec 12 17:24:37.362210 extend-filesystems[1474]: Found /dev/sda9
Dec 12 17:24:37.366589 extend-filesystems[1474]: Checking size of /dev/sda9
Dec 12 17:24:37.368033 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Dec 12 17:24:37.373880 systemd[1]: Starting systemd-logind.service - User Login Management...
Dec 12 17:24:37.376953 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Dec 12 17:24:37.377560 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Dec 12 17:24:37.378443 systemd[1]: Starting update-engine.service - Update Engine...
Dec 12 17:24:37.387651 jq[1473]: false
Dec 12 17:24:37.384025 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Dec 12 17:24:37.391921 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Dec 12 17:24:37.392942 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Dec 12 17:24:37.393122 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Dec 12 17:24:37.394058 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Dec 12 17:24:37.394309 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Dec 12 17:24:37.406608 extend-filesystems[1474]: Resized partition /dev/sda9
Dec 12 17:24:37.409654 extend-filesystems[1498]: resize2fs 1.47.3 (8-Jul-2025)
Dec 12 17:24:37.421679 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Dec 12 17:24:37.431303 systemd-timesyncd[1393]: Contacted time server 193.99.165.217:123 (0.flatcar.pool.ntp.org).
Dec 12 17:24:37.431400 systemd-timesyncd[1393]: Initial clock synchronization to Fri 2025-12-12 17:24:37.177155 UTC.
Dec 12 17:24:37.434090 jq[1490]: true Dec 12 17:24:37.444373 systemd[1]: motdgen.service: Deactivated successfully. Dec 12 17:24:37.444637 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Dec 12 17:24:37.468841 tar[1493]: linux-arm64/LICENSE Dec 12 17:24:37.470093 tar[1493]: linux-arm64/helm Dec 12 17:24:37.479269 coreos-metadata[1470]: Dec 12 17:24:37.479 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Dec 12 17:24:37.483343 (ntainerd)[1508]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Dec 12 17:24:37.487410 coreos-metadata[1470]: Dec 12 17:24:37.486 INFO Fetch successful Dec 12 17:24:37.487410 coreos-metadata[1470]: Dec 12 17:24:37.486 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Dec 12 17:24:37.487410 coreos-metadata[1470]: Dec 12 17:24:37.486 INFO Fetch successful Dec 12 17:24:37.496665 jq[1507]: true Dec 12 17:24:37.532899 update_engine[1488]: I20251212 17:24:37.532447 1488 main.cc:92] Flatcar Update Engine starting Dec 12 17:24:37.550996 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Dec 12 17:24:37.565270 extend-filesystems[1498]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Dec 12 17:24:37.565270 extend-filesystems[1498]: old_desc_blocks = 1, new_desc_blocks = 5 Dec 12 17:24:37.565270 extend-filesystems[1498]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Dec 12 17:24:37.580029 extend-filesystems[1474]: Resized filesystem in /dev/sda9 Dec 12 17:24:37.572399 dbus-daemon[1471]: [system] SELinux support is enabled Dec 12 17:24:37.570139 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 12 17:24:37.570378 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 12 17:24:37.578117 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Dec 12 17:24:37.587199 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 12 17:24:37.587243 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 12 17:24:37.590966 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 12 17:24:37.590986 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 12 17:24:37.600563 systemd[1]: Started update-engine.service - Update Engine. Dec 12 17:24:37.603042 update_engine[1488]: I20251212 17:24:37.601866 1488 update_check_scheduler.cc:74] Next update check in 9m34s Dec 12 17:24:37.620946 systemd[1]: Started locksmithd.service - Cluster reboot manager. Dec 12 17:24:37.633522 bash[1538]: Updated "/home/core/.ssh/authorized_keys" Dec 12 17:24:37.643719 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 12 17:24:37.648855 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Dec 12 17:24:37.653799 systemd[1]: Starting sshkeys.service... Dec 12 17:24:37.655468 systemd-logind[1487]: New seat seat0. Dec 12 17:24:37.661228 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 12 17:24:37.663909 systemd[1]: Started systemd-logind.service - User Login Management. Dec 12 17:24:37.734018 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Dec 12 17:24:37.739432 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Dec 12 17:24:37.742694 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. 
Dec 12 17:24:37.746314 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 12 17:24:37.761311 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 12 17:24:37.855780 coreos-metadata[1553]: Dec 12 17:24:37.855 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Dec 12 17:24:37.872133 coreos-metadata[1553]: Dec 12 17:24:37.871 INFO Fetch successful Dec 12 17:24:37.875860 unknown[1553]: wrote ssh authorized keys file for user: core Dec 12 17:24:37.898104 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Dec 12 17:24:37.898203 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 12 17:24:37.898240 kernel: [drm] features: -context_init Dec 12 17:24:37.898254 kernel: [drm] number of scanouts: 1 Dec 12 17:24:37.898267 kernel: [drm] number of cap sets: 0 Dec 12 17:24:37.900653 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Dec 12 17:24:37.902372 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Dec 12 17:24:37.908022 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Dec 12 17:24:37.919472 kernel: Console: switching to colour frame buffer device 160x50 Dec 12 17:24:37.935656 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 12 17:24:37.949753 locksmithd[1539]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 12 17:24:37.977466 update-ssh-keys[1561]: Updated "/home/core/.ssh/authorized_keys" Dec 12 17:24:37.978045 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 12 17:24:37.986223 systemd[1]: Finished sshkeys.service. 
Dec 12 17:24:38.010534 containerd[1508]: time="2025-12-12T17:24:38Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 12 17:24:38.015316 containerd[1508]: time="2025-12-12T17:24:38.013969146Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Dec 12 17:24:38.032905 containerd[1508]: time="2025-12-12T17:24:38.032612545Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.968µs" Dec 12 17:24:38.032905 containerd[1508]: time="2025-12-12T17:24:38.032895195Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 12 17:24:38.033019 containerd[1508]: time="2025-12-12T17:24:38.032921221Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 12 17:24:38.034947 containerd[1508]: time="2025-12-12T17:24:38.034606161Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 12 17:24:38.034947 containerd[1508]: time="2025-12-12T17:24:38.034659918Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 12 17:24:38.034947 containerd[1508]: time="2025-12-12T17:24:38.034695743Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 12 17:24:38.034947 containerd[1508]: time="2025-12-12T17:24:38.034798958Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 12 17:24:38.034947 containerd[1508]: time="2025-12-12T17:24:38.034812901Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 12 
17:24:38.035254 containerd[1508]: time="2025-12-12T17:24:38.035216620Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 12 17:24:38.035254 containerd[1508]: time="2025-12-12T17:24:38.035242608Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 12 17:24:38.035298 containerd[1508]: time="2025-12-12T17:24:38.035266233Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 12 17:24:38.035298 containerd[1508]: time="2025-12-12T17:24:38.035274444Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 12 17:24:38.035379 containerd[1508]: time="2025-12-12T17:24:38.035359727Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 12 17:24:38.037213 containerd[1508]: time="2025-12-12T17:24:38.035552523Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 12 17:24:38.037213 containerd[1508]: time="2025-12-12T17:24:38.035599541Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 12 17:24:38.037213 containerd[1508]: time="2025-12-12T17:24:38.035623980Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 12 17:24:38.037213 containerd[1508]: time="2025-12-12T17:24:38.035658914Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 12 17:24:38.037213 
containerd[1508]: time="2025-12-12T17:24:38.035892184Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 12 17:24:38.037213 containerd[1508]: time="2025-12-12T17:24:38.035954229Z" level=info msg="metadata content store policy set" policy=shared Dec 12 17:24:38.042386 containerd[1508]: time="2025-12-12T17:24:38.041022719Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 12 17:24:38.042386 containerd[1508]: time="2025-12-12T17:24:38.041262650Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 12 17:24:38.042386 containerd[1508]: time="2025-12-12T17:24:38.041482365Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 12 17:24:38.042386 containerd[1508]: time="2025-12-12T17:24:38.041506648Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 12 17:24:38.042386 containerd[1508]: time="2025-12-12T17:24:38.041520087Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 12 17:24:38.042386 containerd[1508]: time="2025-12-12T17:24:38.041570475Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 12 17:24:38.042386 containerd[1508]: time="2025-12-12T17:24:38.041592473Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 12 17:24:38.042386 containerd[1508]: time="2025-12-12T17:24:38.041605487Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 12 17:24:38.042386 containerd[1508]: time="2025-12-12T17:24:38.041875588Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 12 17:24:38.042386 containerd[1508]: 
time="2025-12-12T17:24:38.041889570Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 12 17:24:38.042386 containerd[1508]: time="2025-12-12T17:24:38.041901034Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 12 17:24:38.042386 containerd[1508]: time="2025-12-12T17:24:38.041923187Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.042751579Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.042837675Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.042854600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.042869511Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.042882524Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.042894220Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.042904755Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.042915096Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.042933105Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.042947861Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.042957776Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.043131479Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.043145693Z" level=info msg="Start snapshots syncer" Dec 12 17:24:38.044622 containerd[1508]: time="2025-12-12T17:24:38.043187444Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 12 17:24:38.044915 containerd[1508]: time="2025-12-12T17:24:38.043712194Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 12 17:24:38.044915 containerd[1508]: time="2025-12-12T17:24:38.043774510Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 12 17:24:38.045015 containerd[1508]: time="2025-12-12T17:24:38.043836826Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 12 17:24:38.045015 containerd[1508]: time="2025-12-12T17:24:38.044071722Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 12 17:24:38.045015 containerd[1508]: time="2025-12-12T17:24:38.044095580Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 12 17:24:38.045015 containerd[1508]: time="2025-12-12T17:24:38.044108438Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 12 17:24:38.045015 containerd[1508]: time="2025-12-12T17:24:38.044121839Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 12 17:24:38.045015 containerd[1508]: time="2025-12-12T17:24:38.044134774Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 12 17:24:38.045015 containerd[1508]: time="2025-12-12T17:24:38.044146355Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 12 17:24:38.045015 containerd[1508]: time="2025-12-12T17:24:38.044165874Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 12 17:24:38.045015 containerd[1508]: time="2025-12-12T17:24:38.044195619Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 12 17:24:38.045015 containerd[1508]: time="2025-12-12T17:24:38.044208439Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 12 17:24:38.045015 containerd[1508]: time="2025-12-12T17:24:38.044221413Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 12 17:24:38.045462 containerd[1508]: time="2025-12-12T17:24:38.045240471Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 12 17:24:38.045462 containerd[1508]: time="2025-12-12T17:24:38.045330479Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 12 17:24:38.045462 containerd[1508]: time="2025-12-12T17:24:38.045343067Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 12 17:24:38.045462 containerd[1508]: time="2025-12-12T17:24:38.045352672Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 12 17:24:38.045462 containerd[1508]: time="2025-12-12T17:24:38.045360727Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 12 17:24:38.045462 containerd[1508]: time="2025-12-12T17:24:38.045407861Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 12 17:24:38.045462 containerd[1508]: time="2025-12-12T17:24:38.045423315Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 12 17:24:38.045624 containerd[1508]: time="2025-12-12T17:24:38.045559489Z" level=info msg="runtime interface created" Dec 12 17:24:38.045624 containerd[1508]: time="2025-12-12T17:24:38.045565608Z" level=info msg="created NRI interface" Dec 12 17:24:38.045624 containerd[1508]: time="2025-12-12T17:24:38.045574322Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 12 17:24:38.045624 containerd[1508]: time="2025-12-12T17:24:38.045589543Z" level=info msg="Connect containerd service" Dec 12 17:24:38.045714 containerd[1508]: time="2025-12-12T17:24:38.045633153Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 12 17:24:38.047043 
containerd[1508]: time="2025-12-12T17:24:38.046993266Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 12 17:24:38.247918 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:24:38.286328 systemd-logind[1487]: Watching system buttons on /dev/input/event0 (Power Button) Dec 12 17:24:38.299657 systemd-logind[1487]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Dec 12 17:24:38.305726 containerd[1508]: time="2025-12-12T17:24:38.305655493Z" level=info msg="Start subscribing containerd event" Dec 12 17:24:38.305825 containerd[1508]: time="2025-12-12T17:24:38.305740737Z" level=info msg="Start recovering state" Dec 12 17:24:38.305862 containerd[1508]: time="2025-12-12T17:24:38.305829855Z" level=info msg="Start event monitor" Dec 12 17:24:38.305862 containerd[1508]: time="2025-12-12T17:24:38.305844339Z" level=info msg="Start cni network conf syncer for default" Dec 12 17:24:38.305862 containerd[1508]: time="2025-12-12T17:24:38.305851776Z" level=info msg="Start streaming server" Dec 12 17:24:38.305862 containerd[1508]: time="2025-12-12T17:24:38.305861032Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 12 17:24:38.305932 containerd[1508]: time="2025-12-12T17:24:38.305868391Z" level=info msg="runtime interface starting up..." Dec 12 17:24:38.305932 containerd[1508]: time="2025-12-12T17:24:38.305876369Z" level=info msg="starting plugins..." Dec 12 17:24:38.305932 containerd[1508]: time="2025-12-12T17:24:38.305890544Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 12 17:24:38.308415 containerd[1508]: time="2025-12-12T17:24:38.308369290Z" level=info msg=serving... 
address=/run/containerd/containerd.sock.ttrpc Dec 12 17:24:38.308501 containerd[1508]: time="2025-12-12T17:24:38.308440940Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 12 17:24:38.308598 systemd[1]: Started containerd.service - containerd container runtime. Dec 12 17:24:38.311130 containerd[1508]: time="2025-12-12T17:24:38.310859848Z" level=info msg="containerd successfully booted in 0.301466s" Dec 12 17:24:38.361728 tar[1493]: linux-arm64/README.md Dec 12 17:24:38.405679 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 12 17:24:38.410738 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 17:24:38.411420 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:24:38.419510 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 12 17:24:38.427787 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:24:38.468476 systemd-networkd[1418]: eth1: Gained IPv6LL Dec 12 17:24:38.475708 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 12 17:24:38.478295 systemd[1]: Reached target network-online.target - Network is Online. Dec 12 17:24:38.484031 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:24:38.489322 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 12 17:24:38.517558 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:24:38.574062 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 12 17:24:39.234753 systemd-networkd[1418]: eth0: Gained IPv6LL Dec 12 17:24:39.320579 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:24:39.333311 (kubelet)[1635]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:24:39.752394 kubelet[1635]: E1212 17:24:39.752349 1635 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:24:39.755042 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:24:39.755302 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:24:39.757777 systemd[1]: kubelet.service: Consumed 804ms CPU time, 246.4M memory peak. Dec 12 17:24:40.056847 sshd_keygen[1506]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 12 17:24:40.080997 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 12 17:24:40.084867 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 12 17:24:40.108291 systemd[1]: issuegen.service: Deactivated successfully. Dec 12 17:24:40.108568 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 12 17:24:40.111992 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 12 17:24:40.137572 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 12 17:24:40.145810 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 12 17:24:40.149970 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Dec 12 17:24:40.150923 systemd[1]: Reached target getty.target - Login Prompts. Dec 12 17:24:40.151717 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 12 17:24:40.153016 systemd[1]: Startup finished in 2.321s (kernel) + 5.288s (initrd) + 5.306s (userspace) = 12.917s. 
Dec 12 17:24:50.006151 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 12 17:24:50.009468 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:24:50.181501 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:24:50.193338 (kubelet)[1670]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:24:50.241793 kubelet[1670]: E1212 17:24:50.241729 1670 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:24:50.245338 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:24:50.245520 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:24:50.246311 systemd[1]: kubelet.service: Consumed 176ms CPU time, 106.2M memory peak. Dec 12 17:25:00.496467 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 12 17:25:00.501111 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:00.678454 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:25:00.690146 (kubelet)[1686]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:25:00.733988 kubelet[1686]: E1212 17:25:00.733925 1686 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:25:00.736829 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:25:00.737120 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:25:00.737808 systemd[1]: kubelet.service: Consumed 176ms CPU time, 106.4M memory peak. Dec 12 17:25:01.317703 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 12 17:25:01.319499 systemd[1]: Started sshd@0-23.88.120.93:22-49.124.152.249:37802.service - OpenSSH per-connection server daemon (49.124.152.249:37802). Dec 12 17:25:01.593779 sshd[1695]: Connection closed by 49.124.152.249 port 37802 Dec 12 17:25:01.596434 systemd[1]: sshd@0-23.88.120.93:22-49.124.152.249:37802.service: Deactivated successfully. Dec 12 17:25:10.987904 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 12 17:25:10.990646 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:11.168975 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:25:11.179041 (kubelet)[1707]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:25:11.224111 kubelet[1707]: E1212 17:25:11.223940 1707 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:25:11.226740 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:25:11.226983 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:25:11.227547 systemd[1]: kubelet.service: Consumed 176ms CPU time, 106.9M memory peak. Dec 12 17:25:12.388663 systemd[1]: Started sshd@1-23.88.120.93:22-139.178.89.65:39300.service - OpenSSH per-connection server daemon (139.178.89.65:39300). Dec 12 17:25:13.376789 sshd[1714]: Accepted publickey for core from 139.178.89.65 port 39300 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:13.380143 sshd-session[1714]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:13.401179 systemd-logind[1487]: New session 1 of user core. Dec 12 17:25:13.402445 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 12 17:25:13.405353 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 12 17:25:13.428805 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 12 17:25:13.432076 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 12 17:25:13.447742 (systemd)[1719]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 12 17:25:13.451954 systemd-logind[1487]: New session c1 of user core. 
Dec 12 17:25:13.583901 systemd[1719]: Queued start job for default target default.target. Dec 12 17:25:13.595938 systemd[1719]: Created slice app.slice - User Application Slice. Dec 12 17:25:13.596003 systemd[1719]: Reached target paths.target - Paths. Dec 12 17:25:13.596076 systemd[1719]: Reached target timers.target - Timers. Dec 12 17:25:13.598811 systemd[1719]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 12 17:25:13.631659 systemd[1719]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 12 17:25:13.631967 systemd[1719]: Reached target sockets.target - Sockets. Dec 12 17:25:13.632116 systemd[1719]: Reached target basic.target - Basic System. Dec 12 17:25:13.632229 systemd[1719]: Reached target default.target - Main User Target. Dec 12 17:25:13.632340 systemd[1719]: Startup finished in 172ms. Dec 12 17:25:13.632354 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 12 17:25:13.644008 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 12 17:25:14.325729 systemd[1]: Started sshd@2-23.88.120.93:22-139.178.89.65:39304.service - OpenSSH per-connection server daemon (139.178.89.65:39304). Dec 12 17:25:15.324351 sshd[1730]: Accepted publickey for core from 139.178.89.65 port 39304 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:15.326463 sshd-session[1730]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:15.333364 systemd-logind[1487]: New session 2 of user core. Dec 12 17:25:15.339023 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 12 17:25:16.010484 sshd[1733]: Connection closed by 139.178.89.65 port 39304 Dec 12 17:25:16.011385 sshd-session[1730]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:16.019889 systemd-logind[1487]: Session 2 logged out. Waiting for processes to exit. Dec 12 17:25:16.020012 systemd[1]: sshd@2-23.88.120.93:22-139.178.89.65:39304.service: Deactivated successfully. 
Dec 12 17:25:16.023020 systemd[1]: session-2.scope: Deactivated successfully. Dec 12 17:25:16.026065 systemd-logind[1487]: Removed session 2. Dec 12 17:25:16.184879 systemd[1]: Started sshd@3-23.88.120.93:22-139.178.89.65:39320.service - OpenSSH per-connection server daemon (139.178.89.65:39320). Dec 12 17:25:17.182203 sshd[1739]: Accepted publickey for core from 139.178.89.65 port 39320 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:17.184867 sshd-session[1739]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:17.192724 systemd-logind[1487]: New session 3 of user core. Dec 12 17:25:17.211081 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 12 17:25:17.851296 sshd[1742]: Connection closed by 139.178.89.65 port 39320 Dec 12 17:25:17.852288 sshd-session[1739]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:17.858234 systemd[1]: sshd@3-23.88.120.93:22-139.178.89.65:39320.service: Deactivated successfully. Dec 12 17:25:17.861896 systemd[1]: session-3.scope: Deactivated successfully. Dec 12 17:25:17.862856 systemd-logind[1487]: Session 3 logged out. Waiting for processes to exit. Dec 12 17:25:17.864389 systemd-logind[1487]: Removed session 3. Dec 12 17:25:18.018744 systemd[1]: Started sshd@4-23.88.120.93:22-139.178.89.65:39334.service - OpenSSH per-connection server daemon (139.178.89.65:39334). Dec 12 17:25:19.008842 sshd[1748]: Accepted publickey for core from 139.178.89.65 port 39334 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:19.011198 sshd-session[1748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:19.017064 systemd-logind[1487]: New session 4 of user core. Dec 12 17:25:19.029062 systemd[1]: Started session-4.scope - Session 4 of User core. 
Dec 12 17:25:19.686166 sshd[1751]: Connection closed by 139.178.89.65 port 39334 Dec 12 17:25:19.687071 sshd-session[1748]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:19.692661 systemd[1]: sshd@4-23.88.120.93:22-139.178.89.65:39334.service: Deactivated successfully. Dec 12 17:25:19.695257 systemd[1]: session-4.scope: Deactivated successfully. Dec 12 17:25:19.698611 systemd-logind[1487]: Session 4 logged out. Waiting for processes to exit. Dec 12 17:25:19.700143 systemd-logind[1487]: Removed session 4. Dec 12 17:25:19.857163 systemd[1]: Started sshd@5-23.88.120.93:22-139.178.89.65:39342.service - OpenSSH per-connection server daemon (139.178.89.65:39342). Dec 12 17:25:20.827145 sshd[1757]: Accepted publickey for core from 139.178.89.65 port 39342 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:20.829041 sshd-session[1757]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:20.833906 systemd-logind[1487]: New session 5 of user core. Dec 12 17:25:20.849452 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 12 17:25:21.345240 sudo[1761]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 12 17:25:21.345528 sudo[1761]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:25:21.346932 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Dec 12 17:25:21.349821 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:21.360138 sudo[1761]: pam_unix(sudo:session): session closed for user root Dec 12 17:25:21.515045 sshd[1760]: Connection closed by 139.178.89.65 port 39342 Dec 12 17:25:21.515710 sshd-session[1757]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:21.521381 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:25:21.521979 systemd[1]: sshd@5-23.88.120.93:22-139.178.89.65:39342.service: Deactivated successfully. Dec 12 17:25:21.523540 systemd[1]: session-5.scope: Deactivated successfully. Dec 12 17:25:21.527299 systemd-logind[1487]: Session 5 logged out. Waiting for processes to exit. Dec 12 17:25:21.528665 systemd-logind[1487]: Removed session 5. Dec 12 17:25:21.534202 (kubelet)[1770]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:25:21.578537 kubelet[1770]: E1212 17:25:21.578460 1770 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:25:21.581201 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:25:21.581337 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:25:21.581900 systemd[1]: kubelet.service: Consumed 165ms CPU time, 104.8M memory peak. Dec 12 17:25:21.684660 systemd[1]: Started sshd@6-23.88.120.93:22-139.178.89.65:58472.service - OpenSSH per-connection server daemon (139.178.89.65:58472). Dec 12 17:25:22.677452 sshd[1782]: Accepted publickey for core from 139.178.89.65 port 58472 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:22.679308 sshd-session[1782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:22.685294 systemd-logind[1487]: New session 6 of user core. Dec 12 17:25:22.693967 systemd[1]: Started session-6.scope - Session 6 of User core. 
Dec 12 17:25:23.189184 sudo[1787]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 12 17:25:23.189919 sudo[1787]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:25:23.196921 sudo[1787]: pam_unix(sudo:session): session closed for user root Dec 12 17:25:23.204138 sudo[1786]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 12 17:25:23.204473 sudo[1786]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:25:23.219144 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 17:25:23.235826 update_engine[1488]: I20251212 17:25:23.235728 1488 update_attempter.cc:509] Updating boot flags... Dec 12 17:25:23.277661 augenrules[1817]: No rules Dec 12 17:25:23.278271 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 17:25:23.279721 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 17:25:23.281869 sudo[1786]: pam_unix(sudo:session): session closed for user root Dec 12 17:25:23.438690 sshd[1785]: Connection closed by 139.178.89.65 port 58472 Dec 12 17:25:23.440072 sshd-session[1782]: pam_unix(sshd:session): session closed for user core Dec 12 17:25:23.448065 systemd[1]: sshd@6-23.88.120.93:22-139.178.89.65:58472.service: Deactivated successfully. Dec 12 17:25:23.450310 systemd[1]: session-6.scope: Deactivated successfully. Dec 12 17:25:23.453375 systemd-logind[1487]: Session 6 logged out. Waiting for processes to exit. Dec 12 17:25:23.455173 systemd-logind[1487]: Removed session 6. Dec 12 17:25:23.611975 systemd[1]: Started sshd@7-23.88.120.93:22-139.178.89.65:58478.service - OpenSSH per-connection server daemon (139.178.89.65:58478). 
Dec 12 17:25:24.618216 sshd[1838]: Accepted publickey for core from 139.178.89.65 port 58478 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:25:24.620450 sshd-session[1838]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:25:24.625859 systemd-logind[1487]: New session 7 of user core. Dec 12 17:25:24.634016 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 12 17:25:25.145925 sudo[1842]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 12 17:25:25.146771 sudo[1842]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:25:25.466344 systemd[1]: Starting docker.service - Docker Application Container Engine... Dec 12 17:25:25.481520 (dockerd)[1861]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 12 17:25:25.718276 dockerd[1861]: time="2025-12-12T17:25:25.718026315Z" level=info msg="Starting up" Dec 12 17:25:25.720719 dockerd[1861]: time="2025-12-12T17:25:25.720682036Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 12 17:25:25.736118 dockerd[1861]: time="2025-12-12T17:25:25.735983911Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 12 17:25:25.757095 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3500865547-merged.mount: Deactivated successfully. Dec 12 17:25:25.780572 dockerd[1861]: time="2025-12-12T17:25:25.780510516Z" level=info msg="Loading containers: start." Dec 12 17:25:25.790687 kernel: Initializing XFRM netlink socket Dec 12 17:25:26.054234 systemd-networkd[1418]: docker0: Link UP Dec 12 17:25:26.060028 dockerd[1861]: time="2025-12-12T17:25:26.059962772Z" level=info msg="Loading containers: done." 
Dec 12 17:25:26.077235 dockerd[1861]: time="2025-12-12T17:25:26.077189744Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 12 17:25:26.077418 dockerd[1861]: time="2025-12-12T17:25:26.077286026Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 12 17:25:26.077418 dockerd[1861]: time="2025-12-12T17:25:26.077373507Z" level=info msg="Initializing buildkit" Dec 12 17:25:26.109643 dockerd[1861]: time="2025-12-12T17:25:26.109571979Z" level=info msg="Completed buildkit initialization" Dec 12 17:25:26.118680 dockerd[1861]: time="2025-12-12T17:25:26.118585191Z" level=info msg="Daemon has completed initialization" Dec 12 17:25:26.118809 dockerd[1861]: time="2025-12-12T17:25:26.118694473Z" level=info msg="API listen on /run/docker.sock" Dec 12 17:25:26.119037 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 12 17:25:26.881868 containerd[1508]: time="2025-12-12T17:25:26.881809860Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Dec 12 17:25:27.516471 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount706413767.mount: Deactivated successfully. 
Dec 12 17:25:28.479985 containerd[1508]: time="2025-12-12T17:25:28.479704378Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:28.481353 containerd[1508]: time="2025-12-12T17:25:28.481298999Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=24571138" Dec 12 17:25:28.482376 containerd[1508]: time="2025-12-12T17:25:28.482321173Z" level=info msg="ImageCreate event name:\"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:28.484930 containerd[1508]: time="2025-12-12T17:25:28.484853007Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:28.486229 containerd[1508]: time="2025-12-12T17:25:28.485925301Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"24567639\" in 1.604054801s" Dec 12 17:25:28.486229 containerd[1508]: time="2025-12-12T17:25:28.485973102Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\"" Dec 12 17:25:28.486573 containerd[1508]: time="2025-12-12T17:25:28.486483188Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Dec 12 17:25:29.699135 containerd[1508]: time="2025-12-12T17:25:29.699075956Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:29.700400 containerd[1508]: time="2025-12-12T17:25:29.700194490Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=19135497" Dec 12 17:25:29.701610 containerd[1508]: time="2025-12-12T17:25:29.701558747Z" level=info msg="ImageCreate event name:\"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:29.704505 containerd[1508]: time="2025-12-12T17:25:29.704452224Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:29.705840 containerd[1508]: time="2025-12-12T17:25:29.705798401Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"20719958\" in 1.219282092s" Dec 12 17:25:29.706059 containerd[1508]: time="2025-12-12T17:25:29.705959323Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\"" Dec 12 17:25:29.707063 containerd[1508]: time="2025-12-12T17:25:29.707017297Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Dec 12 17:25:30.777373 containerd[1508]: time="2025-12-12T17:25:30.777299114Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:30.778660 containerd[1508]: time="2025-12-12T17:25:30.778517649Z" 
level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=14191736" Dec 12 17:25:30.779977 containerd[1508]: time="2025-12-12T17:25:30.779924106Z" level=info msg="ImageCreate event name:\"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:30.784715 containerd[1508]: time="2025-12-12T17:25:30.784602123Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:30.788639 containerd[1508]: time="2025-12-12T17:25:30.788268448Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"15776215\" in 1.081208831s" Dec 12 17:25:30.788639 containerd[1508]: time="2025-12-12T17:25:30.788332729Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\"" Dec 12 17:25:30.790287 containerd[1508]: time="2025-12-12T17:25:30.790258272Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Dec 12 17:25:31.690907 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Dec 12 17:25:31.693698 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:31.761507 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount99531026.mount: Deactivated successfully. Dec 12 17:25:31.867400 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:25:31.877032 (kubelet)[2151]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:25:31.923080 kubelet[2151]: E1212 17:25:31.923024 2151 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:25:31.925234 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:25:31.925369 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:25:31.927025 systemd[1]: kubelet.service: Consumed 166ms CPU time, 106.7M memory peak. Dec 12 17:25:32.070793 containerd[1508]: time="2025-12-12T17:25:32.070642931Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:32.072500 containerd[1508]: time="2025-12-12T17:25:32.072455872Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=22805279" Dec 12 17:25:32.073422 containerd[1508]: time="2025-12-12T17:25:32.073375482Z" level=info msg="ImageCreate event name:\"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:32.076689 containerd[1508]: time="2025-12-12T17:25:32.076167353Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:32.076814 containerd[1508]: time="2025-12-12T17:25:32.076771480Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id 
\"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"22804272\" in 1.286354846s" Dec 12 17:25:32.076844 containerd[1508]: time="2025-12-12T17:25:32.076826561Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\"" Dec 12 17:25:32.077464 containerd[1508]: time="2025-12-12T17:25:32.077426087Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Dec 12 17:25:32.674073 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3437893425.mount: Deactivated successfully. Dec 12 17:25:33.403843 containerd[1508]: time="2025-12-12T17:25:33.403204080Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:33.405141 containerd[1508]: time="2025-12-12T17:25:33.405085540Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=20395498" Dec 12 17:25:33.406278 containerd[1508]: time="2025-12-12T17:25:33.405734867Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:33.409497 containerd[1508]: time="2025-12-12T17:25:33.409451906Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:33.411777 containerd[1508]: time="2025-12-12T17:25:33.411733611Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", 
repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.334258163s" Dec 12 17:25:33.411942 containerd[1508]: time="2025-12-12T17:25:33.411926853Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\"" Dec 12 17:25:33.412528 containerd[1508]: time="2025-12-12T17:25:33.412471939Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Dec 12 17:25:33.985314 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2539599424.mount: Deactivated successfully. Dec 12 17:25:33.992189 containerd[1508]: time="2025-12-12T17:25:33.991408982Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:33.992655 containerd[1508]: time="2025-12-12T17:25:33.992598995Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268729" Dec 12 17:25:33.994323 containerd[1508]: time="2025-12-12T17:25:33.994287173Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:33.997286 containerd[1508]: time="2025-12-12T17:25:33.997220205Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:33.998987 containerd[1508]: time="2025-12-12T17:25:33.998928663Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest 
\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 586.246442ms" Dec 12 17:25:33.998987 containerd[1508]: time="2025-12-12T17:25:33.998980503Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Dec 12 17:25:33.999538 containerd[1508]: time="2025-12-12T17:25:33.999506029Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Dec 12 17:25:34.632448 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1016295701.mount: Deactivated successfully. Dec 12 17:25:37.249754 containerd[1508]: time="2025-12-12T17:25:37.249687503Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:37.250961 containerd[1508]: time="2025-12-12T17:25:37.250824753Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=98063043" Dec 12 17:25:37.251738 containerd[1508]: time="2025-12-12T17:25:37.251702001Z" level=info msg="ImageCreate event name:\"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:37.254757 containerd[1508]: time="2025-12-12T17:25:37.254687428Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:25:37.256683 containerd[1508]: time="2025-12-12T17:25:37.255703078Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"98207481\" in 3.256156049s" Dec 12 
17:25:37.256683 containerd[1508]: time="2025-12-12T17:25:37.255743238Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\"" Dec 12 17:25:41.941259 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Dec 12 17:25:41.945793 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:42.111785 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:25:42.123570 (kubelet)[2296]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:25:42.173609 kubelet[2296]: E1212 17:25:42.173560 2296 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:25:42.176767 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:25:42.177051 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:25:42.177783 systemd[1]: kubelet.service: Consumed 165ms CPU time, 104.7M memory peak. Dec 12 17:25:43.061292 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:25:43.061863 systemd[1]: kubelet.service: Consumed 165ms CPU time, 104.7M memory peak. Dec 12 17:25:43.064894 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:43.103709 systemd[1]: Reload requested from client PID 2310 ('systemctl') (unit session-7.scope)... Dec 12 17:25:43.103727 systemd[1]: Reloading... Dec 12 17:25:43.236665 zram_generator::config[2354]: No configuration found. Dec 12 17:25:43.434145 systemd[1]: Reloading finished in 330 ms. 
Dec 12 17:25:43.490398 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 12 17:25:43.490501 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 12 17:25:43.491159 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:25:43.491227 systemd[1]: kubelet.service: Consumed 119ms CPU time, 94.9M memory peak. Dec 12 17:25:43.493592 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:43.645847 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:25:43.655528 (kubelet)[2401]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 17:25:43.699229 kubelet[2401]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 17:25:43.699571 kubelet[2401]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:25:43.699764 kubelet[2401]: I1212 17:25:43.699729 2401 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 17:25:43.930881 kubelet[2401]: I1212 17:25:43.930823 2401 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 12 17:25:43.930881 kubelet[2401]: I1212 17:25:43.930864 2401 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 17:25:43.930881 kubelet[2401]: I1212 17:25:43.930893 2401 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 12 17:25:43.930881 kubelet[2401]: I1212 17:25:43.930899 2401 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Dec 12 17:25:43.931238 kubelet[2401]: I1212 17:25:43.931197 2401 server.go:956] "Client rotation is on, will bootstrap in background" Dec 12 17:25:43.941484 kubelet[2401]: I1212 17:25:43.940790 2401 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 17:25:43.941484 kubelet[2401]: E1212 17:25:43.941363 2401 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://23.88.120.93:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 23.88.120.93:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Dec 12 17:25:43.951852 kubelet[2401]: I1212 17:25:43.951466 2401 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 17:25:43.954870 kubelet[2401]: I1212 17:25:43.954835 2401 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Dec 12 17:25:43.955112 kubelet[2401]: I1212 17:25:43.955078 2401 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 17:25:43.955287 kubelet[2401]: I1212 17:25:43.955113 2401 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-1-a1e622265d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 17:25:43.955391 kubelet[2401]: I1212 17:25:43.955290 2401 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 
17:25:43.955391 kubelet[2401]: I1212 17:25:43.955299 2401 container_manager_linux.go:306] "Creating device plugin manager" Dec 12 17:25:43.955445 kubelet[2401]: I1212 17:25:43.955430 2401 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 12 17:25:43.958280 kubelet[2401]: I1212 17:25:43.958245 2401 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:25:43.959750 kubelet[2401]: I1212 17:25:43.959715 2401 kubelet.go:475] "Attempting to sync node with API server" Dec 12 17:25:43.959750 kubelet[2401]: I1212 17:25:43.959747 2401 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 17:25:43.960521 kubelet[2401]: E1212 17:25:43.960472 2401 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://23.88.120.93:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-1-a1e622265d&limit=500&resourceVersion=0\": dial tcp 23.88.120.93:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 12 17:25:43.960879 kubelet[2401]: I1212 17:25:43.960853 2401 kubelet.go:387] "Adding apiserver pod source" Dec 12 17:25:43.960943 kubelet[2401]: I1212 17:25:43.960889 2401 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 17:25:43.962881 kubelet[2401]: E1212 17:25:43.962699 2401 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://23.88.120.93:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 23.88.120.93:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 12 17:25:43.964707 kubelet[2401]: I1212 17:25:43.963058 2401 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 17:25:43.964707 kubelet[2401]: I1212 17:25:43.964181 2401 kubelet.go:940] "Not 
starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 12 17:25:43.964707 kubelet[2401]: I1212 17:25:43.964215 2401 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 12 17:25:43.964707 kubelet[2401]: W1212 17:25:43.964267 2401 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Dec 12 17:25:43.968821 kubelet[2401]: I1212 17:25:43.968786 2401 server.go:1262] "Started kubelet" Dec 12 17:25:43.969341 kubelet[2401]: I1212 17:25:43.969303 2401 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 12 17:25:43.970402 kubelet[2401]: I1212 17:25:43.970375 2401 server.go:310] "Adding debug handlers to kubelet server" Dec 12 17:25:43.972384 kubelet[2401]: I1212 17:25:43.972297 2401 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 17:25:43.972384 kubelet[2401]: I1212 17:25:43.972382 2401 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 12 17:25:43.972808 kubelet[2401]: I1212 17:25:43.972775 2401 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 17:25:43.975699 kubelet[2401]: E1212 17:25:43.972947 2401 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://23.88.120.93:6443/api/v1/namespaces/default/events\": dial tcp 23.88.120.93:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-1-a1e622265d.188087c74bc604c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-1-a1e622265d,UID:ci-4459-2-2-1-a1e622265d,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-1-a1e622265d,},FirstTimestamp:2025-12-12 17:25:43.96874464 +0000 UTC m=+0.309557221,LastTimestamp:2025-12-12 17:25:43.96874464 +0000 UTC m=+0.309557221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-1-a1e622265d,}" Dec 12 17:25:43.980305 kubelet[2401]: E1212 17:25:43.980215 2401 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 17:25:43.980680 kubelet[2401]: I1212 17:25:43.980351 2401 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 17:25:43.980680 kubelet[2401]: I1212 17:25:43.980403 2401 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 17:25:43.981778 kubelet[2401]: I1212 17:25:43.981711 2401 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 12 17:25:43.984352 kubelet[2401]: I1212 17:25:43.984088 2401 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 12 17:25:43.984352 kubelet[2401]: I1212 17:25:43.984184 2401 reconciler.go:29] "Reconciler: start to sync state" Dec 12 17:25:43.984820 kubelet[2401]: E1212 17:25:43.984771 2401 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://23.88.120.93:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 23.88.120.93:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 12 17:25:43.985908 kubelet[2401]: E1212 17:25:43.985866 2401 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-1-a1e622265d\" not found" Dec 12 17:25:43.986352 kubelet[2401]: E1212 17:25:43.986318 2401 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://23.88.120.93:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-1-a1e622265d?timeout=10s\": dial tcp 23.88.120.93:6443: connect: connection refused" interval="200ms" Dec 12 17:25:43.986575 kubelet[2401]: I1212 17:25:43.986550 2401 factory.go:223] Registration of the systemd container factory successfully Dec 12 17:25:43.986731 kubelet[2401]: I1212 17:25:43.986707 2401 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 17:25:43.987904 kubelet[2401]: I1212 17:25:43.987878 2401 factory.go:223] Registration of the containerd container factory successfully Dec 12 17:25:44.007876 kubelet[2401]: I1212 17:25:44.007843 2401 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 17:25:44.007876 kubelet[2401]: I1212 17:25:44.007869 2401 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 17:25:44.008034 kubelet[2401]: I1212 17:25:44.007894 2401 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:25:44.011671 kubelet[2401]: I1212 17:25:44.010890 2401 policy_none.go:49] "None policy: Start" Dec 12 17:25:44.011671 kubelet[2401]: I1212 17:25:44.010926 2401 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 12 17:25:44.011671 kubelet[2401]: I1212 17:25:44.010940 2401 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 12 17:25:44.012488 kubelet[2401]: I1212 17:25:44.012428 2401 policy_none.go:47] "Start" Dec 12 17:25:44.020581 kubelet[2401]: I1212 17:25:44.020479 2401 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 12 17:25:44.023188 kubelet[2401]: I1212 17:25:44.022772 2401 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Dec 12 17:25:44.023188 kubelet[2401]: I1212 17:25:44.022804 2401 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 12 17:25:44.023188 kubelet[2401]: I1212 17:25:44.022848 2401 kubelet.go:2427] "Starting kubelet main sync loop" Dec 12 17:25:44.023188 kubelet[2401]: E1212 17:25:44.022899 2401 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 17:25:44.024123 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 12 17:25:44.028372 kubelet[2401]: E1212 17:25:44.028335 2401 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://23.88.120.93:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 23.88.120.93:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 12 17:25:44.043129 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 12 17:25:44.048108 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Dec 12 17:25:44.060045 kubelet[2401]: E1212 17:25:44.059983 2401 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 12 17:25:44.061841 kubelet[2401]: I1212 17:25:44.061756 2401 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 17:25:44.062126 kubelet[2401]: I1212 17:25:44.061798 2401 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 17:25:44.064426 kubelet[2401]: I1212 17:25:44.064406 2401 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 17:25:44.065119 kubelet[2401]: E1212 17:25:44.065017 2401 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 12 17:25:44.065119 kubelet[2401]: E1212 17:25:44.065082 2401 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-1-a1e622265d\" not found" Dec 12 17:25:44.141850 systemd[1]: Created slice kubepods-burstable-pod3dda58f77b4c8cc70e516d3a1d8232bf.slice - libcontainer container kubepods-burstable-pod3dda58f77b4c8cc70e516d3a1d8232bf.slice. Dec 12 17:25:44.151316 kubelet[2401]: E1212 17:25:44.151261 2401 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-1-a1e622265d\" not found" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.157662 systemd[1]: Created slice kubepods-burstable-pod4ab3e080306a34c2e7d66e718b927097.slice - libcontainer container kubepods-burstable-pod4ab3e080306a34c2e7d66e718b927097.slice. 
Dec 12 17:25:44.161810 kubelet[2401]: E1212 17:25:44.161755 2401 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-1-a1e622265d\" not found" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.163311 systemd[1]: Created slice kubepods-burstable-pod10a215612e7c69b012babe332ee7d047.slice - libcontainer container kubepods-burstable-pod10a215612e7c69b012babe332ee7d047.slice. Dec 12 17:25:44.165935 kubelet[2401]: E1212 17:25:44.165897 2401 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-1-a1e622265d\" not found" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.166328 kubelet[2401]: I1212 17:25:44.166314 2401 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.166952 kubelet[2401]: E1212 17:25:44.166926 2401 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://23.88.120.93:6443/api/v1/nodes\": dial tcp 23.88.120.93:6443: connect: connection refused" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.185430 kubelet[2401]: I1212 17:25:44.185206 2401 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3dda58f77b4c8cc70e516d3a1d8232bf-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-1-a1e622265d\" (UID: \"3dda58f77b4c8cc70e516d3a1d8232bf\") " pod="kube-system/kube-apiserver-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.185430 kubelet[2401]: I1212 17:25:44.185436 2401 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3dda58f77b4c8cc70e516d3a1d8232bf-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-1-a1e622265d\" (UID: \"3dda58f77b4c8cc70e516d3a1d8232bf\") " pod="kube-system/kube-apiserver-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.185976 
kubelet[2401]: I1212 17:25:44.185477 2401 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4ab3e080306a34c2e7d66e718b927097-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-1-a1e622265d\" (UID: \"4ab3e080306a34c2e7d66e718b927097\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.186102 kubelet[2401]: I1212 17:25:44.185560 2401 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4ab3e080306a34c2e7d66e718b927097-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-1-a1e622265d\" (UID: \"4ab3e080306a34c2e7d66e718b927097\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.186207 kubelet[2401]: I1212 17:25:44.186153 2401 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4ab3e080306a34c2e7d66e718b927097-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-1-a1e622265d\" (UID: \"4ab3e080306a34c2e7d66e718b927097\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.186270 kubelet[2401]: I1212 17:25:44.186211 2401 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/10a215612e7c69b012babe332ee7d047-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-1-a1e622265d\" (UID: \"10a215612e7c69b012babe332ee7d047\") " pod="kube-system/kube-scheduler-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.186270 kubelet[2401]: I1212 17:25:44.186257 2401 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3dda58f77b4c8cc70e516d3a1d8232bf-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-1-a1e622265d\" (UID: 
\"3dda58f77b4c8cc70e516d3a1d8232bf\") " pod="kube-system/kube-apiserver-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.186359 kubelet[2401]: I1212 17:25:44.186281 2401 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4ab3e080306a34c2e7d66e718b927097-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-1-a1e622265d\" (UID: \"4ab3e080306a34c2e7d66e718b927097\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.186359 kubelet[2401]: I1212 17:25:44.186298 2401 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4ab3e080306a34c2e7d66e718b927097-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-1-a1e622265d\" (UID: \"4ab3e080306a34c2e7d66e718b927097\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.188154 kubelet[2401]: E1212 17:25:44.188105 2401 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://23.88.120.93:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-1-a1e622265d?timeout=10s\": dial tcp 23.88.120.93:6443: connect: connection refused" interval="400ms" Dec 12 17:25:44.371420 kubelet[2401]: I1212 17:25:44.370169 2401 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.374108 kubelet[2401]: E1212 17:25:44.370588 2401 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://23.88.120.93:6443/api/v1/nodes\": dial tcp 23.88.120.93:6443: connect: connection refused" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.477681 containerd[1508]: time="2025-12-12T17:25:44.476989909Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-1-a1e622265d,Uid:4ab3e080306a34c2e7d66e718b927097,Namespace:kube-system,Attempt:0,}" Dec 12 17:25:44.482154 containerd[1508]: time="2025-12-12T17:25:44.478849403Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-1-a1e622265d,Uid:3dda58f77b4c8cc70e516d3a1d8232bf,Namespace:kube-system,Attempt:0,}" Dec 12 17:25:44.485495 containerd[1508]: time="2025-12-12T17:25:44.485450050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-1-a1e622265d,Uid:10a215612e7c69b012babe332ee7d047,Namespace:kube-system,Attempt:0,}" Dec 12 17:25:44.589775 kubelet[2401]: E1212 17:25:44.589708 2401 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://23.88.120.93:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-1-a1e622265d?timeout=10s\": dial tcp 23.88.120.93:6443: connect: connection refused" interval="800ms" Dec 12 17:25:44.776520 kubelet[2401]: I1212 17:25:44.776352 2401 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.777448 kubelet[2401]: E1212 17:25:44.777390 2401 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://23.88.120.93:6443/api/v1/nodes\": dial tcp 23.88.120.93:6443: connect: connection refused" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:44.809428 kubelet[2401]: E1212 17:25:44.809190 2401 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://23.88.120.93:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 23.88.120.93:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 12 17:25:44.952330 kubelet[2401]: E1212 17:25:44.952098 2401 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get 
\"https://23.88.120.93:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-1-a1e622265d&limit=500&resourceVersion=0\": dial tcp 23.88.120.93:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 12 17:25:45.018492 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2037951344.mount: Deactivated successfully. Dec 12 17:25:45.032174 containerd[1508]: time="2025-12-12T17:25:45.031843340Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:25:45.034534 containerd[1508]: time="2025-12-12T17:25:45.034379998Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Dec 12 17:25:45.037733 containerd[1508]: time="2025-12-12T17:25:45.037254338Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:25:45.041046 containerd[1508]: time="2025-12-12T17:25:45.040979644Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:25:45.043109 containerd[1508]: time="2025-12-12T17:25:45.043037458Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:25:45.044865 containerd[1508]: time="2025-12-12T17:25:45.044766910Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 12 17:25:45.047302 containerd[1508]: time="2025-12-12T17:25:45.047224328Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:25:45.048190 containerd[1508]: time="2025-12-12T17:25:45.047953373Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 12 17:25:45.050249 containerd[1508]: time="2025-12-12T17:25:45.050148388Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 568.187603ms" Dec 12 17:25:45.054547 containerd[1508]: time="2025-12-12T17:25:45.054089456Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 562.499282ms" Dec 12 17:25:45.061776 containerd[1508]: time="2025-12-12T17:25:45.061592628Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 568.623044ms" Dec 12 17:25:45.087504 containerd[1508]: time="2025-12-12T17:25:45.087449849Z" level=info msg="connecting to shim 0074d04f6639beb1334d27bc410aa3885e6b76f9b645f4e4007b49b114b089b6" address="unix:///run/containerd/s/8a5297e54ee8ea3691ed049294a02c410e332db2e4dbd0e0215823e461a0d359" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:25:45.101683 
containerd[1508]: time="2025-12-12T17:25:45.100860143Z" level=info msg="connecting to shim aed981c6fea3e19f10e2bb16d48b9a28cc887b7f5a8595d49cc46bb50342f100" address="unix:///run/containerd/s/c30dde26079f9fc2fde3d43ab2bbb9a46a322cbc90b2f6474e4f70a6b6e04fce" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:25:45.118964 containerd[1508]: time="2025-12-12T17:25:45.118896829Z" level=info msg="connecting to shim 42c0b604f33ecdc7519e9cd5a448e884a2bc3fca453c3b7c12c8168356317e05" address="unix:///run/containerd/s/3aed40cb7be7b94f9fbf27f470f02e8a522115d370928d4be0d34330977059f7" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:25:45.120109 kubelet[2401]: E1212 17:25:45.120071 2401 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://23.88.120.93:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 23.88.120.93:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 12 17:25:45.131156 systemd[1]: Started cri-containerd-0074d04f6639beb1334d27bc410aa3885e6b76f9b645f4e4007b49b114b089b6.scope - libcontainer container 0074d04f6639beb1334d27bc410aa3885e6b76f9b645f4e4007b49b114b089b6. Dec 12 17:25:45.152866 systemd[1]: Started cri-containerd-aed981c6fea3e19f10e2bb16d48b9a28cc887b7f5a8595d49cc46bb50342f100.scope - libcontainer container aed981c6fea3e19f10e2bb16d48b9a28cc887b7f5a8595d49cc46bb50342f100. Dec 12 17:25:45.175179 systemd[1]: Started cri-containerd-42c0b604f33ecdc7519e9cd5a448e884a2bc3fca453c3b7c12c8168356317e05.scope - libcontainer container 42c0b604f33ecdc7519e9cd5a448e884a2bc3fca453c3b7c12c8168356317e05. 
Dec 12 17:25:45.209068 containerd[1508]: time="2025-12-12T17:25:45.209020219Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-1-a1e622265d,Uid:4ab3e080306a34c2e7d66e718b927097,Namespace:kube-system,Attempt:0,} returns sandbox id \"0074d04f6639beb1334d27bc410aa3885e6b76f9b645f4e4007b49b114b089b6\"" Dec 12 17:25:45.226956 containerd[1508]: time="2025-12-12T17:25:45.226896064Z" level=info msg="CreateContainer within sandbox \"0074d04f6639beb1334d27bc410aa3885e6b76f9b645f4e4007b49b114b089b6\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 12 17:25:45.240389 containerd[1508]: time="2025-12-12T17:25:45.240335478Z" level=info msg="Container fe72a36bac981fec69246dda02f90d12531ee2fa4665edf92d6db32f72828f47: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:25:45.256771 containerd[1508]: time="2025-12-12T17:25:45.256701672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-1-a1e622265d,Uid:3dda58f77b4c8cc70e516d3a1d8232bf,Namespace:kube-system,Attempt:0,} returns sandbox id \"aed981c6fea3e19f10e2bb16d48b9a28cc887b7f5a8595d49cc46bb50342f100\"" Dec 12 17:25:45.259449 containerd[1508]: time="2025-12-12T17:25:45.259396291Z" level=info msg="CreateContainer within sandbox \"0074d04f6639beb1334d27bc410aa3885e6b76f9b645f4e4007b49b114b089b6\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"fe72a36bac981fec69246dda02f90d12531ee2fa4665edf92d6db32f72828f47\"" Dec 12 17:25:45.262658 containerd[1508]: time="2025-12-12T17:25:45.261874669Z" level=info msg="StartContainer for \"fe72a36bac981fec69246dda02f90d12531ee2fa4665edf92d6db32f72828f47\"" Dec 12 17:25:45.267299 containerd[1508]: time="2025-12-12T17:25:45.267242706Z" level=info msg="CreateContainer within sandbox \"aed981c6fea3e19f10e2bb16d48b9a28cc887b7f5a8595d49cc46bb50342f100\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 12 17:25:45.267771 
containerd[1508]: time="2025-12-12T17:25:45.267594269Z" level=info msg="connecting to shim fe72a36bac981fec69246dda02f90d12531ee2fa4665edf92d6db32f72828f47" address="unix:///run/containerd/s/8a5297e54ee8ea3691ed049294a02c410e332db2e4dbd0e0215823e461a0d359" protocol=ttrpc version=3 Dec 12 17:25:45.269483 containerd[1508]: time="2025-12-12T17:25:45.269446322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-1-a1e622265d,Uid:10a215612e7c69b012babe332ee7d047,Namespace:kube-system,Attempt:0,} returns sandbox id \"42c0b604f33ecdc7519e9cd5a448e884a2bc3fca453c3b7c12c8168356317e05\"" Dec 12 17:25:45.277711 containerd[1508]: time="2025-12-12T17:25:45.277457898Z" level=info msg="CreateContainer within sandbox \"42c0b604f33ecdc7519e9cd5a448e884a2bc3fca453c3b7c12c8168356317e05\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 12 17:25:45.278583 containerd[1508]: time="2025-12-12T17:25:45.278497625Z" level=info msg="Container 35c021aa418d7822de9c377a73c16a58beb5109590bb5becedd245df551b293c: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:25:45.291999 containerd[1508]: time="2025-12-12T17:25:45.291338435Z" level=info msg="CreateContainer within sandbox \"aed981c6fea3e19f10e2bb16d48b9a28cc887b7f5a8595d49cc46bb50342f100\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"35c021aa418d7822de9c377a73c16a58beb5109590bb5becedd245df551b293c\"" Dec 12 17:25:45.291999 containerd[1508]: time="2025-12-12T17:25:45.291962279Z" level=info msg="StartContainer for \"35c021aa418d7822de9c377a73c16a58beb5109590bb5becedd245df551b293c\"" Dec 12 17:25:45.293902 containerd[1508]: time="2025-12-12T17:25:45.293854572Z" level=info msg="connecting to shim 35c021aa418d7822de9c377a73c16a58beb5109590bb5becedd245df551b293c" address="unix:///run/containerd/s/c30dde26079f9fc2fde3d43ab2bbb9a46a322cbc90b2f6474e4f70a6b6e04fce" protocol=ttrpc version=3 Dec 12 17:25:45.300075 systemd[1]: Started 
cri-containerd-fe72a36bac981fec69246dda02f90d12531ee2fa4665edf92d6db32f72828f47.scope - libcontainer container fe72a36bac981fec69246dda02f90d12531ee2fa4665edf92d6db32f72828f47. Dec 12 17:25:45.308086 containerd[1508]: time="2025-12-12T17:25:45.307974151Z" level=info msg="Container add69298466784047f1a9dc34d511845b3aba4295e8a4e9eadd05b6dd2d47f9a: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:25:45.320914 containerd[1508]: time="2025-12-12T17:25:45.320842841Z" level=info msg="CreateContainer within sandbox \"42c0b604f33ecdc7519e9cd5a448e884a2bc3fca453c3b7c12c8168356317e05\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"add69298466784047f1a9dc34d511845b3aba4295e8a4e9eadd05b6dd2d47f9a\"" Dec 12 17:25:45.323262 containerd[1508]: time="2025-12-12T17:25:45.322955816Z" level=info msg="StartContainer for \"add69298466784047f1a9dc34d511845b3aba4295e8a4e9eadd05b6dd2d47f9a\"" Dec 12 17:25:45.326679 containerd[1508]: time="2025-12-12T17:25:45.325548074Z" level=info msg="connecting to shim add69298466784047f1a9dc34d511845b3aba4295e8a4e9eadd05b6dd2d47f9a" address="unix:///run/containerd/s/3aed40cb7be7b94f9fbf27f470f02e8a522115d370928d4be0d34330977059f7" protocol=ttrpc version=3 Dec 12 17:25:45.331889 systemd[1]: Started cri-containerd-35c021aa418d7822de9c377a73c16a58beb5109590bb5becedd245df551b293c.scope - libcontainer container 35c021aa418d7822de9c377a73c16a58beb5109590bb5becedd245df551b293c. Dec 12 17:25:45.375213 systemd[1]: Started cri-containerd-add69298466784047f1a9dc34d511845b3aba4295e8a4e9eadd05b6dd2d47f9a.scope - libcontainer container add69298466784047f1a9dc34d511845b3aba4295e8a4e9eadd05b6dd2d47f9a. 
Dec 12 17:25:45.390409 kubelet[2401]: E1212 17:25:45.390367 2401 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://23.88.120.93:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-1-a1e622265d?timeout=10s\": dial tcp 23.88.120.93:6443: connect: connection refused" interval="1.6s" Dec 12 17:25:45.398312 containerd[1508]: time="2025-12-12T17:25:45.398232942Z" level=info msg="StartContainer for \"fe72a36bac981fec69246dda02f90d12531ee2fa4665edf92d6db32f72828f47\" returns successfully" Dec 12 17:25:45.424156 containerd[1508]: time="2025-12-12T17:25:45.424095603Z" level=info msg="StartContainer for \"35c021aa418d7822de9c377a73c16a58beb5109590bb5becedd245df551b293c\" returns successfully" Dec 12 17:25:45.498983 kubelet[2401]: E1212 17:25:45.498257 2401 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://23.88.120.93:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 23.88.120.93:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 12 17:25:45.501865 containerd[1508]: time="2025-12-12T17:25:45.501819026Z" level=info msg="StartContainer for \"add69298466784047f1a9dc34d511845b3aba4295e8a4e9eadd05b6dd2d47f9a\" returns successfully" Dec 12 17:25:45.580107 kubelet[2401]: I1212 17:25:45.579984 2401 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:45.580401 kubelet[2401]: E1212 17:25:45.580371 2401 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://23.88.120.93:6443/api/v1/nodes\": dial tcp 23.88.120.93:6443: connect: connection refused" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:46.060551 kubelet[2401]: E1212 17:25:46.060002 2401 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-1-a1e622265d\" not found" 
node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:46.062817 kubelet[2401]: E1212 17:25:46.062768 2401 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-1-a1e622265d\" not found" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:46.066020 kubelet[2401]: E1212 17:25:46.065988 2401 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-1-a1e622265d\" not found" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:47.071752 kubelet[2401]: E1212 17:25:47.070577 2401 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-1-a1e622265d\" not found" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:47.072370 kubelet[2401]: E1212 17:25:47.070693 2401 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-1-a1e622265d\" not found" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:47.182806 kubelet[2401]: I1212 17:25:47.182765 2401 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:48.536875 kubelet[2401]: I1212 17:25:48.536655 2401 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-1-a1e622265d" Dec 12 17:25:48.536875 kubelet[2401]: E1212 17:25:48.536700 2401 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4459-2-2-1-a1e622265d\": node \"ci-4459-2-2-1-a1e622265d\" not found" Dec 12 17:25:48.590104 kubelet[2401]: I1212 17:25:48.590061 2401 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:48.650909 kubelet[2401]: E1212 17:25:48.650870 2401 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-1-a1e622265d\" is forbidden: no PriorityClass with name system-node-critical was found" 
pod="kube-system/kube-scheduler-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:48.650909 kubelet[2401]: I1212 17:25:48.650903 2401 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:48.657341 kubelet[2401]: E1212 17:25:48.657110 2401 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-1-a1e622265d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:48.657341 kubelet[2401]: I1212 17:25:48.657142 2401 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:48.664924 kubelet[2401]: E1212 17:25:48.664889 2401 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-1-a1e622265d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d" Dec 12 17:25:48.965157 kubelet[2401]: I1212 17:25:48.964723 2401 apiserver.go:52] "Watching apiserver" Dec 12 17:25:48.984260 kubelet[2401]: I1212 17:25:48.984220 2401 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 12 17:25:50.625497 systemd[1]: Reload requested from client PID 2685 ('systemctl') (unit session-7.scope)... Dec 12 17:25:50.625894 systemd[1]: Reloading... Dec 12 17:25:50.740654 zram_generator::config[2735]: No configuration found. Dec 12 17:25:50.950931 systemd[1]: Reloading finished in 324 ms. Dec 12 17:25:50.982811 kubelet[2401]: I1212 17:25:50.982645 2401 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 17:25:50.983319 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:25:51.000147 systemd[1]: kubelet.service: Deactivated successfully. 
Dec 12 17:25:51.000405 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:25:51.000471 systemd[1]: kubelet.service: Consumed 776ms CPU time, 120.7M memory peak.
Dec 12 17:25:51.002931 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:25:51.190355 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:25:51.204052 (kubelet)[2774]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Dec 12 17:25:51.257665 kubelet[2774]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Dec 12 17:25:51.258404 kubelet[2774]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 12 17:25:51.258578 kubelet[2774]: I1212 17:25:51.258509 2774 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 12 17:25:51.272819 kubelet[2774]: I1212 17:25:51.272745 2774 server.go:529] "Kubelet version" kubeletVersion="v1.34.1"
Dec 12 17:25:51.272819 kubelet[2774]: I1212 17:25:51.272782 2774 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 12 17:25:51.272819 kubelet[2774]: I1212 17:25:51.272821 2774 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Dec 12 17:25:51.272819 kubelet[2774]: I1212 17:25:51.272828 2774 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Dec 12 17:25:51.273133 kubelet[2774]: I1212 17:25:51.273113 2774 server.go:956] "Client rotation is on, will bootstrap in background"
Dec 12 17:25:51.274695 kubelet[2774]: I1212 17:25:51.274658 2774 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Dec 12 17:25:51.281541 kubelet[2774]: I1212 17:25:51.281406 2774 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 12 17:25:51.287813 kubelet[2774]: I1212 17:25:51.287691 2774 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 12 17:25:51.290986 kubelet[2774]: I1212 17:25:51.290945 2774 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Dec 12 17:25:51.291425 kubelet[2774]: I1212 17:25:51.291397 2774 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 12 17:25:51.291727 kubelet[2774]: I1212 17:25:51.291492 2774 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-1-a1e622265d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 12 17:25:51.292640 kubelet[2774]: I1212 17:25:51.291871 2774 topology_manager.go:138] "Creating topology manager with none policy"
Dec 12 17:25:51.292640 kubelet[2774]: I1212 17:25:51.291890 2774 container_manager_linux.go:306] "Creating device plugin manager"
Dec 12 17:25:51.292640 kubelet[2774]: I1212 17:25:51.291924 2774 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Dec 12 17:25:51.292898 kubelet[2774]: I1212 17:25:51.292882 2774 state_mem.go:36] "Initialized new in-memory state store"
Dec 12 17:25:51.293124 kubelet[2774]: I1212 17:25:51.293114 2774 kubelet.go:475] "Attempting to sync node with API server"
Dec 12 17:25:51.293824 kubelet[2774]: I1212 17:25:51.293685 2774 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 12 17:25:51.293939 kubelet[2774]: I1212 17:25:51.293928 2774 kubelet.go:387] "Adding apiserver pod source"
Dec 12 17:25:51.293999 kubelet[2774]: I1212 17:25:51.293990 2774 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 12 17:25:51.299820 kubelet[2774]: I1212 17:25:51.299786 2774 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Dec 12 17:25:51.301655 kubelet[2774]: I1212 17:25:51.300609 2774 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Dec 12 17:25:51.301796 kubelet[2774]: I1212 17:25:51.301785 2774 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Dec 12 17:25:51.307051 kubelet[2774]: I1212 17:25:51.307029 2774 server.go:1262] "Started kubelet"
Dec 12 17:25:51.312933 kubelet[2774]: I1212 17:25:51.312803 2774 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 12 17:25:51.331685 kubelet[2774]: I1212 17:25:51.331614 2774 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Dec 12 17:25:51.332116 kubelet[2774]: I1212 17:25:51.332055 2774 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 12 17:25:51.332184 kubelet[2774]: I1212 17:25:51.332139 2774 server_v1.go:49] "podresources" method="list" useActivePods=true
Dec 12 17:25:51.332388 kubelet[2774]: I1212 17:25:51.332371 2774 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 12 17:25:51.335502 kubelet[2774]: I1212 17:25:51.335469 2774 server.go:310] "Adding debug handlers to kubelet server"
Dec 12 17:25:51.340695 kubelet[2774]: I1212 17:25:51.337265 2774 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Dec 12 17:25:51.342169 kubelet[2774]: E1212 17:25:51.337299 2774 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Dec 12 17:25:51.343862 kubelet[2774]: I1212 17:25:51.338197 2774 volume_manager.go:313] "Starting Kubelet Volume Manager"
Dec 12 17:25:51.343965 kubelet[2774]: I1212 17:25:51.338207 2774 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 12 17:25:51.343965 kubelet[2774]: E1212 17:25:51.338341 2774 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-1-a1e622265d\" not found"
Dec 12 17:25:51.344174 kubelet[2774]: I1212 17:25:51.344070 2774 reconciler.go:29] "Reconciler: start to sync state"
Dec 12 17:25:51.349481 kubelet[2774]: I1212 17:25:51.349437 2774 factory.go:223] Registration of the systemd container factory successfully
Dec 12 17:25:51.349806 kubelet[2774]: I1212 17:25:51.349646 2774 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Dec 12 17:25:51.349871 kubelet[2774]: I1212 17:25:51.349824 2774 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Dec 12 17:25:51.353339 kubelet[2774]: I1212 17:25:51.353238 2774 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Dec 12 17:25:51.353339 kubelet[2774]: I1212 17:25:51.353277 2774 status_manager.go:244] "Starting to sync pod status with apiserver"
Dec 12 17:25:51.353339 kubelet[2774]: I1212 17:25:51.353305 2774 kubelet.go:2427] "Starting kubelet main sync loop"
Dec 12 17:25:51.353498 kubelet[2774]: E1212 17:25:51.353354 2774 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 12 17:25:51.360015 kubelet[2774]: I1212 17:25:51.359981 2774 factory.go:223] Registration of the containerd container factory successfully
Dec 12 17:25:51.418033 kubelet[2774]: I1212 17:25:51.418005 2774 cpu_manager.go:221] "Starting CPU manager" policy="none"
Dec 12 17:25:51.418371 kubelet[2774]: I1212 17:25:51.418184 2774 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Dec 12 17:25:51.418371 kubelet[2774]: I1212 17:25:51.418210 2774 state_mem.go:36] "Initialized new in-memory state store"
Dec 12 17:25:51.418817 kubelet[2774]: I1212 17:25:51.418783 2774 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Dec 12 17:25:51.418817 kubelet[2774]: I1212 17:25:51.418808 2774 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Dec 12 17:25:51.418938 kubelet[2774]: I1212 17:25:51.418827 2774 policy_none.go:49] "None policy: Start"
Dec 12 17:25:51.418938 kubelet[2774]: I1212 17:25:51.418838 2774 memory_manager.go:187] "Starting memorymanager" policy="None"
Dec 12 17:25:51.418938 kubelet[2774]: I1212 17:25:51.418850 2774 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Dec 12 17:25:51.419094 kubelet[2774]: I1212 17:25:51.419075 2774 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint"
Dec 12 17:25:51.419094 kubelet[2774]: I1212 17:25:51.419097 2774 policy_none.go:47] "Start"
Dec 12 17:25:51.425348 kubelet[2774]: E1212 17:25:51.424598 2774 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Dec 12 17:25:51.425348 kubelet[2774]: I1212 17:25:51.424812 2774 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 12 17:25:51.425348 kubelet[2774]: I1212 17:25:51.424827 2774 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 12 17:25:51.425348 kubelet[2774]: I1212 17:25:51.425067 2774 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 12 17:25:51.426788 kubelet[2774]: E1212 17:25:51.426765 2774 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Dec 12 17:25:51.455460 kubelet[2774]: I1212 17:25:51.455341 2774 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.456311 kubelet[2774]: I1212 17:25:51.456291 2774 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.458389 kubelet[2774]: I1212 17:25:51.458184 2774 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.527702 kubelet[2774]: I1212 17:25:51.527672 2774 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.542256 kubelet[2774]: I1212 17:25:51.541742 2774 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.542256 kubelet[2774]: I1212 17:25:51.541901 2774 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.544695 kubelet[2774]: I1212 17:25:51.544401 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3dda58f77b4c8cc70e516d3a1d8232bf-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-1-a1e622265d\" (UID: \"3dda58f77b4c8cc70e516d3a1d8232bf\") " pod="kube-system/kube-apiserver-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.645738 kubelet[2774]: I1212 17:25:51.645677 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4ab3e080306a34c2e7d66e718b927097-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-1-a1e622265d\" (UID: \"4ab3e080306a34c2e7d66e718b927097\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.646031 kubelet[2774]: I1212 17:25:51.645863 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4ab3e080306a34c2e7d66e718b927097-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-1-a1e622265d\" (UID: \"4ab3e080306a34c2e7d66e718b927097\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.646280 kubelet[2774]: I1212 17:25:51.645898 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4ab3e080306a34c2e7d66e718b927097-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-1-a1e622265d\" (UID: \"4ab3e080306a34c2e7d66e718b927097\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.646485 kubelet[2774]: I1212 17:25:51.646266 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/10a215612e7c69b012babe332ee7d047-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-1-a1e622265d\" (UID: \"10a215612e7c69b012babe332ee7d047\") " pod="kube-system/kube-scheduler-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.646485 kubelet[2774]: I1212 17:25:51.646443 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3dda58f77b4c8cc70e516d3a1d8232bf-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-1-a1e622265d\" (UID: \"3dda58f77b4c8cc70e516d3a1d8232bf\") " pod="kube-system/kube-apiserver-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.646803 kubelet[2774]: I1212 17:25:51.646464 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3dda58f77b4c8cc70e516d3a1d8232bf-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-1-a1e622265d\" (UID: \"3dda58f77b4c8cc70e516d3a1d8232bf\") " pod="kube-system/kube-apiserver-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.646803 kubelet[2774]: I1212 17:25:51.646664 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4ab3e080306a34c2e7d66e718b927097-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-1-a1e622265d\" (UID: \"4ab3e080306a34c2e7d66e718b927097\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:51.646909 kubelet[2774]: I1212 17:25:51.646687 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4ab3e080306a34c2e7d66e718b927097-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-1-a1e622265d\" (UID: \"4ab3e080306a34c2e7d66e718b927097\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:52.297718 kubelet[2774]: I1212 17:25:52.297442 2774 apiserver.go:52] "Watching apiserver"
Dec 12 17:25:52.344241 kubelet[2774]: I1212 17:25:52.344181 2774 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Dec 12 17:25:52.392848 kubelet[2774]: I1212 17:25:52.392598 2774 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:52.394827 kubelet[2774]: I1212 17:25:52.393603 2774 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:52.410942 kubelet[2774]: E1212 17:25:52.410721 2774 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-1-a1e622265d\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:52.411189 kubelet[2774]: E1212 17:25:52.410692 2774 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-1-a1e622265d\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d"
Dec 12 17:25:52.460145 kubelet[2774]: I1212 17:25:52.460048 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-1-a1e622265d" podStartSLOduration=1.460028941 podStartE2EDuration="1.460028941s" podCreationTimestamp="2025-12-12 17:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:25:52.432830023 +0000 UTC m=+1.224012251" watchObservedRunningTime="2025-12-12 17:25:52.460028941 +0000 UTC m=+1.251210809"
Dec 12 17:25:52.477695 kubelet[2774]: I1212 17:25:52.476570 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-1-a1e622265d" podStartSLOduration=1.4765414780000001 podStartE2EDuration="1.476541478s" podCreationTimestamp="2025-12-12 17:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:25:52.462859558 +0000 UTC m=+1.254041426" watchObservedRunningTime="2025-12-12 17:25:52.476541478 +0000 UTC m=+1.267723346"
Dec 12 17:25:52.478118 kubelet[2774]: I1212 17:25:52.478018 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-1-a1e622265d" podStartSLOduration=1.477999886 podStartE2EDuration="1.477999886s" podCreationTimestamp="2025-12-12 17:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:25:52.477965926 +0000 UTC m=+1.269147794" watchObservedRunningTime="2025-12-12 17:25:52.477999886 +0000 UTC m=+1.269181754"
Dec 12 17:25:55.325191 kubelet[2774]: I1212 17:25:55.325153 2774 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Dec 12 17:25:55.326047 containerd[1508]: time="2025-12-12T17:25:55.325752416Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Dec 12 17:25:55.326853 kubelet[2774]: I1212 17:25:55.326526 2774 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Dec 12 17:25:56.060419 systemd[1]: Created slice kubepods-besteffort-podea211adf_0946_4297_807f_629b7b800d6f.slice - libcontainer container kubepods-besteffort-podea211adf_0946_4297_807f_629b7b800d6f.slice.
Dec 12 17:25:56.077730 kubelet[2774]: I1212 17:25:56.077323 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ea211adf-0946-4297-807f-629b7b800d6f-kube-proxy\") pod \"kube-proxy-5brnl\" (UID: \"ea211adf-0946-4297-807f-629b7b800d6f\") " pod="kube-system/kube-proxy-5brnl"
Dec 12 17:25:56.077730 kubelet[2774]: I1212 17:25:56.077390 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ea211adf-0946-4297-807f-629b7b800d6f-xtables-lock\") pod \"kube-proxy-5brnl\" (UID: \"ea211adf-0946-4297-807f-629b7b800d6f\") " pod="kube-system/kube-proxy-5brnl"
Dec 12 17:25:56.077730 kubelet[2774]: I1212 17:25:56.077421 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea211adf-0946-4297-807f-629b7b800d6f-lib-modules\") pod \"kube-proxy-5brnl\" (UID: \"ea211adf-0946-4297-807f-629b7b800d6f\") " pod="kube-system/kube-proxy-5brnl"
Dec 12 17:25:56.077730 kubelet[2774]: I1212 17:25:56.077462 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqv4t\" (UniqueName: \"kubernetes.io/projected/ea211adf-0946-4297-807f-629b7b800d6f-kube-api-access-hqv4t\") pod \"kube-proxy-5brnl\" (UID: \"ea211adf-0946-4297-807f-629b7b800d6f\") " pod="kube-system/kube-proxy-5brnl"
Dec 12 17:25:56.374682 containerd[1508]: time="2025-12-12T17:25:56.373874039Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5brnl,Uid:ea211adf-0946-4297-807f-629b7b800d6f,Namespace:kube-system,Attempt:0,}"
Dec 12 17:25:56.411259 containerd[1508]: time="2025-12-12T17:25:56.410768838Z" level=info msg="connecting to shim 36d23e66591fe03edf875c493b09fee4434ef3e2defb12b14e5787932a25d457" address="unix:///run/containerd/s/e2fead5d3c940637f4c7f66b81bf4ebeeaa406e89ca093a53d20b4e9413f88b7" namespace=k8s.io protocol=ttrpc version=3
Dec 12 17:25:56.441910 systemd[1]: Started cri-containerd-36d23e66591fe03edf875c493b09fee4434ef3e2defb12b14e5787932a25d457.scope - libcontainer container 36d23e66591fe03edf875c493b09fee4434ef3e2defb12b14e5787932a25d457.
Dec 12 17:25:56.476123 containerd[1508]: time="2025-12-12T17:25:56.475879148Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5brnl,Uid:ea211adf-0946-4297-807f-629b7b800d6f,Namespace:kube-system,Attempt:0,} returns sandbox id \"36d23e66591fe03edf875c493b09fee4434ef3e2defb12b14e5787932a25d457\""
Dec 12 17:25:56.489367 containerd[1508]: time="2025-12-12T17:25:56.489326220Z" level=info msg="CreateContainer within sandbox \"36d23e66591fe03edf875c493b09fee4434ef3e2defb12b14e5787932a25d457\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Dec 12 17:25:56.510773 containerd[1508]: time="2025-12-12T17:25:56.510727215Z" level=info msg="Container 90f2f442a3885cd4941f2dba2b4db7349f6063e94e0adae6db78ef0136b7ea93: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:25:56.537721 containerd[1508]: time="2025-12-12T17:25:56.537519559Z" level=info msg="CreateContainer within sandbox \"36d23e66591fe03edf875c493b09fee4434ef3e2defb12b14e5787932a25d457\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"90f2f442a3885cd4941f2dba2b4db7349f6063e94e0adae6db78ef0136b7ea93\""
Dec 12 17:25:56.541756 containerd[1508]: time="2025-12-12T17:25:56.541705901Z" level=info msg="StartContainer for \"90f2f442a3885cd4941f2dba2b4db7349f6063e94e0adae6db78ef0136b7ea93\""
Dec 12 17:25:56.547292 containerd[1508]: time="2025-12-12T17:25:56.547229171Z" level=info msg="connecting to shim 90f2f442a3885cd4941f2dba2b4db7349f6063e94e0adae6db78ef0136b7ea93" address="unix:///run/containerd/s/e2fead5d3c940637f4c7f66b81bf4ebeeaa406e89ca093a53d20b4e9413f88b7" protocol=ttrpc version=3
Dec 12 17:25:56.569767 systemd[1]: Created slice kubepods-besteffort-pod993d7d65_0e0a_447f_8ed4_a4d7d1f328c8.slice - libcontainer container kubepods-besteffort-pod993d7d65_0e0a_447f_8ed4_a4d7d1f328c8.slice.
Dec 12 17:25:56.586399 kubelet[2774]: I1212 17:25:56.586363 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44fxm\" (UniqueName: \"kubernetes.io/projected/993d7d65-0e0a-447f-8ed4-a4d7d1f328c8-kube-api-access-44fxm\") pod \"tigera-operator-65cdcdfd6d-n7wx8\" (UID: \"993d7d65-0e0a-447f-8ed4-a4d7d1f328c8\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-n7wx8"
Dec 12 17:25:56.588439 kubelet[2774]: I1212 17:25:56.586856 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/993d7d65-0e0a-447f-8ed4-a4d7d1f328c8-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-n7wx8\" (UID: \"993d7d65-0e0a-447f-8ed4-a4d7d1f328c8\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-n7wx8"
Dec 12 17:25:56.611942 systemd[1]: Started cri-containerd-90f2f442a3885cd4941f2dba2b4db7349f6063e94e0adae6db78ef0136b7ea93.scope - libcontainer container 90f2f442a3885cd4941f2dba2b4db7349f6063e94e0adae6db78ef0136b7ea93.
Dec 12 17:25:56.687235 containerd[1508]: time="2025-12-12T17:25:56.686695201Z" level=info msg="StartContainer for \"90f2f442a3885cd4941f2dba2b4db7349f6063e94e0adae6db78ef0136b7ea93\" returns successfully"
Dec 12 17:25:56.883460 containerd[1508]: time="2025-12-12T17:25:56.883345697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-n7wx8,Uid:993d7d65-0e0a-447f-8ed4-a4d7d1f328c8,Namespace:tigera-operator,Attempt:0,}"
Dec 12 17:25:56.902646 containerd[1508]: time="2025-12-12T17:25:56.902147439Z" level=info msg="connecting to shim 35bb6039cd1b88d45ba0f0547c86e65d4b39d8129308c49ac77ec059803a1070" address="unix:///run/containerd/s/3291ae1379f0a86b0febe6052b090227db92aeb039eef110003af8b6194f9a8f" namespace=k8s.io protocol=ttrpc version=3
Dec 12 17:25:56.929850 systemd[1]: Started cri-containerd-35bb6039cd1b88d45ba0f0547c86e65d4b39d8129308c49ac77ec059803a1070.scope - libcontainer container 35bb6039cd1b88d45ba0f0547c86e65d4b39d8129308c49ac77ec059803a1070.
Dec 12 17:25:57.001682 containerd[1508]: time="2025-12-12T17:25:57.001336572Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-n7wx8,Uid:993d7d65-0e0a-447f-8ed4-a4d7d1f328c8,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"35bb6039cd1b88d45ba0f0547c86e65d4b39d8129308c49ac77ec059803a1070\""
Dec 12 17:25:57.006470 containerd[1508]: time="2025-12-12T17:25:57.006374118Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\""
Dec 12 17:25:57.425438 kubelet[2774]: I1212 17:25:57.425236 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-5brnl" podStartSLOduration=1.425206888 podStartE2EDuration="1.425206888s" podCreationTimestamp="2025-12-12 17:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:25:57.42358684 +0000 UTC m=+6.214768788" watchObservedRunningTime="2025-12-12 17:25:57.425206888 +0000 UTC m=+6.216388756"
Dec 12 17:25:58.763086 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2676787585.mount: Deactivated successfully.
Dec 12 17:25:59.114989 containerd[1508]: time="2025-12-12T17:25:59.114708651Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:25:59.115913 containerd[1508]: time="2025-12-12T17:25:59.115607135Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=22152004"
Dec 12 17:25:59.116754 containerd[1508]: time="2025-12-12T17:25:59.116694821Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:25:59.119403 containerd[1508]: time="2025-12-12T17:25:59.119359395Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:25:59.120498 containerd[1508]: time="2025-12-12T17:25:59.120348440Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.113915962s"
Dec 12 17:25:59.120498 containerd[1508]: time="2025-12-12T17:25:59.120382200Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\""
Dec 12 17:25:59.126795 containerd[1508]: time="2025-12-12T17:25:59.126752712Z" level=info msg="CreateContainer within sandbox \"35bb6039cd1b88d45ba0f0547c86e65d4b39d8129308c49ac77ec059803a1070\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Dec 12 17:25:59.139685 containerd[1508]: time="2025-12-12T17:25:59.138271171Z" level=info msg="Container 8ff72935087b35adc09bf37af1301f752d5e9a327ab4a48b55843118faee4fea: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:25:59.150433 containerd[1508]: time="2025-12-12T17:25:59.150382193Z" level=info msg="CreateContainer within sandbox \"35bb6039cd1b88d45ba0f0547c86e65d4b39d8129308c49ac77ec059803a1070\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"8ff72935087b35adc09bf37af1301f752d5e9a327ab4a48b55843118faee4fea\""
Dec 12 17:25:59.151705 containerd[1508]: time="2025-12-12T17:25:59.151657719Z" level=info msg="StartContainer for \"8ff72935087b35adc09bf37af1301f752d5e9a327ab4a48b55843118faee4fea\""
Dec 12 17:25:59.152876 containerd[1508]: time="2025-12-12T17:25:59.152829125Z" level=info msg="connecting to shim 8ff72935087b35adc09bf37af1301f752d5e9a327ab4a48b55843118faee4fea" address="unix:///run/containerd/s/3291ae1379f0a86b0febe6052b090227db92aeb039eef110003af8b6194f9a8f" protocol=ttrpc version=3
Dec 12 17:25:59.176833 systemd[1]: Started cri-containerd-8ff72935087b35adc09bf37af1301f752d5e9a327ab4a48b55843118faee4fea.scope - libcontainer container 8ff72935087b35adc09bf37af1301f752d5e9a327ab4a48b55843118faee4fea.
Dec 12 17:25:59.212641 containerd[1508]: time="2025-12-12T17:25:59.212561630Z" level=info msg="StartContainer for \"8ff72935087b35adc09bf37af1301f752d5e9a327ab4a48b55843118faee4fea\" returns successfully"
Dec 12 17:25:59.436669 kubelet[2774]: I1212 17:25:59.436091 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-n7wx8" podStartSLOduration=1.31877135 podStartE2EDuration="3.436061929s" podCreationTimestamp="2025-12-12 17:25:56 +0000 UTC" firstStartedPulling="2025-12-12 17:25:57.004432628 +0000 UTC m=+5.795614536" lastFinishedPulling="2025-12-12 17:25:59.121723247 +0000 UTC m=+7.912905115" observedRunningTime="2025-12-12 17:25:59.435295005 +0000 UTC m=+8.226476913" watchObservedRunningTime="2025-12-12 17:25:59.436061929 +0000 UTC m=+8.227243837"
Dec 12 17:26:05.472162 sudo[1842]: pam_unix(sudo:session): session closed for user root
Dec 12 17:26:05.634281 sshd[1841]: Connection closed by 139.178.89.65 port 58478
Dec 12 17:26:05.637198 sshd-session[1838]: pam_unix(sshd:session): session closed for user core
Dec 12 17:26:05.642653 systemd[1]: sshd@7-23.88.120.93:22-139.178.89.65:58478.service: Deactivated successfully.
Dec 12 17:26:05.649814 systemd[1]: session-7.scope: Deactivated successfully.
Dec 12 17:26:05.650187 systemd[1]: session-7.scope: Consumed 7.832s CPU time, 224.7M memory peak.
Dec 12 17:26:05.653288 systemd-logind[1487]: Session 7 logged out. Waiting for processes to exit.
Dec 12 17:26:05.657970 systemd-logind[1487]: Removed session 7.
Dec 12 17:26:15.037125 systemd[1]: Created slice kubepods-besteffort-pod79074293_956c_4474_b150_9a400de37268.slice - libcontainer container kubepods-besteffort-pod79074293_956c_4474_b150_9a400de37268.slice.
Dec 12 17:26:15.116805 kubelet[2774]: I1212 17:26:15.116325 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/79074293-956c-4474-b150-9a400de37268-typha-certs\") pod \"calico-typha-c7d585977-xl5q8\" (UID: \"79074293-956c-4474-b150-9a400de37268\") " pod="calico-system/calico-typha-c7d585977-xl5q8"
Dec 12 17:26:15.116805 kubelet[2774]: I1212 17:26:15.116419 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sld8t\" (UniqueName: \"kubernetes.io/projected/79074293-956c-4474-b150-9a400de37268-kube-api-access-sld8t\") pod \"calico-typha-c7d585977-xl5q8\" (UID: \"79074293-956c-4474-b150-9a400de37268\") " pod="calico-system/calico-typha-c7d585977-xl5q8"
Dec 12 17:26:15.116805 kubelet[2774]: I1212 17:26:15.116447 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79074293-956c-4474-b150-9a400de37268-tigera-ca-bundle\") pod \"calico-typha-c7d585977-xl5q8\" (UID: \"79074293-956c-4474-b150-9a400de37268\") " pod="calico-system/calico-typha-c7d585977-xl5q8"
Dec 12 17:26:15.193665 kubelet[2774]: E1212 17:26:15.193593 2774 status_manager.go:1018] "Failed to get status for pod" err="pods \"calico-node-xgkvx\" is forbidden: User \"system:node:ci-4459-2-2-1-a1e622265d\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4459-2-2-1-a1e622265d' and this object" podUID="57cf4d09-0311-4b4e-b150-193cc67abd90" pod="calico-system/calico-node-xgkvx"
Dec 12 17:26:15.193898 kubelet[2774]: E1212 17:26:15.193626 2774 reflector.go:205] "Failed to watch" err="failed to list *v1.Secret: secrets \"node-certs\" is forbidden: User \"system:node:ci-4459-2-2-1-a1e622265d\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4459-2-2-1-a1e622265d' and this object" logger="UnhandledError" reflector="object-\"calico-system\"/\"node-certs\"" type="*v1.Secret"
Dec 12 17:26:15.194967 systemd[1]: Created slice kubepods-besteffort-pod57cf4d09_0311_4b4e_b150_193cc67abd90.slice - libcontainer container kubepods-besteffort-pod57cf4d09_0311_4b4e_b150_193cc67abd90.slice.
Dec 12 17:26:15.217042 kubelet[2774]: I1212 17:26:15.216977 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/57cf4d09-0311-4b4e-b150-193cc67abd90-cni-net-dir\") pod \"calico-node-xgkvx\" (UID: \"57cf4d09-0311-4b4e-b150-193cc67abd90\") " pod="calico-system/calico-node-xgkvx"
Dec 12 17:26:15.217042 kubelet[2774]: I1212 17:26:15.217036 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/57cf4d09-0311-4b4e-b150-193cc67abd90-policysync\") pod \"calico-node-xgkvx\" (UID: \"57cf4d09-0311-4b4e-b150-193cc67abd90\") " pod="calico-system/calico-node-xgkvx"
Dec 12 17:26:15.217042 kubelet[2774]: I1212 17:26:15.217060 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/57cf4d09-0311-4b4e-b150-193cc67abd90-cni-bin-dir\") pod \"calico-node-xgkvx\" (UID: \"57cf4d09-0311-4b4e-b150-193cc67abd90\") " pod="calico-system/calico-node-xgkvx"
Dec 12 17:26:15.217418 kubelet[2774]: I1212 17:26:15.217115 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57cf4d09-0311-4b4e-b150-193cc67abd90-tigera-ca-bundle\") pod \"calico-node-xgkvx\" (UID: \"57cf4d09-0311-4b4e-b150-193cc67abd90\") " pod="calico-system/calico-node-xgkvx"
Dec 12 17:26:15.217418 kubelet[2774]: I1212 17:26:15.217144 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/57cf4d09-0311-4b4e-b150-193cc67abd90-var-run-calico\") pod \"calico-node-xgkvx\" (UID: \"57cf4d09-0311-4b4e-b150-193cc67abd90\") " pod="calico-system/calico-node-xgkvx"
Dec 12 17:26:15.217418 kubelet[2774]: I1212 17:26:15.217173 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/57cf4d09-0311-4b4e-b150-193cc67abd90-cni-log-dir\") pod \"calico-node-xgkvx\" (UID: \"57cf4d09-0311-4b4e-b150-193cc67abd90\") " pod="calico-system/calico-node-xgkvx"
Dec 12 17:26:15.217418 kubelet[2774]: I1212 17:26:15.217203 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/57cf4d09-0311-4b4e-b150-193cc67abd90-lib-modules\") pod \"calico-node-xgkvx\" (UID: \"57cf4d09-0311-4b4e-b150-193cc67abd90\") " pod="calico-system/calico-node-xgkvx"
Dec 12 17:26:15.217418 kubelet[2774]: I1212 17:26:15.217224 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/57cf4d09-0311-4b4e-b150-193cc67abd90-node-certs\") pod \"calico-node-xgkvx\" (UID: \"57cf4d09-0311-4b4e-b150-193cc67abd90\") " pod="calico-system/calico-node-xgkvx"
Dec 12 17:26:15.217786 kubelet[2774]: I1212 17:26:15.217245 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/57cf4d09-0311-4b4e-b150-193cc67abd90-flexvol-driver-host\") pod \"calico-node-xgkvx\" (UID: \"57cf4d09-0311-4b4e-b150-193cc67abd90\") " pod="calico-system/calico-node-xgkvx"
Dec 12 17:26:15.217786 kubelet[2774]: I1212 17:26:15.217269 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/57cf4d09-0311-4b4e-b150-193cc67abd90-var-lib-calico\") pod \"calico-node-xgkvx\" (UID: \"57cf4d09-0311-4b4e-b150-193cc67abd90\") " pod="calico-system/calico-node-xgkvx"
Dec 12 17:26:15.217786 kubelet[2774]: I1212 17:26:15.217302 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/57cf4d09-0311-4b4e-b150-193cc67abd90-xtables-lock\") pod \"calico-node-xgkvx\" (UID: \"57cf4d09-0311-4b4e-b150-193cc67abd90\") " pod="calico-system/calico-node-xgkvx"
Dec 12 17:26:15.217786 kubelet[2774]: I1212 17:26:15.217322 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w48rg\" (UniqueName: \"kubernetes.io/projected/57cf4d09-0311-4b4e-b150-193cc67abd90-kube-api-access-w48rg\") pod \"calico-node-xgkvx\" (UID: \"57cf4d09-0311-4b4e-b150-193cc67abd90\") " pod="calico-system/calico-node-xgkvx"
Dec 12 17:26:15.345121 kubelet[2774]: E1212 17:26:15.344494 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:26:15.348719 kubelet[2774]: W1212 17:26:15.345355 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:26:15.348719 kubelet[2774]: E1212 17:26:15.348661 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Dec 12 17:26:15.349006 containerd[1508]: time="2025-12-12T17:26:15.348956400Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-c7d585977-xl5q8,Uid:79074293-956c-4474-b150-9a400de37268,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:15.376116 kubelet[2774]: E1212 17:26:15.376054 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:26:15.390999 containerd[1508]: time="2025-12-12T17:26:15.390951495Z" level=info msg="connecting to shim 43204a3fb049a83f92eb9a5003896266e883805a26333ea54c638db5e928175b" address="unix:///run/containerd/s/f31ca35adddacf05f3663ffacddc893592eb0f0c6c6e58736c16300beb833d49" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:15.392418 kubelet[2774]: E1212 17:26:15.392257 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.392418 kubelet[2774]: W1212 17:26:15.392283 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.392418 kubelet[2774]: E1212 17:26:15.392309 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.394419 kubelet[2774]: E1212 17:26:15.393410 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.394419 kubelet[2774]: W1212 17:26:15.393433 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.394419 kubelet[2774]: E1212 17:26:15.393486 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.394890 kubelet[2774]: E1212 17:26:15.394735 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.394890 kubelet[2774]: W1212 17:26:15.394755 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.394890 kubelet[2774]: E1212 17:26:15.394775 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.395079 kubelet[2774]: E1212 17:26:15.395067 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.395145 kubelet[2774]: W1212 17:26:15.395131 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.395199 kubelet[2774]: E1212 17:26:15.395188 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.395824 kubelet[2774]: E1212 17:26:15.395746 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.395824 kubelet[2774]: W1212 17:26:15.395766 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.395824 kubelet[2774]: E1212 17:26:15.395781 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.396917 kubelet[2774]: E1212 17:26:15.396770 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.396917 kubelet[2774]: W1212 17:26:15.396788 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.396917 kubelet[2774]: E1212 17:26:15.396805 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.398799 kubelet[2774]: E1212 17:26:15.398778 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.399839 kubelet[2774]: W1212 17:26:15.399649 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.399839 kubelet[2774]: E1212 17:26:15.399683 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.400169 kubelet[2774]: E1212 17:26:15.400031 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.400169 kubelet[2774]: W1212 17:26:15.400045 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.400169 kubelet[2774]: E1212 17:26:15.400057 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.400524 kubelet[2774]: E1212 17:26:15.400401 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.400524 kubelet[2774]: W1212 17:26:15.400414 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.400524 kubelet[2774]: E1212 17:26:15.400428 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.400853 kubelet[2774]: E1212 17:26:15.400840 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.402185 kubelet[2774]: W1212 17:26:15.401660 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.402185 kubelet[2774]: E1212 17:26:15.401690 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.402520 kubelet[2774]: E1212 17:26:15.402504 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.402632 kubelet[2774]: W1212 17:26:15.402595 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.402751 kubelet[2774]: E1212 17:26:15.402614 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.403859 kubelet[2774]: E1212 17:26:15.403840 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.404011 kubelet[2774]: W1212 17:26:15.403994 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.404083 kubelet[2774]: E1212 17:26:15.404071 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.404885 kubelet[2774]: E1212 17:26:15.404751 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.404885 kubelet[2774]: W1212 17:26:15.404776 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.404885 kubelet[2774]: E1212 17:26:15.404789 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.405469 kubelet[2774]: E1212 17:26:15.405447 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.406149 kubelet[2774]: W1212 17:26:15.405602 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.406149 kubelet[2774]: E1212 17:26:15.405642 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.407256 kubelet[2774]: E1212 17:26:15.406890 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.408649 kubelet[2774]: W1212 17:26:15.407680 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.408649 kubelet[2774]: E1212 17:26:15.407705 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.409016 kubelet[2774]: E1212 17:26:15.408867 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.409016 kubelet[2774]: W1212 17:26:15.408884 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.409016 kubelet[2774]: E1212 17:26:15.408898 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.409499 kubelet[2774]: E1212 17:26:15.409433 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.409499 kubelet[2774]: W1212 17:26:15.409446 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.409499 kubelet[2774]: E1212 17:26:15.409459 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.413922 kubelet[2774]: E1212 17:26:15.413724 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.413922 kubelet[2774]: W1212 17:26:15.413750 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.413922 kubelet[2774]: E1212 17:26:15.413770 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.414194 kubelet[2774]: E1212 17:26:15.414180 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.414315 kubelet[2774]: W1212 17:26:15.414247 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.414315 kubelet[2774]: E1212 17:26:15.414265 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.414824 kubelet[2774]: E1212 17:26:15.414807 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.415035 kubelet[2774]: W1212 17:26:15.414924 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.415035 kubelet[2774]: E1212 17:26:15.414943 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.422219 kubelet[2774]: E1212 17:26:15.422008 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.422219 kubelet[2774]: W1212 17:26:15.422034 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.422219 kubelet[2774]: E1212 17:26:15.422057 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.422219 kubelet[2774]: I1212 17:26:15.422102 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/df83329f-2747-4a89-9a6a-7b20123df2cf-varrun\") pod \"csi-node-driver-nnwtm\" (UID: \"df83329f-2747-4a89-9a6a-7b20123df2cf\") " pod="calico-system/csi-node-driver-nnwtm" Dec 12 17:26:15.425036 kubelet[2774]: E1212 17:26:15.424891 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.425036 kubelet[2774]: W1212 17:26:15.424917 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.425036 kubelet[2774]: E1212 17:26:15.424939 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.425036 kubelet[2774]: I1212 17:26:15.424985 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/df83329f-2747-4a89-9a6a-7b20123df2cf-socket-dir\") pod \"csi-node-driver-nnwtm\" (UID: \"df83329f-2747-4a89-9a6a-7b20123df2cf\") " pod="calico-system/csi-node-driver-nnwtm" Dec 12 17:26:15.425783 kubelet[2774]: E1212 17:26:15.425709 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.425783 kubelet[2774]: W1212 17:26:15.425734 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.425783 kubelet[2774]: E1212 17:26:15.425756 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.426758 kubelet[2774]: E1212 17:26:15.426732 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.426758 kubelet[2774]: W1212 17:26:15.426755 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.428599 kubelet[2774]: E1212 17:26:15.426774 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.428599 kubelet[2774]: E1212 17:26:15.427952 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.428599 kubelet[2774]: W1212 17:26:15.427972 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.428599 kubelet[2774]: E1212 17:26:15.427991 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.428599 kubelet[2774]: I1212 17:26:15.428021 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/df83329f-2747-4a89-9a6a-7b20123df2cf-registration-dir\") pod \"csi-node-driver-nnwtm\" (UID: \"df83329f-2747-4a89-9a6a-7b20123df2cf\") " pod="calico-system/csi-node-driver-nnwtm" Dec 12 17:26:15.429784 kubelet[2774]: E1212 17:26:15.429712 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.429784 kubelet[2774]: W1212 17:26:15.429738 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.429784 kubelet[2774]: E1212 17:26:15.429759 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.430144 kubelet[2774]: I1212 17:26:15.429880 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc8kp\" (UniqueName: \"kubernetes.io/projected/df83329f-2747-4a89-9a6a-7b20123df2cf-kube-api-access-sc8kp\") pod \"csi-node-driver-nnwtm\" (UID: \"df83329f-2747-4a89-9a6a-7b20123df2cf\") " pod="calico-system/csi-node-driver-nnwtm" Dec 12 17:26:15.430769 kubelet[2774]: E1212 17:26:15.430709 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.430769 kubelet[2774]: W1212 17:26:15.430724 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.430769 kubelet[2774]: E1212 17:26:15.430736 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.431088 kubelet[2774]: E1212 17:26:15.430887 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.431088 kubelet[2774]: W1212 17:26:15.430896 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.431088 kubelet[2774]: E1212 17:26:15.430905 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.431088 kubelet[2774]: E1212 17:26:15.431030 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.431088 kubelet[2774]: W1212 17:26:15.431037 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.431088 kubelet[2774]: E1212 17:26:15.431045 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.431893 kubelet[2774]: I1212 17:26:15.431336 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df83329f-2747-4a89-9a6a-7b20123df2cf-kubelet-dir\") pod \"csi-node-driver-nnwtm\" (UID: \"df83329f-2747-4a89-9a6a-7b20123df2cf\") " pod="calico-system/csi-node-driver-nnwtm" Dec 12 17:26:15.431893 kubelet[2774]: E1212 17:26:15.431705 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.431893 kubelet[2774]: W1212 17:26:15.431717 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.431893 kubelet[2774]: E1212 17:26:15.431732 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.431893 kubelet[2774]: E1212 17:26:15.431879 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.431893 kubelet[2774]: W1212 17:26:15.431887 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.431893 kubelet[2774]: E1212 17:26:15.431895 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.433851 kubelet[2774]: E1212 17:26:15.433825 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.434118 kubelet[2774]: W1212 17:26:15.433967 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.434217 kubelet[2774]: E1212 17:26:15.434200 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.434793 kubelet[2774]: E1212 17:26:15.434661 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.434793 kubelet[2774]: W1212 17:26:15.434677 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.434793 kubelet[2774]: E1212 17:26:15.434692 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.436915 kubelet[2774]: E1212 17:26:15.436292 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.436915 kubelet[2774]: W1212 17:26:15.436484 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.436915 kubelet[2774]: E1212 17:26:15.436504 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.437707 kubelet[2774]: E1212 17:26:15.437679 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.437974 kubelet[2774]: W1212 17:26:15.437880 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.437974 kubelet[2774]: E1212 17:26:15.437906 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.462909 systemd[1]: Started cri-containerd-43204a3fb049a83f92eb9a5003896266e883805a26333ea54c638db5e928175b.scope - libcontainer container 43204a3fb049a83f92eb9a5003896266e883805a26333ea54c638db5e928175b. Dec 12 17:26:15.533024 kubelet[2774]: E1212 17:26:15.532994 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.533024 kubelet[2774]: W1212 17:26:15.533016 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.533024 kubelet[2774]: E1212 17:26:15.533037 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.533917 kubelet[2774]: E1212 17:26:15.533889 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.533917 kubelet[2774]: W1212 17:26:15.533915 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.534124 kubelet[2774]: E1212 17:26:15.533932 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.534747 kubelet[2774]: E1212 17:26:15.534716 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.534747 kubelet[2774]: W1212 17:26:15.534746 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.534956 kubelet[2774]: E1212 17:26:15.534764 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.535928 kubelet[2774]: E1212 17:26:15.535906 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.535928 kubelet[2774]: W1212 17:26:15.535929 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.536161 kubelet[2774]: E1212 17:26:15.535944 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.536442 kubelet[2774]: E1212 17:26:15.536377 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.536442 kubelet[2774]: W1212 17:26:15.536408 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.536442 kubelet[2774]: E1212 17:26:15.536423 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.536927 kubelet[2774]: E1212 17:26:15.536907 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.536927 kubelet[2774]: W1212 17:26:15.536923 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.537049 kubelet[2774]: E1212 17:26:15.537034 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.537577 kubelet[2774]: E1212 17:26:15.537558 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.537734 kubelet[2774]: W1212 17:26:15.537712 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.537798 kubelet[2774]: E1212 17:26:15.537739 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.538977 kubelet[2774]: E1212 17:26:15.538951 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.538977 kubelet[2774]: W1212 17:26:15.538969 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.539205 kubelet[2774]: E1212 17:26:15.538986 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.540013 kubelet[2774]: E1212 17:26:15.539850 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.540013 kubelet[2774]: W1212 17:26:15.539976 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.540013 kubelet[2774]: E1212 17:26:15.539992 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.540577 kubelet[2774]: E1212 17:26:15.540553 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.540577 kubelet[2774]: W1212 17:26:15.540570 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.540770 kubelet[2774]: E1212 17:26:15.540741 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.541497 kubelet[2774]: E1212 17:26:15.541418 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.541497 kubelet[2774]: W1212 17:26:15.541433 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.541497 kubelet[2774]: E1212 17:26:15.541446 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.542465 kubelet[2774]: E1212 17:26:15.542038 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.542465 kubelet[2774]: W1212 17:26:15.542052 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.542465 kubelet[2774]: E1212 17:26:15.542093 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.542692 kubelet[2774]: E1212 17:26:15.542656 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.542692 kubelet[2774]: W1212 17:26:15.542670 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.542692 kubelet[2774]: E1212 17:26:15.542683 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.543588 kubelet[2774]: E1212 17:26:15.543557 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.543803 kubelet[2774]: W1212 17:26:15.543593 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.543803 kubelet[2774]: E1212 17:26:15.543692 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.544939 kubelet[2774]: E1212 17:26:15.544589 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.544939 kubelet[2774]: W1212 17:26:15.544941 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.545227 kubelet[2774]: E1212 17:26:15.544960 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.545814 kubelet[2774]: E1212 17:26:15.545792 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.545814 kubelet[2774]: W1212 17:26:15.545808 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.545814 kubelet[2774]: E1212 17:26:15.545821 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.547149 kubelet[2774]: E1212 17:26:15.547122 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.547149 kubelet[2774]: W1212 17:26:15.547143 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.547309 kubelet[2774]: E1212 17:26:15.547160 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.548019 kubelet[2774]: E1212 17:26:15.547988 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.548019 kubelet[2774]: W1212 17:26:15.548007 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.548019 kubelet[2774]: E1212 17:26:15.548023 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.549178 kubelet[2774]: E1212 17:26:15.549153 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.549178 kubelet[2774]: W1212 17:26:15.549174 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.549435 kubelet[2774]: E1212 17:26:15.549191 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.550382 kubelet[2774]: E1212 17:26:15.550346 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.550382 kubelet[2774]: W1212 17:26:15.550381 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.550382 kubelet[2774]: E1212 17:26:15.550400 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.550734 containerd[1508]: time="2025-12-12T17:26:15.549984934Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-c7d585977-xl5q8,Uid:79074293-956c-4474-b150-9a400de37268,Namespace:calico-system,Attempt:0,} returns sandbox id \"43204a3fb049a83f92eb9a5003896266e883805a26333ea54c638db5e928175b\"" Dec 12 17:26:15.551407 kubelet[2774]: E1212 17:26:15.551128 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.551407 kubelet[2774]: W1212 17:26:15.551143 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.551407 kubelet[2774]: E1212 17:26:15.551156 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.552541 kubelet[2774]: E1212 17:26:15.552515 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.552541 kubelet[2774]: W1212 17:26:15.552534 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.553246 kubelet[2774]: E1212 17:26:15.552549 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.553246 kubelet[2774]: E1212 17:26:15.552913 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.553246 kubelet[2774]: W1212 17:26:15.552923 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.553246 kubelet[2774]: E1212 17:26:15.552934 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.553246 kubelet[2774]: E1212 17:26:15.553144 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.553246 kubelet[2774]: W1212 17:26:15.553152 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.553246 kubelet[2774]: E1212 17:26:15.553163 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:15.553475 kubelet[2774]: E1212 17:26:15.553449 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.553475 kubelet[2774]: W1212 17:26:15.553461 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.553560 kubelet[2774]: E1212 17:26:15.553474 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:15.556841 containerd[1508]: time="2025-12-12T17:26:15.556688629Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 12 17:26:15.581658 kubelet[2774]: E1212 17:26:15.581476 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:15.581658 kubelet[2774]: W1212 17:26:15.581500 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:15.581658 kubelet[2774]: E1212 17:26:15.581521 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:16.324577 kubelet[2774]: E1212 17:26:16.323323 2774 secret.go:189] Couldn't get secret calico-system/node-certs: failed to sync secret cache: timed out waiting for the condition Dec 12 17:26:16.324577 kubelet[2774]: E1212 17:26:16.323486 2774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57cf4d09-0311-4b4e-b150-193cc67abd90-node-certs podName:57cf4d09-0311-4b4e-b150-193cc67abd90 nodeName:}" failed. No retries permitted until 2025-12-12 17:26:16.823452457 +0000 UTC m=+25.614634325 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-certs" (UniqueName: "kubernetes.io/secret/57cf4d09-0311-4b4e-b150-193cc67abd90-node-certs") pod "calico-node-xgkvx" (UID: "57cf4d09-0311-4b4e-b150-193cc67abd90") : failed to sync secret cache: timed out waiting for the condition Dec 12 17:26:16.347425 kubelet[2774]: E1212 17:26:16.347243 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:16.347425 kubelet[2774]: W1212 17:26:16.347282 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:16.347425 kubelet[2774]: E1212 17:26:16.347311 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:16.448970 kubelet[2774]: E1212 17:26:16.448898 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:16.448970 kubelet[2774]: W1212 17:26:16.448960 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:16.449579 kubelet[2774]: E1212 17:26:16.449020 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:16.550109 kubelet[2774]: E1212 17:26:16.549981 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:16.550109 kubelet[2774]: W1212 17:26:16.550008 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:16.550109 kubelet[2774]: E1212 17:26:16.550037 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:16.651653 kubelet[2774]: E1212 17:26:16.651518 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:16.652301 kubelet[2774]: W1212 17:26:16.652136 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:16.652301 kubelet[2774]: E1212 17:26:16.652190 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:16.753119 kubelet[2774]: E1212 17:26:16.753082 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:16.753119 kubelet[2774]: W1212 17:26:16.753109 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:16.753295 kubelet[2774]: E1212 17:26:16.753135 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:16.854915 kubelet[2774]: E1212 17:26:16.854812 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:16.855545 kubelet[2774]: W1212 17:26:16.854989 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:16.855545 kubelet[2774]: E1212 17:26:16.855096 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:16.856465 kubelet[2774]: E1212 17:26:16.856392 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:16.856861 kubelet[2774]: W1212 17:26:16.856425 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:16.856861 kubelet[2774]: E1212 17:26:16.856595 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:16.857857 kubelet[2774]: E1212 17:26:16.857804 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:16.858182 kubelet[2774]: W1212 17:26:16.857825 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:16.858182 kubelet[2774]: E1212 17:26:16.858009 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:16.859415 kubelet[2774]: E1212 17:26:16.859234 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:16.859415 kubelet[2774]: W1212 17:26:16.859253 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:16.859415 kubelet[2774]: E1212 17:26:16.859269 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:16.860547 kubelet[2774]: E1212 17:26:16.860174 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:16.860547 kubelet[2774]: W1212 17:26:16.860190 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:16.860547 kubelet[2774]: E1212 17:26:16.860206 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:16.868412 kubelet[2774]: E1212 17:26:16.868309 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:16.868412 kubelet[2774]: W1212 17:26:16.868328 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:16.868412 kubelet[2774]: E1212 17:26:16.868347 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:26:16.984819 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1797615704.mount: Deactivated successfully. Dec 12 17:26:17.004186 containerd[1508]: time="2025-12-12T17:26:17.003567143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xgkvx,Uid:57cf4d09-0311-4b4e-b150-193cc67abd90,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:17.026685 containerd[1508]: time="2025-12-12T17:26:17.026635638Z" level=info msg="connecting to shim 1306ed6e479b5021fd13cd9befe59da8cce68f1e76e0d016f09831904df142dc" address="unix:///run/containerd/s/0fc2b361f4217aee53a58d7c4191bcb63a8a2d28d54776cfa3fee3719a838d6a" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:17.054953 systemd[1]: Started cri-containerd-1306ed6e479b5021fd13cd9befe59da8cce68f1e76e0d016f09831904df142dc.scope - libcontainer container 1306ed6e479b5021fd13cd9befe59da8cce68f1e76e0d016f09831904df142dc. 
Dec 12 17:26:17.090692 containerd[1508]: time="2025-12-12T17:26:17.090581947Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xgkvx,Uid:57cf4d09-0311-4b4e-b150-193cc67abd90,Namespace:calico-system,Attempt:0,} returns sandbox id \"1306ed6e479b5021fd13cd9befe59da8cce68f1e76e0d016f09831904df142dc\"" Dec 12 17:26:17.354820 kubelet[2774]: E1212 17:26:17.353994 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:26:19.354415 kubelet[2774]: E1212 17:26:19.354015 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:26:21.354567 kubelet[2774]: E1212 17:26:21.354505 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:26:21.732739 containerd[1508]: time="2025-12-12T17:26:21.732690743Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:21.733975 containerd[1508]: time="2025-12-12T17:26:21.733837186Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33090687" Dec 12 17:26:21.736351 containerd[1508]: time="2025-12-12T17:26:21.735898031Z" level=info msg="ImageCreate event 
name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:21.739611 containerd[1508]: time="2025-12-12T17:26:21.739541160Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:21.741375 containerd[1508]: time="2025-12-12T17:26:21.741262324Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 6.184525135s" Dec 12 17:26:21.741564 containerd[1508]: time="2025-12-12T17:26:21.741532045Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Dec 12 17:26:21.743485 containerd[1508]: time="2025-12-12T17:26:21.743066609Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 12 17:26:21.761164 containerd[1508]: time="2025-12-12T17:26:21.761125574Z" level=info msg="CreateContainer within sandbox \"43204a3fb049a83f92eb9a5003896266e883805a26333ea54c638db5e928175b\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 12 17:26:21.774658 containerd[1508]: time="2025-12-12T17:26:21.772953443Z" level=info msg="Container d8201fd00f658b5eb5e69c029db2fd5bc848d1c67301093297a3b06961281397: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:21.778514 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3152643129.mount: Deactivated successfully. 
Dec 12 17:26:21.784446 containerd[1508]: time="2025-12-12T17:26:21.784354152Z" level=info msg="CreateContainer within sandbox \"43204a3fb049a83f92eb9a5003896266e883805a26333ea54c638db5e928175b\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d8201fd00f658b5eb5e69c029db2fd5bc848d1c67301093297a3b06961281397\"" Dec 12 17:26:21.786228 containerd[1508]: time="2025-12-12T17:26:21.786176276Z" level=info msg="StartContainer for \"d8201fd00f658b5eb5e69c029db2fd5bc848d1c67301093297a3b06961281397\"" Dec 12 17:26:21.789563 containerd[1508]: time="2025-12-12T17:26:21.789512205Z" level=info msg="connecting to shim d8201fd00f658b5eb5e69c029db2fd5bc848d1c67301093297a3b06961281397" address="unix:///run/containerd/s/f31ca35adddacf05f3663ffacddc893592eb0f0c6c6e58736c16300beb833d49" protocol=ttrpc version=3 Dec 12 17:26:21.820226 systemd[1]: Started cri-containerd-d8201fd00f658b5eb5e69c029db2fd5bc848d1c67301093297a3b06961281397.scope - libcontainer container d8201fd00f658b5eb5e69c029db2fd5bc848d1c67301093297a3b06961281397. 
Dec 12 17:26:21.867518 containerd[1508]: time="2025-12-12T17:26:21.867455839Z" level=info msg="StartContainer for \"d8201fd00f658b5eb5e69c029db2fd5bc848d1c67301093297a3b06961281397\" returns successfully" Dec 12 17:26:22.524324 kubelet[2774]: I1212 17:26:22.523560 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-c7d585977-xl5q8" podStartSLOduration=1.336936157 podStartE2EDuration="7.523540217s" podCreationTimestamp="2025-12-12 17:26:15 +0000 UTC" firstStartedPulling="2025-12-12 17:26:15.556166228 +0000 UTC m=+24.347348096" lastFinishedPulling="2025-12-12 17:26:21.742770288 +0000 UTC m=+30.533952156" observedRunningTime="2025-12-12 17:26:22.521736613 +0000 UTC m=+31.312918521" watchObservedRunningTime="2025-12-12 17:26:22.523540217 +0000 UTC m=+31.314722165" Dec 12 17:26:22.566371 kubelet[2774]: E1212 17:26:22.566318 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:26:22.566371 kubelet[2774]: W1212 17:26:22.566355 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:26:22.566371 kubelet[2774]: E1212 17:26:22.566385 2774 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:26:23.158116 containerd[1508]: time="2025-12-12T17:26:23.158067791Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:23.159997 containerd[1508]: time="2025-12-12T17:26:23.159945595Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4266741" Dec 12 17:26:23.160998 containerd[1508]: time="2025-12-12T17:26:23.160954158Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:23.165071 containerd[1508]: time="2025-12-12T17:26:23.164922888Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:23.165372 containerd[1508]: time="2025-12-12T17:26:23.165314089Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.42220284s" Dec 12 17:26:23.165562 containerd[1508]: time="2025-12-12T17:26:23.165478930Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Dec 12 17:26:23.172417 containerd[1508]: time="2025-12-12T17:26:23.171997786Z" level=info msg="CreateContainer within sandbox \"1306ed6e479b5021fd13cd9befe59da8cce68f1e76e0d016f09831904df142dc\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 12 17:26:23.184244 containerd[1508]: time="2025-12-12T17:26:23.183584976Z" level=info msg="Container 3f3c6332c9374932e5ee4f8e7002d4947a24750f1b4a135b47ce447f7fd213a9: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:23.193830 containerd[1508]: time="2025-12-12T17:26:23.193748002Z" level=info msg="CreateContainer within sandbox \"1306ed6e479b5021fd13cd9befe59da8cce68f1e76e0d016f09831904df142dc\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"3f3c6332c9374932e5ee4f8e7002d4947a24750f1b4a135b47ce447f7fd213a9\"" Dec 12 17:26:23.194660 containerd[1508]: time="2025-12-12T17:26:23.194596804Z" level=info msg="StartContainer for \"3f3c6332c9374932e5ee4f8e7002d4947a24750f1b4a135b47ce447f7fd213a9\"" Dec 12 17:26:23.198609 containerd[1508]: time="2025-12-12T17:26:23.198299254Z" level=info msg="connecting to shim 3f3c6332c9374932e5ee4f8e7002d4947a24750f1b4a135b47ce447f7fd213a9" address="unix:///run/containerd/s/0fc2b361f4217aee53a58d7c4191bcb63a8a2d28d54776cfa3fee3719a838d6a" protocol=ttrpc version=3 Dec 12 17:26:23.229891 systemd[1]: Started cri-containerd-3f3c6332c9374932e5ee4f8e7002d4947a24750f1b4a135b47ce447f7fd213a9.scope - libcontainer container 3f3c6332c9374932e5ee4f8e7002d4947a24750f1b4a135b47ce447f7fd213a9. Dec 12 17:26:23.305720 containerd[1508]: time="2025-12-12T17:26:23.305665890Z" level=info msg="StartContainer for \"3f3c6332c9374932e5ee4f8e7002d4947a24750f1b4a135b47ce447f7fd213a9\" returns successfully" Dec 12 17:26:23.323854 systemd[1]: cri-containerd-3f3c6332c9374932e5ee4f8e7002d4947a24750f1b4a135b47ce447f7fd213a9.scope: Deactivated successfully. 
Dec 12 17:26:23.330008 containerd[1508]: time="2025-12-12T17:26:23.329605311Z" level=info msg="received container exit event container_id:\"3f3c6332c9374932e5ee4f8e7002d4947a24750f1b4a135b47ce447f7fd213a9\" id:\"3f3c6332c9374932e5ee4f8e7002d4947a24750f1b4a135b47ce447f7fd213a9\" pid:3466 exited_at:{seconds:1765560383 nanos:329114030}" Dec 12 17:26:23.354692 kubelet[2774]: E1212 17:26:23.354013 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:26:23.364270 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3f3c6332c9374932e5ee4f8e7002d4947a24750f1b4a135b47ce447f7fd213a9-rootfs.mount: Deactivated successfully. Dec 12 17:26:23.510575 kubelet[2774]: I1212 17:26:23.510541 2774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 17:26:23.513316 containerd[1508]: time="2025-12-12T17:26:23.513251223Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 12 17:26:25.358770 kubelet[2774]: E1212 17:26:25.358355 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:26:26.155982 containerd[1508]: time="2025-12-12T17:26:26.155911484Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:26.157855 containerd[1508]: time="2025-12-12T17:26:26.157772529Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65925816" Dec 12 
17:26:26.158727 containerd[1508]: time="2025-12-12T17:26:26.158674651Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:26.162650 containerd[1508]: time="2025-12-12T17:26:26.162524781Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:26.163329 containerd[1508]: time="2025-12-12T17:26:26.163181863Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.64987328s" Dec 12 17:26:26.163329 containerd[1508]: time="2025-12-12T17:26:26.163217143Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Dec 12 17:26:26.169973 containerd[1508]: time="2025-12-12T17:26:26.169894881Z" level=info msg="CreateContainer within sandbox \"1306ed6e479b5021fd13cd9befe59da8cce68f1e76e0d016f09831904df142dc\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 12 17:26:26.182686 containerd[1508]: time="2025-12-12T17:26:26.181477072Z" level=info msg="Container 7711cc5f67b8f1b4b4d29e35fdc4b5e18df51d0d3e4eb8deedab7971f253db89: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:26.199238 containerd[1508]: time="2025-12-12T17:26:26.199187159Z" level=info msg="CreateContainer within sandbox \"1306ed6e479b5021fd13cd9befe59da8cce68f1e76e0d016f09831904df142dc\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id 
\"7711cc5f67b8f1b4b4d29e35fdc4b5e18df51d0d3e4eb8deedab7971f253db89\"" Dec 12 17:26:26.200304 containerd[1508]: time="2025-12-12T17:26:26.200251162Z" level=info msg="StartContainer for \"7711cc5f67b8f1b4b4d29e35fdc4b5e18df51d0d3e4eb8deedab7971f253db89\"" Dec 12 17:26:26.204016 containerd[1508]: time="2025-12-12T17:26:26.203971492Z" level=info msg="connecting to shim 7711cc5f67b8f1b4b4d29e35fdc4b5e18df51d0d3e4eb8deedab7971f253db89" address="unix:///run/containerd/s/0fc2b361f4217aee53a58d7c4191bcb63a8a2d28d54776cfa3fee3719a838d6a" protocol=ttrpc version=3 Dec 12 17:26:26.232884 systemd[1]: Started cri-containerd-7711cc5f67b8f1b4b4d29e35fdc4b5e18df51d0d3e4eb8deedab7971f253db89.scope - libcontainer container 7711cc5f67b8f1b4b4d29e35fdc4b5e18df51d0d3e4eb8deedab7971f253db89. Dec 12 17:26:26.314586 containerd[1508]: time="2025-12-12T17:26:26.314534987Z" level=info msg="StartContainer for \"7711cc5f67b8f1b4b4d29e35fdc4b5e18df51d0d3e4eb8deedab7971f253db89\" returns successfully" Dec 12 17:26:26.873054 systemd[1]: cri-containerd-7711cc5f67b8f1b4b4d29e35fdc4b5e18df51d0d3e4eb8deedab7971f253db89.scope: Deactivated successfully. Dec 12 17:26:26.873395 systemd[1]: cri-containerd-7711cc5f67b8f1b4b4d29e35fdc4b5e18df51d0d3e4eb8deedab7971f253db89.scope: Consumed 547ms CPU time, 186M memory peak, 165.9M written to disk. Dec 12 17:26:26.876043 containerd[1508]: time="2025-12-12T17:26:26.876002203Z" level=info msg="received container exit event container_id:\"7711cc5f67b8f1b4b4d29e35fdc4b5e18df51d0d3e4eb8deedab7971f253db89\" id:\"7711cc5f67b8f1b4b4d29e35fdc4b5e18df51d0d3e4eb8deedab7971f253db89\" pid:3526 exited_at:{seconds:1765560386 nanos:875610802}" Dec 12 17:26:26.901488 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7711cc5f67b8f1b4b4d29e35fdc4b5e18df51d0d3e4eb8deedab7971f253db89-rootfs.mount: Deactivated successfully. 
Dec 12 17:26:26.933560 kubelet[2774]: I1212 17:26:26.933138 2774 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Dec 12 17:26:26.999929 systemd[1]: Created slice kubepods-burstable-podf8cc5a88_2710_4948_9628_460c49c0e518.slice - libcontainer container kubepods-burstable-podf8cc5a88_2710_4948_9628_460c49c0e518.slice. Dec 12 17:26:27.013428 systemd[1]: Created slice kubepods-besteffort-pod02827e28_67cb_418d_b5bc_487012be465e.slice - libcontainer container kubepods-besteffort-pod02827e28_67cb_418d_b5bc_487012be465e.slice. Dec 12 17:26:27.031476 kubelet[2774]: I1212 17:26:27.031395 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b8ca6b0c-01ec-442d-b526-fb52b3677452-calico-apiserver-certs\") pod \"calico-apiserver-b9d8f9566-clxs9\" (UID: \"b8ca6b0c-01ec-442d-b526-fb52b3677452\") " pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" Dec 12 17:26:27.033108 kubelet[2774]: I1212 17:26:27.031449 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmwjr\" (UniqueName: \"kubernetes.io/projected/b8ca6b0c-01ec-442d-b526-fb52b3677452-kube-api-access-zmwjr\") pod \"calico-apiserver-b9d8f9566-clxs9\" (UID: \"b8ca6b0c-01ec-442d-b526-fb52b3677452\") " pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" Dec 12 17:26:27.033108 kubelet[2774]: I1212 17:26:27.032949 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxvk4\" (UniqueName: \"kubernetes.io/projected/02827e28-67cb-418d-b5bc-487012be465e-kube-api-access-rxvk4\") pod \"calico-kube-controllers-7c69bf577-f29bx\" (UID: \"02827e28-67cb-418d-b5bc-487012be465e\") " pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" Dec 12 17:26:27.033108 kubelet[2774]: I1212 17:26:27.033006 2774 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7d7af0-8037-467a-96ff-594249aec230-config-volume\") pod \"coredns-66bc5c9577-88wmf\" (UID: \"5c7d7af0-8037-467a-96ff-594249aec230\") " pod="kube-system/coredns-66bc5c9577-88wmf" Dec 12 17:26:27.033108 kubelet[2774]: I1212 17:26:27.033035 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8cc5a88-2710-4948-9628-460c49c0e518-config-volume\") pod \"coredns-66bc5c9577-xw9gv\" (UID: \"f8cc5a88-2710-4948-9628-460c49c0e518\") " pod="kube-system/coredns-66bc5c9577-xw9gv" Dec 12 17:26:27.033108 kubelet[2774]: I1212 17:26:27.033061 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02827e28-67cb-418d-b5bc-487012be465e-tigera-ca-bundle\") pod \"calico-kube-controllers-7c69bf577-f29bx\" (UID: \"02827e28-67cb-418d-b5bc-487012be465e\") " pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" Dec 12 17:26:27.033293 kubelet[2774]: I1212 17:26:27.033082 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qndjw\" (UniqueName: \"kubernetes.io/projected/f8cc5a88-2710-4948-9628-460c49c0e518-kube-api-access-qndjw\") pod \"coredns-66bc5c9577-xw9gv\" (UID: \"f8cc5a88-2710-4948-9628-460c49c0e518\") " pod="kube-system/coredns-66bc5c9577-xw9gv" Dec 12 17:26:27.033694 kubelet[2774]: I1212 17:26:27.033503 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfl7l\" (UniqueName: \"kubernetes.io/projected/5c7d7af0-8037-467a-96ff-594249aec230-kube-api-access-zfl7l\") pod \"coredns-66bc5c9577-88wmf\" (UID: \"5c7d7af0-8037-467a-96ff-594249aec230\") " pod="kube-system/coredns-66bc5c9577-88wmf" Dec 12 17:26:27.034329 systemd[1]: 
Created slice kubepods-besteffort-podb8ca6b0c_01ec_442d_b526_fb52b3677452.slice - libcontainer container kubepods-besteffort-podb8ca6b0c_01ec_442d_b526_fb52b3677452.slice. Dec 12 17:26:27.049784 systemd[1]: Created slice kubepods-burstable-pod5c7d7af0_8037_467a_96ff_594249aec230.slice - libcontainer container kubepods-burstable-pod5c7d7af0_8037_467a_96ff_594249aec230.slice. Dec 12 17:26:27.060063 systemd[1]: Created slice kubepods-besteffort-pod7c9ca7d5_6db7_4a38_8560_e94f3e1a1488.slice - libcontainer container kubepods-besteffort-pod7c9ca7d5_6db7_4a38_8560_e94f3e1a1488.slice. Dec 12 17:26:27.071850 systemd[1]: Created slice kubepods-besteffort-pod6a3d2e6b_076c_416c_a1cd_cf20e895bb91.slice - libcontainer container kubepods-besteffort-pod6a3d2e6b_076c_416c_a1cd_cf20e895bb91.slice. Dec 12 17:26:27.082039 systemd[1]: Created slice kubepods-besteffort-podddab63ad_c74e_4d1e_9071_4b51ff05d58f.slice - libcontainer container kubepods-besteffort-podddab63ad_c74e_4d1e_9071_4b51ff05d58f.slice. Dec 12 17:26:27.135192 kubelet[2774]: I1212 17:26:27.134460 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a3d2e6b-076c-416c-a1cd-cf20e895bb91-whisker-ca-bundle\") pod \"whisker-5b9666c8df-qpnsc\" (UID: \"6a3d2e6b-076c-416c-a1cd-cf20e895bb91\") " pod="calico-system/whisker-5b9666c8df-qpnsc" Dec 12 17:26:27.135192 kubelet[2774]: I1212 17:26:27.134495 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddab63ad-c74e-4d1e-9071-4b51ff05d58f-config\") pod \"goldmane-7c778bb748-58gsb\" (UID: \"ddab63ad-c74e-4d1e-9071-4b51ff05d58f\") " pod="calico-system/goldmane-7c778bb748-58gsb" Dec 12 17:26:27.136033 kubelet[2774]: I1212 17:26:27.135197 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh6qz\" (UniqueName: 
\"kubernetes.io/projected/ddab63ad-c74e-4d1e-9071-4b51ff05d58f-kube-api-access-fh6qz\") pod \"goldmane-7c778bb748-58gsb\" (UID: \"ddab63ad-c74e-4d1e-9071-4b51ff05d58f\") " pod="calico-system/goldmane-7c778bb748-58gsb" Dec 12 17:26:27.136033 kubelet[2774]: I1212 17:26:27.135276 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6a3d2e6b-076c-416c-a1cd-cf20e895bb91-whisker-backend-key-pair\") pod \"whisker-5b9666c8df-qpnsc\" (UID: \"6a3d2e6b-076c-416c-a1cd-cf20e895bb91\") " pod="calico-system/whisker-5b9666c8df-qpnsc" Dec 12 17:26:27.136033 kubelet[2774]: I1212 17:26:27.135296 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc8tc\" (UniqueName: \"kubernetes.io/projected/6a3d2e6b-076c-416c-a1cd-cf20e895bb91-kube-api-access-mc8tc\") pod \"whisker-5b9666c8df-qpnsc\" (UID: \"6a3d2e6b-076c-416c-a1cd-cf20e895bb91\") " pod="calico-system/whisker-5b9666c8df-qpnsc" Dec 12 17:26:27.136033 kubelet[2774]: I1212 17:26:27.135315 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddab63ad-c74e-4d1e-9071-4b51ff05d58f-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-58gsb\" (UID: \"ddab63ad-c74e-4d1e-9071-4b51ff05d58f\") " pod="calico-system/goldmane-7c778bb748-58gsb" Dec 12 17:26:27.136033 kubelet[2774]: I1212 17:26:27.135331 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/ddab63ad-c74e-4d1e-9071-4b51ff05d58f-goldmane-key-pair\") pod \"goldmane-7c778bb748-58gsb\" (UID: \"ddab63ad-c74e-4d1e-9071-4b51ff05d58f\") " pod="calico-system/goldmane-7c778bb748-58gsb" Dec 12 17:26:27.136165 kubelet[2774]: I1212 17:26:27.135385 2774 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7c9ca7d5-6db7-4a38-8560-e94f3e1a1488-calico-apiserver-certs\") pod \"calico-apiserver-b9d8f9566-tg4kr\" (UID: \"7c9ca7d5-6db7-4a38-8560-e94f3e1a1488\") " pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" Dec 12 17:26:27.136165 kubelet[2774]: I1212 17:26:27.135399 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fdcg\" (UniqueName: \"kubernetes.io/projected/7c9ca7d5-6db7-4a38-8560-e94f3e1a1488-kube-api-access-9fdcg\") pod \"calico-apiserver-b9d8f9566-tg4kr\" (UID: \"7c9ca7d5-6db7-4a38-8560-e94f3e1a1488\") " pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" Dec 12 17:26:27.317936 containerd[1508]: time="2025-12-12T17:26:27.317890510Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-xw9gv,Uid:f8cc5a88-2710-4948-9628-460c49c0e518,Namespace:kube-system,Attempt:0,}" Dec 12 17:26:27.328556 containerd[1508]: time="2025-12-12T17:26:27.327845937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c69bf577-f29bx,Uid:02827e28-67cb-418d-b5bc-487012be465e,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:27.346751 containerd[1508]: time="2025-12-12T17:26:27.346716028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9d8f9566-clxs9,Uid:b8ca6b0c-01ec-442d-b526-fb52b3677452,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:26:27.364408 systemd[1]: Created slice kubepods-besteffort-poddf83329f_2747_4a89_9a6a_7b20123df2cf.slice - libcontainer container kubepods-besteffort-poddf83329f_2747_4a89_9a6a_7b20123df2cf.slice. 
Dec 12 17:26:27.365287 containerd[1508]: time="2025-12-12T17:26:27.363948074Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-88wmf,Uid:5c7d7af0-8037-467a-96ff-594249aec230,Namespace:kube-system,Attempt:0,}" Dec 12 17:26:27.369494 containerd[1508]: time="2025-12-12T17:26:27.369108568Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9d8f9566-tg4kr,Uid:7c9ca7d5-6db7-4a38-8560-e94f3e1a1488,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:26:27.373131 containerd[1508]: time="2025-12-12T17:26:27.372995899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nnwtm,Uid:df83329f-2747-4a89-9a6a-7b20123df2cf,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:27.382100 containerd[1508]: time="2025-12-12T17:26:27.381915443Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b9666c8df-qpnsc,Uid:6a3d2e6b-076c-416c-a1cd-cf20e895bb91,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:27.390312 containerd[1508]: time="2025-12-12T17:26:27.389133222Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-58gsb,Uid:ddab63ad-c74e-4d1e-9071-4b51ff05d58f,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:27.553409 containerd[1508]: time="2025-12-12T17:26:27.553082744Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 12 17:26:27.566723 containerd[1508]: time="2025-12-12T17:26:27.566308420Z" level=error msg="Failed to destroy network for sandbox \"23e9af855c60f70fee9c2d7dc35954b5a84d9a56c39eb1d5016d5a26456ac607\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.567781 containerd[1508]: time="2025-12-12T17:26:27.567741944Z" level=error msg="Failed to destroy network for sandbox \"a1d0c35e3335a7add85c17e586f2cc04fc6b93e93431c8899fe18cba2792be74\"" error="plugin type=\"calico\" failed (delete): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.576947 containerd[1508]: time="2025-12-12T17:26:27.576724248Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c69bf577-f29bx,Uid:02827e28-67cb-418d-b5bc-487012be465e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"23e9af855c60f70fee9c2d7dc35954b5a84d9a56c39eb1d5016d5a26456ac607\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.578105 kubelet[2774]: E1212 17:26:27.577934 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23e9af855c60f70fee9c2d7dc35954b5a84d9a56c39eb1d5016d5a26456ac607\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.579049 kubelet[2774]: E1212 17:26:27.578139 2774 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23e9af855c60f70fee9c2d7dc35954b5a84d9a56c39eb1d5016d5a26456ac607\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" Dec 12 17:26:27.579049 kubelet[2774]: E1212 17:26:27.578160 2774 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23e9af855c60f70fee9c2d7dc35954b5a84d9a56c39eb1d5016d5a26456ac607\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" Dec 12 17:26:27.579049 kubelet[2774]: E1212 17:26:27.578505 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7c69bf577-f29bx_calico-system(02827e28-67cb-418d-b5bc-487012be465e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7c69bf577-f29bx_calico-system(02827e28-67cb-418d-b5bc-487012be465e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"23e9af855c60f70fee9c2d7dc35954b5a84d9a56c39eb1d5016d5a26456ac607\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:26:27.583192 containerd[1508]: time="2025-12-12T17:26:27.582698704Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-88wmf,Uid:5c7d7af0-8037-467a-96ff-594249aec230,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1d0c35e3335a7add85c17e586f2cc04fc6b93e93431c8899fe18cba2792be74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.585462 kubelet[2774]: E1212 17:26:27.585237 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1d0c35e3335a7add85c17e586f2cc04fc6b93e93431c8899fe18cba2792be74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.585462 kubelet[2774]: E1212 17:26:27.585307 2774 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1d0c35e3335a7add85c17e586f2cc04fc6b93e93431c8899fe18cba2792be74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-88wmf" Dec 12 17:26:27.585462 kubelet[2774]: E1212 17:26:27.585325 2774 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1d0c35e3335a7add85c17e586f2cc04fc6b93e93431c8899fe18cba2792be74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-88wmf" Dec 12 17:26:27.586399 kubelet[2774]: E1212 17:26:27.585376 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-88wmf_kube-system(5c7d7af0-8037-467a-96ff-594249aec230)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-88wmf_kube-system(5c7d7af0-8037-467a-96ff-594249aec230)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a1d0c35e3335a7add85c17e586f2cc04fc6b93e93431c8899fe18cba2792be74\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-88wmf" podUID="5c7d7af0-8037-467a-96ff-594249aec230" Dec 12 17:26:27.609013 containerd[1508]: time="2025-12-12T17:26:27.608964575Z" level=error msg="Failed to destroy network for sandbox 
\"fce0650598298fc28b9a0a6ebb82e784de448ea961e8756024c2302e9c169d5f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.615561 containerd[1508]: time="2025-12-12T17:26:27.615432112Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9d8f9566-tg4kr,Uid:7c9ca7d5-6db7-4a38-8560-e94f3e1a1488,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fce0650598298fc28b9a0a6ebb82e784de448ea961e8756024c2302e9c169d5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.616011 kubelet[2774]: E1212 17:26:27.615949 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fce0650598298fc28b9a0a6ebb82e784de448ea961e8756024c2302e9c169d5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.616074 kubelet[2774]: E1212 17:26:27.616021 2774 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fce0650598298fc28b9a0a6ebb82e784de448ea961e8756024c2302e9c169d5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" Dec 12 17:26:27.616074 kubelet[2774]: E1212 17:26:27.616042 2774 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"fce0650598298fc28b9a0a6ebb82e784de448ea961e8756024c2302e9c169d5f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" Dec 12 17:26:27.616074 kubelet[2774]: E1212 17:26:27.616098 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b9d8f9566-tg4kr_calico-apiserver(7c9ca7d5-6db7-4a38-8560-e94f3e1a1488)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b9d8f9566-tg4kr_calico-apiserver(7c9ca7d5-6db7-4a38-8560-e94f3e1a1488)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fce0650598298fc28b9a0a6ebb82e784de448ea961e8756024c2302e9c169d5f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:26:27.621680 containerd[1508]: time="2025-12-12T17:26:27.621439168Z" level=error msg="Failed to destroy network for sandbox \"a12662dd44632d8294bd6327984b8a5664603475f2216a9a73bfdeb486faff0b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.626369 containerd[1508]: time="2025-12-12T17:26:27.626319581Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-xw9gv,Uid:f8cc5a88-2710-4948-9628-460c49c0e518,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a12662dd44632d8294bd6327984b8a5664603475f2216a9a73bfdeb486faff0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.627933 kubelet[2774]: E1212 17:26:27.627871 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a12662dd44632d8294bd6327984b8a5664603475f2216a9a73bfdeb486faff0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.628060 kubelet[2774]: E1212 17:26:27.627957 2774 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a12662dd44632d8294bd6327984b8a5664603475f2216a9a73bfdeb486faff0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-xw9gv" Dec 12 17:26:27.628060 kubelet[2774]: E1212 17:26:27.627977 2774 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a12662dd44632d8294bd6327984b8a5664603475f2216a9a73bfdeb486faff0b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-xw9gv" Dec 12 17:26:27.628060 kubelet[2774]: E1212 17:26:27.628033 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-xw9gv_kube-system(f8cc5a88-2710-4948-9628-460c49c0e518)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-xw9gv_kube-system(f8cc5a88-2710-4948-9628-460c49c0e518)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"a12662dd44632d8294bd6327984b8a5664603475f2216a9a73bfdeb486faff0b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-xw9gv" podUID="f8cc5a88-2710-4948-9628-460c49c0e518" Dec 12 17:26:27.631163 containerd[1508]: time="2025-12-12T17:26:27.631112034Z" level=error msg="Failed to destroy network for sandbox \"6d81c689d7216b309828a22b38061f71a224ab5488a02242e864bdd075dec1f4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.634506 containerd[1508]: time="2025-12-12T17:26:27.634278243Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9d8f9566-clxs9,Uid:b8ca6b0c-01ec-442d-b526-fb52b3677452,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d81c689d7216b309828a22b38061f71a224ab5488a02242e864bdd075dec1f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.635418 kubelet[2774]: E1212 17:26:27.634562 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d81c689d7216b309828a22b38061f71a224ab5488a02242e864bdd075dec1f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.635418 kubelet[2774]: E1212 17:26:27.634648 2774 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6d81c689d7216b309828a22b38061f71a224ab5488a02242e864bdd075dec1f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" Dec 12 17:26:27.635418 kubelet[2774]: E1212 17:26:27.634671 2774 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d81c689d7216b309828a22b38061f71a224ab5488a02242e864bdd075dec1f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" Dec 12 17:26:27.635550 kubelet[2774]: E1212 17:26:27.634722 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b9d8f9566-clxs9_calico-apiserver(b8ca6b0c-01ec-442d-b526-fb52b3677452)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b9d8f9566-clxs9_calico-apiserver(b8ca6b0c-01ec-442d-b526-fb52b3677452)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6d81c689d7216b309828a22b38061f71a224ab5488a02242e864bdd075dec1f4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:26:27.654677 containerd[1508]: time="2025-12-12T17:26:27.654395417Z" level=error msg="Failed to destroy network for sandbox \"516ea57e898a064f58b64910b747c9b5ecae155b45736d7304f85dc4e977e9ac\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Dec 12 17:26:27.655192 containerd[1508]: time="2025-12-12T17:26:27.654947819Z" level=error msg="Failed to destroy network for sandbox \"c16e3690b806ff69c649e01ebb52837a7eb13d5548dd7a5d55b5bfe23a124529\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.657984 containerd[1508]: time="2025-12-12T17:26:27.657929187Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nnwtm,Uid:df83329f-2747-4a89-9a6a-7b20123df2cf,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"516ea57e898a064f58b64910b747c9b5ecae155b45736d7304f85dc4e977e9ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.658233 kubelet[2774]: E1212 17:26:27.658171 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"516ea57e898a064f58b64910b747c9b5ecae155b45736d7304f85dc4e977e9ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.658233 kubelet[2774]: E1212 17:26:27.658228 2774 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"516ea57e898a064f58b64910b747c9b5ecae155b45736d7304f85dc4e977e9ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-nnwtm" Dec 12 17:26:27.658388 kubelet[2774]: E1212 17:26:27.658256 
2774 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"516ea57e898a064f58b64910b747c9b5ecae155b45736d7304f85dc4e977e9ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-nnwtm" Dec 12 17:26:27.658388 kubelet[2774]: E1212 17:26:27.658306 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-nnwtm_calico-system(df83329f-2747-4a89-9a6a-7b20123df2cf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-nnwtm_calico-system(df83329f-2747-4a89-9a6a-7b20123df2cf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"516ea57e898a064f58b64910b747c9b5ecae155b45736d7304f85dc4e977e9ac\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:26:27.659230 containerd[1508]: time="2025-12-12T17:26:27.659192270Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b9666c8df-qpnsc,Uid:6a3d2e6b-076c-416c-a1cd-cf20e895bb91,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c16e3690b806ff69c649e01ebb52837a7eb13d5548dd7a5d55b5bfe23a124529\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.661015 kubelet[2774]: E1212 17:26:27.660965 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"c16e3690b806ff69c649e01ebb52837a7eb13d5548dd7a5d55b5bfe23a124529\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.661106 kubelet[2774]: E1212 17:26:27.661034 2774 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c16e3690b806ff69c649e01ebb52837a7eb13d5548dd7a5d55b5bfe23a124529\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5b9666c8df-qpnsc" Dec 12 17:26:27.661106 kubelet[2774]: E1212 17:26:27.661069 2774 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c16e3690b806ff69c649e01ebb52837a7eb13d5548dd7a5d55b5bfe23a124529\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5b9666c8df-qpnsc" Dec 12 17:26:27.661493 kubelet[2774]: E1212 17:26:27.661227 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5b9666c8df-qpnsc_calico-system(6a3d2e6b-076c-416c-a1cd-cf20e895bb91)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5b9666c8df-qpnsc_calico-system(6a3d2e6b-076c-416c-a1cd-cf20e895bb91)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c16e3690b806ff69c649e01ebb52837a7eb13d5548dd7a5d55b5bfe23a124529\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5b9666c8df-qpnsc" 
podUID="6a3d2e6b-076c-416c-a1cd-cf20e895bb91" Dec 12 17:26:27.668508 containerd[1508]: time="2025-12-12T17:26:27.668396695Z" level=error msg="Failed to destroy network for sandbox \"f999759d4eb8dff3ac888c346b7189b1bb99eceb3cf064d1717bca5a4496cb5e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.670416 containerd[1508]: time="2025-12-12T17:26:27.670371500Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-58gsb,Uid:ddab63ad-c74e-4d1e-9071-4b51ff05d58f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f999759d4eb8dff3ac888c346b7189b1bb99eceb3cf064d1717bca5a4496cb5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.671941 kubelet[2774]: E1212 17:26:27.671896 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f999759d4eb8dff3ac888c346b7189b1bb99eceb3cf064d1717bca5a4496cb5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:26:27.672151 kubelet[2774]: E1212 17:26:27.672102 2774 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f999759d4eb8dff3ac888c346b7189b1bb99eceb3cf064d1717bca5a4496cb5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-58gsb" Dec 12 17:26:27.672151 kubelet[2774]: E1212 
17:26:27.672130 2774 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f999759d4eb8dff3ac888c346b7189b1bb99eceb3cf064d1717bca5a4496cb5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-58gsb" Dec 12 17:26:27.672393 kubelet[2774]: E1212 17:26:27.672296 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-58gsb_calico-system(ddab63ad-c74e-4d1e-9071-4b51ff05d58f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-58gsb_calico-system(ddab63ad-c74e-4d1e-9071-4b51ff05d58f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f999759d4eb8dff3ac888c346b7189b1bb99eceb3cf064d1717bca5a4496cb5e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:26:32.102411 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2567980883.mount: Deactivated successfully. 
Dec 12 17:26:32.139423 containerd[1508]: time="2025-12-12T17:26:32.139355488Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150934562" Dec 12 17:26:32.144004 containerd[1508]: time="2025-12-12T17:26:32.143951781Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 4.590822797s" Dec 12 17:26:32.144004 containerd[1508]: time="2025-12-12T17:26:32.143997141Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 12 17:26:32.147377 containerd[1508]: time="2025-12-12T17:26:32.147165270Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:32.148752 containerd[1508]: time="2025-12-12T17:26:32.148533114Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:32.150864 containerd[1508]: time="2025-12-12T17:26:32.150812720Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:26:32.170981 containerd[1508]: time="2025-12-12T17:26:32.170937697Z" level=info msg="CreateContainer within sandbox \"1306ed6e479b5021fd13cd9befe59da8cce68f1e76e0d016f09831904df142dc\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 12 17:26:32.186688 containerd[1508]: time="2025-12-12T17:26:32.184313615Z" level=info msg="Container 
e98c188cdc865eb9b76f9bd28a9a889e20ea9ce0a5a4fa2513f9e4ee0c7e58c9: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:32.185115 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4047778652.mount: Deactivated successfully. Dec 12 17:26:32.202175 containerd[1508]: time="2025-12-12T17:26:32.202050386Z" level=info msg="CreateContainer within sandbox \"1306ed6e479b5021fd13cd9befe59da8cce68f1e76e0d016f09831904df142dc\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"e98c188cdc865eb9b76f9bd28a9a889e20ea9ce0a5a4fa2513f9e4ee0c7e58c9\"" Dec 12 17:26:32.203679 containerd[1508]: time="2025-12-12T17:26:32.202781388Z" level=info msg="StartContainer for \"e98c188cdc865eb9b76f9bd28a9a889e20ea9ce0a5a4fa2513f9e4ee0c7e58c9\"" Dec 12 17:26:32.204736 containerd[1508]: time="2025-12-12T17:26:32.204698833Z" level=info msg="connecting to shim e98c188cdc865eb9b76f9bd28a9a889e20ea9ce0a5a4fa2513f9e4ee0c7e58c9" address="unix:///run/containerd/s/0fc2b361f4217aee53a58d7c4191bcb63a8a2d28d54776cfa3fee3719a838d6a" protocol=ttrpc version=3 Dec 12 17:26:32.274054 systemd[1]: Started cri-containerd-e98c188cdc865eb9b76f9bd28a9a889e20ea9ce0a5a4fa2513f9e4ee0c7e58c9.scope - libcontainer container e98c188cdc865eb9b76f9bd28a9a889e20ea9ce0a5a4fa2513f9e4ee0c7e58c9. Dec 12 17:26:32.360746 containerd[1508]: time="2025-12-12T17:26:32.360614395Z" level=info msg="StartContainer for \"e98c188cdc865eb9b76f9bd28a9a889e20ea9ce0a5a4fa2513f9e4ee0c7e58c9\" returns successfully" Dec 12 17:26:32.502643 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 12 17:26:32.502764 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Dec 12 17:26:32.608401 kubelet[2774]: I1212 17:26:32.608313 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-xgkvx" podStartSLOduration=2.555856345 podStartE2EDuration="17.608292416s" podCreationTimestamp="2025-12-12 17:26:15 +0000 UTC" firstStartedPulling="2025-12-12 17:26:17.092374752 +0000 UTC m=+25.883556620" lastFinishedPulling="2025-12-12 17:26:32.144810823 +0000 UTC m=+40.935992691" observedRunningTime="2025-12-12 17:26:32.607356894 +0000 UTC m=+41.398538762" watchObservedRunningTime="2025-12-12 17:26:32.608292416 +0000 UTC m=+41.399474244" Dec 12 17:26:32.783233 kubelet[2774]: I1212 17:26:32.783141 2774 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a3d2e6b-076c-416c-a1cd-cf20e895bb91-whisker-ca-bundle\") pod \"6a3d2e6b-076c-416c-a1cd-cf20e895bb91\" (UID: \"6a3d2e6b-076c-416c-a1cd-cf20e895bb91\") " Dec 12 17:26:32.783637 kubelet[2774]: I1212 17:26:32.783217 2774 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6a3d2e6b-076c-416c-a1cd-cf20e895bb91-whisker-backend-key-pair\") pod \"6a3d2e6b-076c-416c-a1cd-cf20e895bb91\" (UID: \"6a3d2e6b-076c-416c-a1cd-cf20e895bb91\") " Dec 12 17:26:32.783637 kubelet[2774]: I1212 17:26:32.783433 2774 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc8tc\" (UniqueName: \"kubernetes.io/projected/6a3d2e6b-076c-416c-a1cd-cf20e895bb91-kube-api-access-mc8tc\") pod \"6a3d2e6b-076c-416c-a1cd-cf20e895bb91\" (UID: \"6a3d2e6b-076c-416c-a1cd-cf20e895bb91\") " Dec 12 17:26:32.783637 kubelet[2774]: I1212 17:26:32.783592 2774 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a3d2e6b-076c-416c-a1cd-cf20e895bb91-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod 
"6a3d2e6b-076c-416c-a1cd-cf20e895bb91" (UID: "6a3d2e6b-076c-416c-a1cd-cf20e895bb91"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 12 17:26:32.788780 kubelet[2774]: I1212 17:26:32.788723 2774 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a3d2e6b-076c-416c-a1cd-cf20e895bb91-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "6a3d2e6b-076c-416c-a1cd-cf20e895bb91" (UID: "6a3d2e6b-076c-416c-a1cd-cf20e895bb91"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 12 17:26:32.790375 kubelet[2774]: I1212 17:26:32.790071 2774 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3d2e6b-076c-416c-a1cd-cf20e895bb91-kube-api-access-mc8tc" (OuterVolumeSpecName: "kube-api-access-mc8tc") pod "6a3d2e6b-076c-416c-a1cd-cf20e895bb91" (UID: "6a3d2e6b-076c-416c-a1cd-cf20e895bb91"). InnerVolumeSpecName "kube-api-access-mc8tc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 12 17:26:32.884840 kubelet[2774]: I1212 17:26:32.884788 2774 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a3d2e6b-076c-416c-a1cd-cf20e895bb91-whisker-ca-bundle\") on node \"ci-4459-2-2-1-a1e622265d\" DevicePath \"\"" Dec 12 17:26:32.884840 kubelet[2774]: I1212 17:26:32.884842 2774 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6a3d2e6b-076c-416c-a1cd-cf20e895bb91-whisker-backend-key-pair\") on node \"ci-4459-2-2-1-a1e622265d\" DevicePath \"\"" Dec 12 17:26:32.885078 kubelet[2774]: I1212 17:26:32.884856 2774 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mc8tc\" (UniqueName: \"kubernetes.io/projected/6a3d2e6b-076c-416c-a1cd-cf20e895bb91-kube-api-access-mc8tc\") on node \"ci-4459-2-2-1-a1e622265d\" DevicePath \"\"" Dec 12 17:26:33.105955 systemd[1]: var-lib-kubelet-pods-6a3d2e6b\x2d076c\x2d416c\x2da1cd\x2dcf20e895bb91-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dmc8tc.mount: Deactivated successfully. Dec 12 17:26:33.106106 systemd[1]: var-lib-kubelet-pods-6a3d2e6b\x2d076c\x2d416c\x2da1cd\x2dcf20e895bb91-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 12 17:26:33.365148 systemd[1]: Removed slice kubepods-besteffort-pod6a3d2e6b_076c_416c_a1cd_cf20e895bb91.slice - libcontainer container kubepods-besteffort-pod6a3d2e6b_076c_416c_a1cd_cf20e895bb91.slice. Dec 12 17:26:33.585736 kubelet[2774]: I1212 17:26:33.585663 2774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 17:26:33.683033 systemd[1]: Created slice kubepods-besteffort-pod73777a70_bcad_484a_aa29_795f5b22b302.slice - libcontainer container kubepods-besteffort-pod73777a70_bcad_484a_aa29_795f5b22b302.slice. 
Dec 12 17:26:33.791420 kubelet[2774]: I1212 17:26:33.791339 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtsv2\" (UniqueName: \"kubernetes.io/projected/73777a70-bcad-484a-aa29-795f5b22b302-kube-api-access-jtsv2\") pod \"whisker-fd8bd9d8f-pgxdr\" (UID: \"73777a70-bcad-484a-aa29-795f5b22b302\") " pod="calico-system/whisker-fd8bd9d8f-pgxdr" Dec 12 17:26:33.792323 kubelet[2774]: I1212 17:26:33.792048 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73777a70-bcad-484a-aa29-795f5b22b302-whisker-ca-bundle\") pod \"whisker-fd8bd9d8f-pgxdr\" (UID: \"73777a70-bcad-484a-aa29-795f5b22b302\") " pod="calico-system/whisker-fd8bd9d8f-pgxdr" Dec 12 17:26:33.792323 kubelet[2774]: I1212 17:26:33.792174 2774 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/73777a70-bcad-484a-aa29-795f5b22b302-whisker-backend-key-pair\") pod \"whisker-fd8bd9d8f-pgxdr\" (UID: \"73777a70-bcad-484a-aa29-795f5b22b302\") " pod="calico-system/whisker-fd8bd9d8f-pgxdr" Dec 12 17:26:33.992036 containerd[1508]: time="2025-12-12T17:26:33.991959081Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fd8bd9d8f-pgxdr,Uid:73777a70-bcad-484a-aa29-795f5b22b302,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:34.301507 systemd-networkd[1418]: cali3e4faf284c3: Link UP Dec 12 17:26:34.302855 systemd-networkd[1418]: cali3e4faf284c3: Gained carrier Dec 12 17:26:34.329652 containerd[1508]: 2025-12-12 17:26:34.047 [INFO][3896] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 17:26:34.329652 containerd[1508]: 2025-12-12 17:26:34.131 [INFO][3896] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-eth0 whisker-fd8bd9d8f- calico-system 73777a70-bcad-484a-aa29-795f5b22b302 882 0 2025-12-12 17:26:33 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:fd8bd9d8f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-1-a1e622265d whisker-fd8bd9d8f-pgxdr eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali3e4faf284c3 [] [] }} ContainerID="0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" Namespace="calico-system" Pod="whisker-fd8bd9d8f-pgxdr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-" Dec 12 17:26:34.329652 containerd[1508]: 2025-12-12 17:26:34.131 [INFO][3896] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" Namespace="calico-system" Pod="whisker-fd8bd9d8f-pgxdr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-eth0" Dec 12 17:26:34.329652 containerd[1508]: 2025-12-12 17:26:34.214 [INFO][3951] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" HandleID="k8s-pod-network.0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" Workload="ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-eth0" Dec 12 17:26:34.330681 containerd[1508]: 2025-12-12 17:26:34.214 [INFO][3951] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" HandleID="k8s-pod-network.0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" Workload="ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024afe0), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4459-2-2-1-a1e622265d", "pod":"whisker-fd8bd9d8f-pgxdr", "timestamp":"2025-12-12 17:26:34.214388602 +0000 UTC"}, Hostname:"ci-4459-2-2-1-a1e622265d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:34.330681 containerd[1508]: 2025-12-12 17:26:34.214 [INFO][3951] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:34.330681 containerd[1508]: 2025-12-12 17:26:34.214 [INFO][3951] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:26:34.330681 containerd[1508]: 2025-12-12 17:26:34.214 [INFO][3951] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-1-a1e622265d' Dec 12 17:26:34.330681 containerd[1508]: 2025-12-12 17:26:34.232 [INFO][3951] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:34.330681 containerd[1508]: 2025-12-12 17:26:34.243 [INFO][3951] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:34.330681 containerd[1508]: 2025-12-12 17:26:34.252 [INFO][3951] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:34.330681 containerd[1508]: 2025-12-12 17:26:34.256 [INFO][3951] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:34.330681 containerd[1508]: 2025-12-12 17:26:34.260 [INFO][3951] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:34.331021 containerd[1508]: 2025-12-12 17:26:34.260 [INFO][3951] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 
handle="k8s-pod-network.0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:34.331021 containerd[1508]: 2025-12-12 17:26:34.263 [INFO][3951] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da Dec 12 17:26:34.331021 containerd[1508]: 2025-12-12 17:26:34.270 [INFO][3951] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:34.331021 containerd[1508]: 2025-12-12 17:26:34.281 [INFO][3951] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.72.1/26] block=192.168.72.0/26 handle="k8s-pod-network.0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:34.331021 containerd[1508]: 2025-12-12 17:26:34.281 [INFO][3951] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.1/26] handle="k8s-pod-network.0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:34.331021 containerd[1508]: 2025-12-12 17:26:34.281 [INFO][3951] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 17:26:34.331021 containerd[1508]: 2025-12-12 17:26:34.281 [INFO][3951] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.1/26] IPv6=[] ContainerID="0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" HandleID="k8s-pod-network.0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" Workload="ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-eth0" Dec 12 17:26:34.331466 containerd[1508]: 2025-12-12 17:26:34.286 [INFO][3896] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" Namespace="calico-system" Pod="whisker-fd8bd9d8f-pgxdr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-eth0", GenerateName:"whisker-fd8bd9d8f-", Namespace:"calico-system", SelfLink:"", UID:"73777a70-bcad-484a-aa29-795f5b22b302", ResourceVersion:"882", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"fd8bd9d8f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"", Pod:"whisker-fd8bd9d8f-pgxdr", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.72.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali3e4faf284c3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:34.331466 containerd[1508]: 2025-12-12 17:26:34.286 [INFO][3896] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.1/32] ContainerID="0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" Namespace="calico-system" Pod="whisker-fd8bd9d8f-pgxdr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-eth0" Dec 12 17:26:34.332004 containerd[1508]: 2025-12-12 17:26:34.286 [INFO][3896] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3e4faf284c3 ContainerID="0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" Namespace="calico-system" Pod="whisker-fd8bd9d8f-pgxdr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-eth0" Dec 12 17:26:34.332004 containerd[1508]: 2025-12-12 17:26:34.304 [INFO][3896] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" Namespace="calico-system" Pod="whisker-fd8bd9d8f-pgxdr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-eth0" Dec 12 17:26:34.332599 containerd[1508]: 2025-12-12 17:26:34.306 [INFO][3896] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" Namespace="calico-system" Pod="whisker-fd8bd9d8f-pgxdr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-eth0", GenerateName:"whisker-fd8bd9d8f-", Namespace:"calico-system", SelfLink:"", UID:"73777a70-bcad-484a-aa29-795f5b22b302", 
ResourceVersion:"882", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"fd8bd9d8f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da", Pod:"whisker-fd8bd9d8f-pgxdr", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.72.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3e4faf284c3", MAC:"82:99:6e:35:d2:77", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:34.333749 containerd[1508]: 2025-12-12 17:26:34.324 [INFO][3896] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" Namespace="calico-system" Pod="whisker-fd8bd9d8f-pgxdr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-whisker--fd8bd9d8f--pgxdr-eth0" Dec 12 17:26:34.393909 containerd[1508]: time="2025-12-12T17:26:34.393812999Z" level=info msg="connecting to shim 0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da" address="unix:///run/containerd/s/cd98695279ce16210d2fbc9d7af2a555eaeec0e8718e0972b82cfff9ddd07a2e" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:34.479942 systemd[1]: Started cri-containerd-0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da.scope - libcontainer 
container 0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da. Dec 12 17:26:34.578411 containerd[1508]: time="2025-12-12T17:26:34.578241690Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fd8bd9d8f-pgxdr,Uid:73777a70-bcad-484a-aa29-795f5b22b302,Namespace:calico-system,Attempt:0,} returns sandbox id \"0d8f3b6ae4e9a75f7b7c4f75ea4251e7710dff3ee9f98c52233c2e0ad04ef8da\"" Dec 12 17:26:34.583308 containerd[1508]: time="2025-12-12T17:26:34.583261785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:26:34.916488 containerd[1508]: time="2025-12-12T17:26:34.916144984Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:34.917805 containerd[1508]: time="2025-12-12T17:26:34.917741389Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:26:34.917934 containerd[1508]: time="2025-12-12T17:26:34.917869549Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:26:34.920298 kubelet[2774]: E1212 17:26:34.920157 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:26:34.920298 kubelet[2774]: E1212 17:26:34.920269 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:26:34.921418 kubelet[2774]: E1212 17:26:34.920359 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-fd8bd9d8f-pgxdr_calico-system(73777a70-bcad-484a-aa29-795f5b22b302): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:34.923509 containerd[1508]: time="2025-12-12T17:26:34.923463005Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:26:35.110867 kubelet[2774]: I1212 17:26:35.110816 2774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 17:26:35.283083 containerd[1508]: time="2025-12-12T17:26:35.282693287Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:35.285244 containerd[1508]: time="2025-12-12T17:26:35.284665613Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:26:35.285244 containerd[1508]: time="2025-12-12T17:26:35.284782293Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:26:35.285411 kubelet[2774]: E1212 17:26:35.284947 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:26:35.285411 kubelet[2774]: E1212 17:26:35.285036 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:26:35.285411 kubelet[2774]: E1212 17:26:35.285111 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-fd8bd9d8f-pgxdr_calico-system(73777a70-bcad-484a-aa29-795f5b22b302): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:35.285511 kubelet[2774]: E1212 17:26:35.285157 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:26:35.358958 kubelet[2774]: I1212 
17:26:35.358848 2774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3d2e6b-076c-416c-a1cd-cf20e895bb91" path="/var/lib/kubelet/pods/6a3d2e6b-076c-416c-a1cd-cf20e895bb91/volumes" Dec 12 17:26:35.596459 kubelet[2774]: E1212 17:26:35.596228 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:26:36.004766 systemd-networkd[1418]: vxlan.calico: Link UP Dec 12 17:26:36.004775 systemd-networkd[1418]: vxlan.calico: Gained carrier Dec 12 17:26:36.291895 systemd-networkd[1418]: cali3e4faf284c3: Gained IPv6LL Dec 12 17:26:37.698916 systemd-networkd[1418]: vxlan.calico: Gained IPv6LL Dec 12 17:26:40.358048 containerd[1508]: time="2025-12-12T17:26:40.357951091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nnwtm,Uid:df83329f-2747-4a89-9a6a-7b20123df2cf,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:40.506360 systemd-networkd[1418]: calif435fcd0075: Link UP Dec 12 17:26:40.508459 systemd-networkd[1418]: calif435fcd0075: Gained carrier Dec 12 17:26:40.529217 containerd[1508]: 2025-12-12 
17:26:40.409 [INFO][4156] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-eth0 csi-node-driver- calico-system df83329f-2747-4a89-9a6a-7b20123df2cf 717 0 2025-12-12 17:26:15 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-2-1-a1e622265d csi-node-driver-nnwtm eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif435fcd0075 [] [] }} ContainerID="411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" Namespace="calico-system" Pod="csi-node-driver-nnwtm" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-" Dec 12 17:26:40.529217 containerd[1508]: 2025-12-12 17:26:40.409 [INFO][4156] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" Namespace="calico-system" Pod="csi-node-driver-nnwtm" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-eth0" Dec 12 17:26:40.529217 containerd[1508]: 2025-12-12 17:26:40.445 [INFO][4165] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" HandleID="k8s-pod-network.411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" Workload="ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-eth0" Dec 12 17:26:40.530007 containerd[1508]: 2025-12-12 17:26:40.446 [INFO][4165] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" HandleID="k8s-pod-network.411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" 
Workload="ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ab440), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-1-a1e622265d", "pod":"csi-node-driver-nnwtm", "timestamp":"2025-12-12 17:26:40.445964236 +0000 UTC"}, Hostname:"ci-4459-2-2-1-a1e622265d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:40.530007 containerd[1508]: 2025-12-12 17:26:40.446 [INFO][4165] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:40.530007 containerd[1508]: 2025-12-12 17:26:40.446 [INFO][4165] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:26:40.530007 containerd[1508]: 2025-12-12 17:26:40.446 [INFO][4165] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-1-a1e622265d' Dec 12 17:26:40.530007 containerd[1508]: 2025-12-12 17:26:40.458 [INFO][4165] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:40.530007 containerd[1508]: 2025-12-12 17:26:40.464 [INFO][4165] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:40.530007 containerd[1508]: 2025-12-12 17:26:40.472 [INFO][4165] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:40.530007 containerd[1508]: 2025-12-12 17:26:40.475 [INFO][4165] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:40.530007 containerd[1508]: 2025-12-12 17:26:40.478 [INFO][4165] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:40.530218 
containerd[1508]: 2025-12-12 17:26:40.479 [INFO][4165] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:40.530218 containerd[1508]: 2025-12-12 17:26:40.481 [INFO][4165] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387 Dec 12 17:26:40.530218 containerd[1508]: 2025-12-12 17:26:40.489 [INFO][4165] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:40.530218 containerd[1508]: 2025-12-12 17:26:40.499 [INFO][4165] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.72.2/26] block=192.168.72.0/26 handle="k8s-pod-network.411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:40.530218 containerd[1508]: 2025-12-12 17:26:40.499 [INFO][4165] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.2/26] handle="k8s-pod-network.411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:40.530218 containerd[1508]: 2025-12-12 17:26:40.499 [INFO][4165] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 17:26:40.530218 containerd[1508]: 2025-12-12 17:26:40.499 [INFO][4165] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.2/26] IPv6=[] ContainerID="411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" HandleID="k8s-pod-network.411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" Workload="ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-eth0" Dec 12 17:26:40.530362 containerd[1508]: 2025-12-12 17:26:40.502 [INFO][4156] cni-plugin/k8s.go 418: Populated endpoint ContainerID="411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" Namespace="calico-system" Pod="csi-node-driver-nnwtm" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"df83329f-2747-4a89-9a6a-7b20123df2cf", ResourceVersion:"717", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"", Pod:"csi-node-driver-nnwtm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.72.2/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif435fcd0075", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:40.530416 containerd[1508]: 2025-12-12 17:26:40.502 [INFO][4156] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.2/32] ContainerID="411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" Namespace="calico-system" Pod="csi-node-driver-nnwtm" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-eth0" Dec 12 17:26:40.530416 containerd[1508]: 2025-12-12 17:26:40.503 [INFO][4156] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif435fcd0075 ContainerID="411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" Namespace="calico-system" Pod="csi-node-driver-nnwtm" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-eth0" Dec 12 17:26:40.530416 containerd[1508]: 2025-12-12 17:26:40.508 [INFO][4156] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" Namespace="calico-system" Pod="csi-node-driver-nnwtm" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-eth0" Dec 12 17:26:40.530480 containerd[1508]: 2025-12-12 17:26:40.509 [INFO][4156] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" Namespace="calico-system" Pod="csi-node-driver-nnwtm" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-eth0", 
GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"df83329f-2747-4a89-9a6a-7b20123df2cf", ResourceVersion:"717", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387", Pod:"csi-node-driver-nnwtm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.72.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif435fcd0075", MAC:"96:76:b4:90:bb:9f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:40.530529 containerd[1508]: 2025-12-12 17:26:40.523 [INFO][4156] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" Namespace="calico-system" Pod="csi-node-driver-nnwtm" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-csi--node--driver--nnwtm-eth0" Dec 12 17:26:40.565147 containerd[1508]: time="2025-12-12T17:26:40.565064515Z" level=info msg="connecting to shim 411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387" 
address="unix:///run/containerd/s/c4d70bd95676694b11db1381f976867ab6228c3f35044ad14832d139121899ac" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:40.607790 systemd[1]: Started cri-containerd-411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387.scope - libcontainer container 411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387. Dec 12 17:26:40.647285 containerd[1508]: time="2025-12-12T17:26:40.647223882Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nnwtm,Uid:df83329f-2747-4a89-9a6a-7b20123df2cf,Namespace:calico-system,Attempt:0,} returns sandbox id \"411d7fced35de72e5a764e6a910d8cfda21d1536a620cf0b4a77aa2a532c0387\"" Dec 12 17:26:40.649866 containerd[1508]: time="2025-12-12T17:26:40.649802010Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:26:41.183658 containerd[1508]: time="2025-12-12T17:26:41.183409901Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:41.185412 containerd[1508]: time="2025-12-12T17:26:41.185286506Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:26:41.185740 containerd[1508]: time="2025-12-12T17:26:41.185373427Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:26:41.186661 kubelet[2774]: E1212 17:26:41.186566 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:26:41.187435 kubelet[2774]: E1212 17:26:41.187005 2774 
kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:26:41.187435 kubelet[2774]: E1212 17:26:41.187151 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-nnwtm_calico-system(df83329f-2747-4a89-9a6a-7b20123df2cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:41.189503 containerd[1508]: time="2025-12-12T17:26:41.188989278Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:26:41.359337 containerd[1508]: time="2025-12-12T17:26:41.358754952Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9d8f9566-tg4kr,Uid:7c9ca7d5-6db7-4a38-8560-e94f3e1a1488,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:26:41.361603 containerd[1508]: time="2025-12-12T17:26:41.361545561Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-xw9gv,Uid:f8cc5a88-2710-4948-9628-460c49c0e518,Namespace:kube-system,Attempt:0,}" Dec 12 17:26:41.362100 containerd[1508]: time="2025-12-12T17:26:41.361569081Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-88wmf,Uid:5c7d7af0-8037-467a-96ff-594249aec230,Namespace:kube-system,Attempt:0,}" Dec 12 17:26:41.363648 containerd[1508]: time="2025-12-12T17:26:41.363576527Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c69bf577-f29bx,Uid:02827e28-67cb-418d-b5bc-487012be465e,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:41.530438 containerd[1508]: 
time="2025-12-12T17:26:41.530374073Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:41.541897 containerd[1508]: time="2025-12-12T17:26:41.541793467Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:26:41.542179 containerd[1508]: time="2025-12-12T17:26:41.541919668Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:26:41.542316 kubelet[2774]: E1212 17:26:41.542221 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:26:41.542395 kubelet[2774]: E1212 17:26:41.542324 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:26:41.542529 kubelet[2774]: E1212 17:26:41.542506 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-nnwtm_calico-system(df83329f-2747-4a89-9a6a-7b20123df2cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:41.543155 kubelet[2774]: E1212 17:26:41.543070 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:26:41.627020 kubelet[2774]: E1212 17:26:41.626671 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:26:41.667684 systemd-networkd[1418]: cali99da4534265: Link UP Dec 12 17:26:41.669080 systemd-networkd[1418]: cali99da4534265: Gained carrier Dec 12 17:26:41.693228 containerd[1508]: 2025-12-12 17:26:41.456 [INFO][4233] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-eth0 calico-apiserver-b9d8f9566- calico-apiserver 7c9ca7d5-6db7-4a38-8560-e94f3e1a1488 813 0 2025-12-12 17:26:06 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b9d8f9566 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-1-a1e622265d calico-apiserver-b9d8f9566-tg4kr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali99da4534265 [] [] }} ContainerID="38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-tg4kr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-" Dec 12 17:26:41.693228 containerd[1508]: 2025-12-12 17:26:41.456 [INFO][4233] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-tg4kr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-eth0" Dec 12 17:26:41.693228 containerd[1508]: 2025-12-12 17:26:41.537 [INFO][4278] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" HandleID="k8s-pod-network.38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" Workload="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-eth0" Dec 12 17:26:41.693455 containerd[1508]: 2025-12-12 17:26:41.537 [INFO][4278] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" HandleID="k8s-pod-network.38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" Workload="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000353750), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-1-a1e622265d", "pod":"calico-apiserver-b9d8f9566-tg4kr", "timestamp":"2025-12-12 17:26:41.537443814 +0000 UTC"}, Hostname:"ci-4459-2-2-1-a1e622265d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:41.693455 containerd[1508]: 2025-12-12 17:26:41.537 [INFO][4278] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:41.693455 containerd[1508]: 2025-12-12 17:26:41.537 [INFO][4278] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:26:41.693455 containerd[1508]: 2025-12-12 17:26:41.538 [INFO][4278] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-1-a1e622265d' Dec 12 17:26:41.693455 containerd[1508]: 2025-12-12 17:26:41.556 [INFO][4278] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.693455 containerd[1508]: 2025-12-12 17:26:41.568 [INFO][4278] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.693455 containerd[1508]: 2025-12-12 17:26:41.584 [INFO][4278] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.693455 containerd[1508]: 2025-12-12 17:26:41.589 [INFO][4278] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.693455 containerd[1508]: 2025-12-12 17:26:41.599 [INFO][4278] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.693670 containerd[1508]: 2025-12-12 17:26:41.599 [INFO][4278] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.693670 containerd[1508]: 2025-12-12 17:26:41.607 [INFO][4278] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab Dec 12 17:26:41.693670 containerd[1508]: 2025-12-12 17:26:41.645 [INFO][4278] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.693670 containerd[1508]: 2025-12-12 17:26:41.659 [INFO][4278] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.72.3/26] block=192.168.72.0/26 handle="k8s-pod-network.38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.693670 containerd[1508]: 2025-12-12 17:26:41.659 [INFO][4278] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.3/26] handle="k8s-pod-network.38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.693670 containerd[1508]: 2025-12-12 17:26:41.659 [INFO][4278] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:26:41.693670 containerd[1508]: 2025-12-12 17:26:41.659 [INFO][4278] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.3/26] IPv6=[] ContainerID="38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" HandleID="k8s-pod-network.38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" Workload="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-eth0" Dec 12 17:26:41.693808 containerd[1508]: 2025-12-12 17:26:41.663 [INFO][4233] cni-plugin/k8s.go 418: Populated endpoint ContainerID="38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-tg4kr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-eth0", GenerateName:"calico-apiserver-b9d8f9566-", Namespace:"calico-apiserver", SelfLink:"", UID:"7c9ca7d5-6db7-4a38-8560-e94f3e1a1488", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"b9d8f9566", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"", Pod:"calico-apiserver-b9d8f9566-tg4kr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali99da4534265", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:41.693859 containerd[1508]: 2025-12-12 17:26:41.663 [INFO][4233] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.3/32] ContainerID="38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-tg4kr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-eth0" Dec 12 17:26:41.693859 containerd[1508]: 2025-12-12 17:26:41.663 [INFO][4233] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali99da4534265 ContainerID="38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-tg4kr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-eth0" Dec 12 17:26:41.693859 containerd[1508]: 2025-12-12 17:26:41.672 [INFO][4233] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-tg4kr" 
WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-eth0" Dec 12 17:26:41.693922 containerd[1508]: 2025-12-12 17:26:41.673 [INFO][4233] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-tg4kr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-eth0", GenerateName:"calico-apiserver-b9d8f9566-", Namespace:"calico-apiserver", SelfLink:"", UID:"7c9ca7d5-6db7-4a38-8560-e94f3e1a1488", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b9d8f9566", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab", Pod:"calico-apiserver-b9d8f9566-tg4kr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali99da4534265", MAC:"1a:cb:00:88:5d:e8", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:41.693968 containerd[1508]: 2025-12-12 17:26:41.687 [INFO][4233] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-tg4kr" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--tg4kr-eth0" Dec 12 17:26:41.761325 systemd-networkd[1418]: cali4a2c90f28d0: Link UP Dec 12 17:26:41.761980 systemd-networkd[1418]: cali4a2c90f28d0: Gained carrier Dec 12 17:26:41.765418 containerd[1508]: time="2025-12-12T17:26:41.764000501Z" level=info msg="connecting to shim 38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab" address="unix:///run/containerd/s/d619ff8443277e55d94983df1d6105069b59a696877a65fde8f4ffe8c44c00ee" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:41.788189 containerd[1508]: 2025-12-12 17:26:41.484 [INFO][4234] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-eth0 coredns-66bc5c9577- kube-system 5c7d7af0-8037-467a-96ff-594249aec230 817 0 2025-12-12 17:25:56 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-1-a1e622265d coredns-66bc5c9577-88wmf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4a2c90f28d0 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" Namespace="kube-system" Pod="coredns-66bc5c9577-88wmf" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-" Dec 12 17:26:41.788189 
containerd[1508]: 2025-12-12 17:26:41.484 [INFO][4234] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" Namespace="kube-system" Pod="coredns-66bc5c9577-88wmf" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-eth0" Dec 12 17:26:41.788189 containerd[1508]: 2025-12-12 17:26:41.636 [INFO][4286] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" HandleID="k8s-pod-network.05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" Workload="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-eth0" Dec 12 17:26:41.788373 containerd[1508]: 2025-12-12 17:26:41.637 [INFO][4286] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" HandleID="k8s-pod-network.05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" Workload="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001207b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-1-a1e622265d", "pod":"coredns-66bc5c9577-88wmf", "timestamp":"2025-12-12 17:26:41.636654755 +0000 UTC"}, Hostname:"ci-4459-2-2-1-a1e622265d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:41.788373 containerd[1508]: 2025-12-12 17:26:41.637 [INFO][4286] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:41.788373 containerd[1508]: 2025-12-12 17:26:41.659 [INFO][4286] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:26:41.788373 containerd[1508]: 2025-12-12 17:26:41.659 [INFO][4286] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-1-a1e622265d' Dec 12 17:26:41.788373 containerd[1508]: 2025-12-12 17:26:41.680 [INFO][4286] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.788373 containerd[1508]: 2025-12-12 17:26:41.696 [INFO][4286] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.788373 containerd[1508]: 2025-12-12 17:26:41.704 [INFO][4286] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.788373 containerd[1508]: 2025-12-12 17:26:41.707 [INFO][4286] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.788373 containerd[1508]: 2025-12-12 17:26:41.711 [INFO][4286] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.788576 containerd[1508]: 2025-12-12 17:26:41.711 [INFO][4286] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.788576 containerd[1508]: 2025-12-12 17:26:41.715 [INFO][4286] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240 Dec 12 17:26:41.788576 containerd[1508]: 2025-12-12 17:26:41.725 [INFO][4286] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.788576 containerd[1508]: 2025-12-12 17:26:41.739 [INFO][4286] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.72.4/26] block=192.168.72.0/26 handle="k8s-pod-network.05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.788576 containerd[1508]: 2025-12-12 17:26:41.739 [INFO][4286] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.4/26] handle="k8s-pod-network.05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.788576 containerd[1508]: 2025-12-12 17:26:41.739 [INFO][4286] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:26:41.788576 containerd[1508]: 2025-12-12 17:26:41.739 [INFO][4286] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.4/26] IPv6=[] ContainerID="05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" HandleID="k8s-pod-network.05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" Workload="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-eth0" Dec 12 17:26:41.789776 containerd[1508]: 2025-12-12 17:26:41.752 [INFO][4234] cni-plugin/k8s.go 418: Populated endpoint ContainerID="05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" Namespace="kube-system" Pod="coredns-66bc5c9577-88wmf" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"5c7d7af0-8037-467a-96ff-594249aec230", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 25, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"", Pod:"coredns-66bc5c9577-88wmf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4a2c90f28d0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:41.789776 containerd[1508]: 2025-12-12 17:26:41.752 [INFO][4234] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.4/32] ContainerID="05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" Namespace="kube-system" Pod="coredns-66bc5c9577-88wmf" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-eth0" Dec 12 17:26:41.789776 containerd[1508]: 2025-12-12 17:26:41.752 [INFO][4234] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4a2c90f28d0 
ContainerID="05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" Namespace="kube-system" Pod="coredns-66bc5c9577-88wmf" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-eth0" Dec 12 17:26:41.789776 containerd[1508]: 2025-12-12 17:26:41.761 [INFO][4234] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" Namespace="kube-system" Pod="coredns-66bc5c9577-88wmf" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-eth0" Dec 12 17:26:41.789776 containerd[1508]: 2025-12-12 17:26:41.765 [INFO][4234] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" Namespace="kube-system" Pod="coredns-66bc5c9577-88wmf" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"5c7d7af0-8037-467a-96ff-594249aec230", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 25, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", 
ContainerID:"05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240", Pod:"coredns-66bc5c9577-88wmf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4a2c90f28d0", MAC:"3a:62:de:7a:ad:f4", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:41.789969 containerd[1508]: 2025-12-12 17:26:41.778 [INFO][4234] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" Namespace="kube-system" Pod="coredns-66bc5c9577-88wmf" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--88wmf-eth0" Dec 12 17:26:41.840860 systemd[1]: Started cri-containerd-38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab.scope - libcontainer container 38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab. 
Dec 12 17:26:41.850907 containerd[1508]: time="2025-12-12T17:26:41.850832084Z" level=info msg="connecting to shim 05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240" address="unix:///run/containerd/s/7c9f35e900e32030e2cf5e150ce38087eae3dd24555e7efba8e2f30d084dea1c" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:41.885985 systemd-networkd[1418]: cali149b9eea634: Link UP Dec 12 17:26:41.889669 systemd-networkd[1418]: cali149b9eea634: Gained carrier Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.521 [INFO][4252] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-eth0 calico-kube-controllers-7c69bf577- calico-system 02827e28-67cb-418d-b5bc-487012be465e 816 0 2025-12-12 17:26:15 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7c69bf577 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-1-a1e622265d calico-kube-controllers-7c69bf577-f29bx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali149b9eea634 [] [] }} ContainerID="7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" Namespace="calico-system" Pod="calico-kube-controllers-7c69bf577-f29bx" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-" Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.522 [INFO][4252] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" Namespace="calico-system" Pod="calico-kube-controllers-7c69bf577-f29bx" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-eth0" Dec 12 17:26:41.914286 containerd[1508]: 
2025-12-12 17:26:41.640 [INFO][4301] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" HandleID="k8s-pod-network.7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" Workload="ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-eth0" Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.640 [INFO][4301] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" HandleID="k8s-pod-network.7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" Workload="ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000304030), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-1-a1e622265d", "pod":"calico-kube-controllers-7c69bf577-f29bx", "timestamp":"2025-12-12 17:26:41.640835567 +0000 UTC"}, Hostname:"ci-4459-2-2-1-a1e622265d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.641 [INFO][4301] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.739 [INFO][4301] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.739 [INFO][4301] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-1-a1e622265d' Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.780 [INFO][4301] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.802 [INFO][4301] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.815 [INFO][4301] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.820 [INFO][4301] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.826 [INFO][4301] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.827 [INFO][4301] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.833 [INFO][4301] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437 Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.844 [INFO][4301] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.862 [INFO][4301] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.72.5/26] block=192.168.72.0/26 handle="k8s-pod-network.7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.862 [INFO][4301] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.5/26] handle="k8s-pod-network.7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.862 [INFO][4301] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:26:41.914286 containerd[1508]: 2025-12-12 17:26:41.862 [INFO][4301] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.5/26] IPv6=[] ContainerID="7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" HandleID="k8s-pod-network.7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" Workload="ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-eth0" Dec 12 17:26:41.914855 containerd[1508]: 2025-12-12 17:26:41.867 [INFO][4252] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" Namespace="calico-system" Pod="calico-kube-controllers-7c69bf577-f29bx" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-eth0", GenerateName:"calico-kube-controllers-7c69bf577-", Namespace:"calico-system", SelfLink:"", UID:"02827e28-67cb-418d-b5bc-487012be465e", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7c69bf577", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"", Pod:"calico-kube-controllers-7c69bf577-f29bx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.72.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali149b9eea634", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:41.914855 containerd[1508]: 2025-12-12 17:26:41.867 [INFO][4252] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.5/32] ContainerID="7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" Namespace="calico-system" Pod="calico-kube-controllers-7c69bf577-f29bx" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-eth0" Dec 12 17:26:41.914855 containerd[1508]: 2025-12-12 17:26:41.867 [INFO][4252] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali149b9eea634 ContainerID="7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" Namespace="calico-system" Pod="calico-kube-controllers-7c69bf577-f29bx" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-eth0" Dec 12 17:26:41.914855 containerd[1508]: 2025-12-12 17:26:41.893 [INFO][4252] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" Namespace="calico-system" Pod="calico-kube-controllers-7c69bf577-f29bx" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-eth0" Dec 12 17:26:41.914855 containerd[1508]: 2025-12-12 17:26:41.894 [INFO][4252] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" Namespace="calico-system" Pod="calico-kube-controllers-7c69bf577-f29bx" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-eth0", GenerateName:"calico-kube-controllers-7c69bf577-", Namespace:"calico-system", SelfLink:"", UID:"02827e28-67cb-418d-b5bc-487012be465e", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7c69bf577", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437", Pod:"calico-kube-controllers-7c69bf577-f29bx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.72.5/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali149b9eea634", MAC:"d2:47:b0:99:70:a6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:41.914855 containerd[1508]: 2025-12-12 17:26:41.911 [INFO][4252] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" Namespace="calico-system" Pod="calico-kube-controllers-7c69bf577-f29bx" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--kube--controllers--7c69bf577--f29bx-eth0" Dec 12 17:26:41.931994 systemd[1]: Started cri-containerd-05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240.scope - libcontainer container 05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240. Dec 12 17:26:41.961227 containerd[1508]: time="2025-12-12T17:26:41.961182299Z" level=info msg="connecting to shim 7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437" address="unix:///run/containerd/s/a7020c9a6df7c7537e5074c426ccf6b0c31c7cb78c82900f7588f628eedb4203" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:41.982524 systemd-networkd[1418]: cali9c527f03beb: Link UP Dec 12 17:26:41.984414 systemd-networkd[1418]: cali9c527f03beb: Gained carrier Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.508 [INFO][4245] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-eth0 coredns-66bc5c9577- kube-system f8cc5a88-2710-4948-9628-460c49c0e518 809 0 2025-12-12 17:25:56 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-1-a1e622265d coredns-66bc5c9577-xw9gv eth0 
coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9c527f03beb [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" Namespace="kube-system" Pod="coredns-66bc5c9577-xw9gv" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-" Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.508 [INFO][4245] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" Namespace="kube-system" Pod="coredns-66bc5c9577-xw9gv" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-eth0" Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.656 [INFO][4295] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" HandleID="k8s-pod-network.04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" Workload="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-eth0" Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.657 [INFO][4295] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" HandleID="k8s-pod-network.04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" Workload="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d950), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-1-a1e622265d", "pod":"coredns-66bc5c9577-xw9gv", "timestamp":"2025-12-12 17:26:41.656081654 +0000 UTC"}, Hostname:"ci-4459-2-2-1-a1e622265d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.657 [INFO][4295] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.862 [INFO][4295] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.862 [INFO][4295] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-1-a1e622265d' Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.884 [INFO][4295] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.900 [INFO][4295] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.921 [INFO][4295] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.932 [INFO][4295] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.941 [INFO][4295] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.941 [INFO][4295] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.944 [INFO][4295] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5 Dec 12 17:26:42.038724 
containerd[1508]: 2025-12-12 17:26:41.954 [INFO][4295] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.968 [INFO][4295] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.72.6/26] block=192.168.72.0/26 handle="k8s-pod-network.04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.968 [INFO][4295] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.6/26] handle="k8s-pod-network.04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.969 [INFO][4295] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:26:42.038724 containerd[1508]: 2025-12-12 17:26:41.969 [INFO][4295] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.6/26] IPv6=[] ContainerID="04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" HandleID="k8s-pod-network.04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" Workload="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-eth0" Dec 12 17:26:42.039297 containerd[1508]: 2025-12-12 17:26:41.972 [INFO][4245] cni-plugin/k8s.go 418: Populated endpoint ContainerID="04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" Namespace="kube-system" Pod="coredns-66bc5c9577-xw9gv" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", 
UID:"f8cc5a88-2710-4948-9628-460c49c0e518", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 25, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"", Pod:"coredns-66bc5c9577-xw9gv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9c527f03beb", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:42.039297 containerd[1508]: 2025-12-12 17:26:41.973 [INFO][4245] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.6/32] 
ContainerID="04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" Namespace="kube-system" Pod="coredns-66bc5c9577-xw9gv" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-eth0" Dec 12 17:26:42.039297 containerd[1508]: 2025-12-12 17:26:41.974 [INFO][4245] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9c527f03beb ContainerID="04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" Namespace="kube-system" Pod="coredns-66bc5c9577-xw9gv" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-eth0" Dec 12 17:26:42.039297 containerd[1508]: 2025-12-12 17:26:41.980 [INFO][4245] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" Namespace="kube-system" Pod="coredns-66bc5c9577-xw9gv" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-eth0" Dec 12 17:26:42.039297 containerd[1508]: 2025-12-12 17:26:41.984 [INFO][4245] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" Namespace="kube-system" Pod="coredns-66bc5c9577-xw9gv" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"f8cc5a88-2710-4948-9628-460c49c0e518", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 25, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5", Pod:"coredns-66bc5c9577-xw9gv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9c527f03beb", MAC:"3e:4f:6c:8d:e8:93", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:42.039470 containerd[1508]: 2025-12-12 17:26:42.016 [INFO][4245] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" Namespace="kube-system" Pod="coredns-66bc5c9577-xw9gv" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-coredns--66bc5c9577--xw9gv-eth0" Dec 12 17:26:42.052894 containerd[1508]: time="2025-12-12T17:26:42.052850217Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9d8f9566-tg4kr,Uid:7c9ca7d5-6db7-4a38-8560-e94f3e1a1488,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"38011ad2c3e5d40921890a885d6fb99bc99934e9e48050f5b1849343f66cc3ab\"" Dec 12 17:26:42.060345 containerd[1508]: time="2025-12-12T17:26:42.060298120Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:26:42.076888 systemd[1]: Started cri-containerd-7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437.scope - libcontainer container 7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437. Dec 12 17:26:42.103682 containerd[1508]: time="2025-12-12T17:26:42.103637372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-88wmf,Uid:5c7d7af0-8037-467a-96ff-594249aec230,Namespace:kube-system,Attempt:0,} returns sandbox id \"05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240\"" Dec 12 17:26:42.110669 containerd[1508]: time="2025-12-12T17:26:42.110254992Z" level=info msg="connecting to shim 04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5" address="unix:///run/containerd/s/81cb4f722084d24619c3a1ade944e44a1c58562f89c90735a58729ce65c9ac49" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:42.113213 containerd[1508]: time="2025-12-12T17:26:42.113176161Z" level=info msg="CreateContainer within sandbox \"05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 17:26:42.145812 containerd[1508]: time="2025-12-12T17:26:42.145728501Z" level=info msg="Container e43d29ddaabf6841180712fd7c6c785fd2a3f32a1c8951caff578f593ab3a65c: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:42.158959 systemd[1]: Started cri-containerd-04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5.scope - libcontainer container 04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5. 
Dec 12 17:26:42.162740 containerd[1508]: time="2025-12-12T17:26:42.162668112Z" level=info msg="CreateContainer within sandbox \"05354aaeece0da3f7ddc5ba7ffdc810e565cc864aafc6a73f5428a79ff419240\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e43d29ddaabf6841180712fd7c6c785fd2a3f32a1c8951caff578f593ab3a65c\"" Dec 12 17:26:42.163647 containerd[1508]: time="2025-12-12T17:26:42.163591635Z" level=info msg="StartContainer for \"e43d29ddaabf6841180712fd7c6c785fd2a3f32a1c8951caff578f593ab3a65c\"" Dec 12 17:26:42.166912 containerd[1508]: time="2025-12-12T17:26:42.166761365Z" level=info msg="connecting to shim e43d29ddaabf6841180712fd7c6c785fd2a3f32a1c8951caff578f593ab3a65c" address="unix:///run/containerd/s/7c9f35e900e32030e2cf5e150ce38087eae3dd24555e7efba8e2f30d084dea1c" protocol=ttrpc version=3 Dec 12 17:26:42.182220 containerd[1508]: time="2025-12-12T17:26:42.181826971Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c69bf577-f29bx,Uid:02827e28-67cb-418d-b5bc-487012be465e,Namespace:calico-system,Attempt:0,} returns sandbox id \"7b5015135562d79601971c9cbd7e8cf38b937b24ecfc14e36926341d3ebbc437\"" Dec 12 17:26:42.202887 systemd[1]: Started cri-containerd-e43d29ddaabf6841180712fd7c6c785fd2a3f32a1c8951caff578f593ab3a65c.scope - libcontainer container e43d29ddaabf6841180712fd7c6c785fd2a3f32a1c8951caff578f593ab3a65c. 
Dec 12 17:26:42.232180 containerd[1508]: time="2025-12-12T17:26:42.231980364Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-xw9gv,Uid:f8cc5a88-2710-4948-9628-460c49c0e518,Namespace:kube-system,Attempt:0,} returns sandbox id \"04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5\"" Dec 12 17:26:42.240074 containerd[1508]: time="2025-12-12T17:26:42.239294466Z" level=info msg="CreateContainer within sandbox \"04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 17:26:42.253167 containerd[1508]: time="2025-12-12T17:26:42.253121908Z" level=info msg="Container df338e09889b3e7fef489cd9f2994dcc48c38c9130ede80490bbdf6d3e4874b4: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:26:42.257658 containerd[1508]: time="2025-12-12T17:26:42.257608282Z" level=info msg="StartContainer for \"e43d29ddaabf6841180712fd7c6c785fd2a3f32a1c8951caff578f593ab3a65c\" returns successfully" Dec 12 17:26:42.263782 containerd[1508]: time="2025-12-12T17:26:42.262941938Z" level=info msg="CreateContainer within sandbox \"04a48f2f21c6dc82ba88c2daeae6ba56e883f6d5e332c48c8f9470cdd0fc09f5\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"df338e09889b3e7fef489cd9f2994dcc48c38c9130ede80490bbdf6d3e4874b4\"" Dec 12 17:26:42.264639 containerd[1508]: time="2025-12-12T17:26:42.264602103Z" level=info msg="StartContainer for \"df338e09889b3e7fef489cd9f2994dcc48c38c9130ede80490bbdf6d3e4874b4\"" Dec 12 17:26:42.277687 containerd[1508]: time="2025-12-12T17:26:42.277212822Z" level=info msg="connecting to shim df338e09889b3e7fef489cd9f2994dcc48c38c9130ede80490bbdf6d3e4874b4" address="unix:///run/containerd/s/81cb4f722084d24619c3a1ade944e44a1c58562f89c90735a58729ce65c9ac49" protocol=ttrpc version=3 Dec 12 17:26:42.302253 systemd[1]: Started cri-containerd-df338e09889b3e7fef489cd9f2994dcc48c38c9130ede80490bbdf6d3e4874b4.scope - libcontainer container 
df338e09889b3e7fef489cd9f2994dcc48c38c9130ede80490bbdf6d3e4874b4. Dec 12 17:26:42.358196 containerd[1508]: time="2025-12-12T17:26:42.358145949Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-58gsb,Uid:ddab63ad-c74e-4d1e-9071-4b51ff05d58f,Namespace:calico-system,Attempt:0,}" Dec 12 17:26:42.361697 containerd[1508]: time="2025-12-12T17:26:42.361660319Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9d8f9566-clxs9,Uid:b8ca6b0c-01ec-442d-b526-fb52b3677452,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:26:42.364516 containerd[1508]: time="2025-12-12T17:26:42.364481088Z" level=info msg="StartContainer for \"df338e09889b3e7fef489cd9f2994dcc48c38c9130ede80490bbdf6d3e4874b4\" returns successfully" Dec 12 17:26:42.371782 systemd-networkd[1418]: calif435fcd0075: Gained IPv6LL Dec 12 17:26:42.407139 containerd[1508]: time="2025-12-12T17:26:42.406992858Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:42.420231 containerd[1508]: time="2025-12-12T17:26:42.420172818Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:26:42.422163 containerd[1508]: time="2025-12-12T17:26:42.420283338Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:26:42.422251 kubelet[2774]: E1212 17:26:42.420848 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 
12 17:26:42.422251 kubelet[2774]: E1212 17:26:42.420971 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:42.422251 kubelet[2774]: E1212 17:26:42.421497 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-b9d8f9566-tg4kr_calico-apiserver(7c9ca7d5-6db7-4a38-8560-e94f3e1a1488): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:42.422251 kubelet[2774]: E1212 17:26:42.421541 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:26:42.423577 containerd[1508]: time="2025-12-12T17:26:42.423073027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:26:42.633937 systemd-networkd[1418]: calicdd625d00c2: Link UP Dec 12 17:26:42.639221 systemd-networkd[1418]: calicdd625d00c2: Gained carrier Dec 12 17:26:42.639679 kubelet[2774]: E1212 17:26:42.639602 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:26:42.657738 kubelet[2774]: E1212 17:26:42.657601 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.453 [INFO][4595] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-eth0 calico-apiserver-b9d8f9566- calico-apiserver b8ca6b0c-01ec-442d-b526-fb52b3677452 818 0 2025-12-12 17:26:06 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b9d8f9566 
projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-1-a1e622265d calico-apiserver-b9d8f9566-clxs9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calicdd625d00c2 [] [] }} ContainerID="e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-clxs9" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-" Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.454 [INFO][4595] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-clxs9" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-eth0" Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.515 [INFO][4619] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" HandleID="k8s-pod-network.e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" Workload="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-eth0" Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.515 [INFO][4619] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" HandleID="k8s-pod-network.e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" Workload="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400022f950), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-1-a1e622265d", "pod":"calico-apiserver-b9d8f9566-clxs9", "timestamp":"2025-12-12 17:26:42.515071787 +0000 UTC"}, 
Hostname:"ci-4459-2-2-1-a1e622265d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.515 [INFO][4619] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.515 [INFO][4619] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.515 [INFO][4619] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-1-a1e622265d' Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.543 [INFO][4619] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.558 [INFO][4619] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.571 [INFO][4619] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.575 [INFO][4619] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.583 [INFO][4619] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.584 [INFO][4619] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.587 [INFO][4619] 
ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8 Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.604 [INFO][4619] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.615 [INFO][4619] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.72.7/26] block=192.168.72.0/26 handle="k8s-pod-network.e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.615 [INFO][4619] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.7/26] handle="k8s-pod-network.e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.615 [INFO][4619] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 17:26:42.678378 containerd[1508]: 2025-12-12 17:26:42.615 [INFO][4619] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.7/26] IPv6=[] ContainerID="e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" HandleID="k8s-pod-network.e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" Workload="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-eth0" Dec 12 17:26:42.678967 containerd[1508]: 2025-12-12 17:26:42.619 [INFO][4595] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-clxs9" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-eth0", GenerateName:"calico-apiserver-b9d8f9566-", Namespace:"calico-apiserver", SelfLink:"", UID:"b8ca6b0c-01ec-442d-b526-fb52b3677452", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b9d8f9566", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"", Pod:"calico-apiserver-b9d8f9566-clxs9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.72.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicdd625d00c2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:42.678967 containerd[1508]: 2025-12-12 17:26:42.620 [INFO][4595] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.7/32] ContainerID="e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-clxs9" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-eth0" Dec 12 17:26:42.678967 containerd[1508]: 2025-12-12 17:26:42.620 [INFO][4595] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicdd625d00c2 ContainerID="e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-clxs9" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-eth0" Dec 12 17:26:42.678967 containerd[1508]: 2025-12-12 17:26:42.642 [INFO][4595] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-clxs9" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-eth0" Dec 12 17:26:42.678967 containerd[1508]: 2025-12-12 17:26:42.649 [INFO][4595] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-clxs9" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-eth0", GenerateName:"calico-apiserver-b9d8f9566-", Namespace:"calico-apiserver", SelfLink:"", UID:"b8ca6b0c-01ec-442d-b526-fb52b3677452", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b9d8f9566", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8", Pod:"calico-apiserver-b9d8f9566-clxs9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicdd625d00c2", MAC:"26:fb:12:a3:f8:ea", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:42.678967 containerd[1508]: 2025-12-12 17:26:42.670 [INFO][4595] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" Namespace="calico-apiserver" Pod="calico-apiserver-b9d8f9566-clxs9" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-calico--apiserver--b9d8f9566--clxs9-eth0" Dec 12 17:26:42.730396 kubelet[2774]: I1212 17:26:42.730221 2774 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-xw9gv" podStartSLOduration=46.730202723 podStartE2EDuration="46.730202723s" podCreationTimestamp="2025-12-12 17:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:26:42.730006643 +0000 UTC m=+51.521188711" watchObservedRunningTime="2025-12-12 17:26:42.730202723 +0000 UTC m=+51.521384591" Dec 12 17:26:42.741220 containerd[1508]: time="2025-12-12T17:26:42.741077077Z" level=info msg="connecting to shim e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8" address="unix:///run/containerd/s/bc657a5c74f693b2b2aba58f313c813a33819287bba571f72114240386705e6c" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:42.794855 systemd[1]: Started cri-containerd-e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8.scope - libcontainer container e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8. 
Dec 12 17:26:42.799499 containerd[1508]: time="2025-12-12T17:26:42.799391135Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:42.801215 containerd[1508]: time="2025-12-12T17:26:42.801173060Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:26:42.801538 containerd[1508]: time="2025-12-12T17:26:42.801378941Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:26:42.802026 kubelet[2774]: E1212 17:26:42.801977 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:26:42.802307 kubelet[2774]: E1212 17:26:42.802282 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:26:42.803304 kubelet[2774]: E1212 17:26:42.802803 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-7c69bf577-f29bx_calico-system(02827e28-67cb-418d-b5bc-487012be465e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:42.803304 kubelet[2774]: E1212 17:26:42.803254 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:26:42.844681 kubelet[2774]: I1212 17:26:42.839708 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-88wmf" podStartSLOduration=46.839692457 podStartE2EDuration="46.839692457s" podCreationTimestamp="2025-12-12 17:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:26:42.805748714 +0000 UTC m=+51.596930542" watchObservedRunningTime="2025-12-12 17:26:42.839692457 +0000 UTC m=+51.630874365" Dec 12 17:26:42.857326 systemd-networkd[1418]: cali4c565c737cd: Link UP Dec 12 17:26:42.859252 systemd-networkd[1418]: cali4c565c737cd: Gained carrier Dec 12 17:26:42.884794 systemd-networkd[1418]: cali4a2c90f28d0: Gained IPv6LL Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.534 [INFO][4593] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-eth0 goldmane-7c778bb748- calico-system ddab63ad-c74e-4d1e-9071-4b51ff05d58f 815 0 2025-12-12 17:26:12 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane 
pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-1-a1e622265d goldmane-7c778bb748-58gsb eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali4c565c737cd [] [] }} ContainerID="5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" Namespace="calico-system" Pod="goldmane-7c778bb748-58gsb" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-" Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.534 [INFO][4593] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" Namespace="calico-system" Pod="goldmane-7c778bb748-58gsb" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-eth0" Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.606 [INFO][4634] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" HandleID="k8s-pod-network.5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" Workload="ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-eth0" Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.607 [INFO][4634] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" HandleID="k8s-pod-network.5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" Workload="ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3810), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-1-a1e622265d", "pod":"goldmane-7c778bb748-58gsb", "timestamp":"2025-12-12 17:26:42.606578586 +0000 UTC"}, Hostname:"ci-4459-2-2-1-a1e622265d", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.607 [INFO][4634] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.616 [INFO][4634] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.616 [INFO][4634] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-1-a1e622265d' Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.673 [INFO][4634] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.701 [INFO][4634] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.731 [INFO][4634] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.742 [INFO][4634] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.779 [INFO][4634] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.779 [INFO][4634] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.802 [INFO][4634] ipam/ipam.go 1780: Creating new handle: 
k8s-pod-network.5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19 Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.821 [INFO][4634] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.845 [INFO][4634] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.72.8/26] block=192.168.72.0/26 handle="k8s-pod-network.5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.845 [INFO][4634] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.8/26] handle="k8s-pod-network.5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" host="ci-4459-2-2-1-a1e622265d" Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.845 [INFO][4634] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 17:26:42.886633 containerd[1508]: 2025-12-12 17:26:42.845 [INFO][4634] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.72.8/26] IPv6=[] ContainerID="5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" HandleID="k8s-pod-network.5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" Workload="ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-eth0" Dec 12 17:26:42.887137 containerd[1508]: 2025-12-12 17:26:42.849 [INFO][4593] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" Namespace="calico-system" Pod="goldmane-7c778bb748-58gsb" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"ddab63ad-c74e-4d1e-9071-4b51ff05d58f", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"", Pod:"goldmane-7c778bb748-58gsb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.72.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali4c565c737cd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:42.887137 containerd[1508]: 2025-12-12 17:26:42.850 [INFO][4593] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.8/32] ContainerID="5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" Namespace="calico-system" Pod="goldmane-7c778bb748-58gsb" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-eth0" Dec 12 17:26:42.887137 containerd[1508]: 2025-12-12 17:26:42.850 [INFO][4593] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4c565c737cd ContainerID="5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" Namespace="calico-system" Pod="goldmane-7c778bb748-58gsb" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-eth0" Dec 12 17:26:42.887137 containerd[1508]: 2025-12-12 17:26:42.857 [INFO][4593] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" Namespace="calico-system" Pod="goldmane-7c778bb748-58gsb" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-eth0" Dec 12 17:26:42.887137 containerd[1508]: 2025-12-12 17:26:42.857 [INFO][4593] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" Namespace="calico-system" Pod="goldmane-7c778bb748-58gsb" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-eth0", GenerateName:"goldmane-7c778bb748-", 
Namespace:"calico-system", SelfLink:"", UID:"ddab63ad-c74e-4d1e-9071-4b51ff05d58f", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 26, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-1-a1e622265d", ContainerID:"5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19", Pod:"goldmane-7c778bb748-58gsb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.72.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali4c565c737cd", MAC:"6a:e6:70:ce:c9:9d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:26:42.887137 containerd[1508]: 2025-12-12 17:26:42.874 [INFO][4593] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" Namespace="calico-system" Pod="goldmane-7c778bb748-58gsb" WorkloadEndpoint="ci--4459--2--2--1--a1e622265d-k8s-goldmane--7c778bb748--58gsb-eth0" Dec 12 17:26:42.931304 containerd[1508]: time="2025-12-12T17:26:42.931146816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b9d8f9566-clxs9,Uid:b8ca6b0c-01ec-442d-b526-fb52b3677452,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e0c6598a2ead4c8fa1d229bd3b7b01e8fde58665190f0b4acec5ebb9e3e81bb8\"" Dec 12 17:26:42.935389 
containerd[1508]: time="2025-12-12T17:26:42.935313429Z" level=info msg="connecting to shim 5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19" address="unix:///run/containerd/s/afb773479a032cfe3849330771ad33ffa04175baa265ecbb4f21e3e36de44042" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:26:42.937224 containerd[1508]: time="2025-12-12T17:26:42.937183395Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:26:42.978075 systemd[1]: Started cri-containerd-5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19.scope - libcontainer container 5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19. Dec 12 17:26:43.053820 containerd[1508]: time="2025-12-12T17:26:43.053760631Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-58gsb,Uid:ddab63ad-c74e-4d1e-9071-4b51ff05d58f,Namespace:calico-system,Attempt:0,} returns sandbox id \"5de365c9a373f7b46767d26006823cab44b5d141235a8bc0d16fe00b0573bd19\"" Dec 12 17:26:43.315061 containerd[1508]: time="2025-12-12T17:26:43.314889753Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:43.316800 containerd[1508]: time="2025-12-12T17:26:43.316712438Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:26:43.317107 containerd[1508]: time="2025-12-12T17:26:43.316762598Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:26:43.317689 kubelet[2774]: E1212 17:26:43.317331 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve 
reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:43.317689 kubelet[2774]: E1212 17:26:43.317383 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:43.317689 kubelet[2774]: E1212 17:26:43.317552 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-b9d8f9566-clxs9_calico-apiserver(b8ca6b0c-01ec-442d-b526-fb52b3677452): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:43.318580 kubelet[2774]: E1212 17:26:43.317644 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:26:43.319187 containerd[1508]: time="2025-12-12T17:26:43.319123286Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:26:43.647153 kubelet[2774]: I1212 17:26:43.646593 2774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 17:26:43.650883 systemd-networkd[1418]: cali99da4534265: 
Gained IPv6LL Dec 12 17:26:43.651271 systemd-networkd[1418]: cali9c527f03beb: Gained IPv6LL Dec 12 17:26:43.659886 containerd[1508]: time="2025-12-12T17:26:43.659686211Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:43.673780 containerd[1508]: time="2025-12-12T17:26:43.662724420Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:26:43.675963 containerd[1508]: time="2025-12-12T17:26:43.675682740Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:26:43.676291 kubelet[2774]: E1212 17:26:43.676154 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:26:43.676291 kubelet[2774]: E1212 17:26:43.676224 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:26:43.676638 kubelet[2774]: E1212 17:26:43.676516 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-58gsb_calico-system(ddab63ad-c74e-4d1e-9071-4b51ff05d58f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": 
failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:43.676696 kubelet[2774]: E1212 17:26:43.676665 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:26:43.688645 kubelet[2774]: E1212 17:26:43.687024 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:26:43.688645 kubelet[2774]: E1212 17:26:43.687477 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 
17:26:43.691744 kubelet[2774]: E1212 17:26:43.691682 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:26:43.842920 systemd-networkd[1418]: cali149b9eea634: Gained IPv6LL Dec 12 17:26:44.098992 systemd-networkd[1418]: cali4c565c737cd: Gained IPv6LL Dec 12 17:26:44.611966 systemd-networkd[1418]: calicdd625d00c2: Gained IPv6LL Dec 12 17:26:44.688954 kubelet[2774]: E1212 17:26:44.688914 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:26:44.692775 kubelet[2774]: E1212 17:26:44.692670 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:26:47.357637 containerd[1508]: time="2025-12-12T17:26:47.357191924Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:26:47.902925 containerd[1508]: time="2025-12-12T17:26:47.902813594Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:47.905537 containerd[1508]: time="2025-12-12T17:26:47.905381162Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:26:47.905537 containerd[1508]: time="2025-12-12T17:26:47.905493603Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:26:47.906104 kubelet[2774]: E1212 17:26:47.905995 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:26:47.907648 kubelet[2774]: E1212 17:26:47.906440 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:26:47.907854 kubelet[2774]: E1212 17:26:47.907824 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod 
whisker-fd8bd9d8f-pgxdr_calico-system(73777a70-bcad-484a-aa29-795f5b22b302): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:47.909435 containerd[1508]: time="2025-12-12T17:26:47.909404095Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:26:48.272282 containerd[1508]: time="2025-12-12T17:26:48.272231557Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:48.275066 containerd[1508]: time="2025-12-12T17:26:48.274991646Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:26:48.275865 containerd[1508]: time="2025-12-12T17:26:48.275026366Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:26:48.275898 kubelet[2774]: E1212 17:26:48.275235 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:26:48.275898 kubelet[2774]: E1212 17:26:48.275278 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:26:48.275898 kubelet[2774]: E1212 17:26:48.275340 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-fd8bd9d8f-pgxdr_calico-system(73777a70-bcad-484a-aa29-795f5b22b302): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:48.276118 kubelet[2774]: E1212 17:26:48.275379 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:26:55.358738 containerd[1508]: time="2025-12-12T17:26:55.358648501Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:26:55.703138 containerd[1508]: time="2025-12-12T17:26:55.702864299Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:55.704464 containerd[1508]: time="2025-12-12T17:26:55.704410504Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:26:55.704701 containerd[1508]: time="2025-12-12T17:26:55.704419104Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:26:55.704883 kubelet[2774]: E1212 17:26:55.704843 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:55.704883 kubelet[2774]: E1212 17:26:55.704893 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:55.705355 kubelet[2774]: E1212 17:26:55.705012 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-b9d8f9566-tg4kr_calico-apiserver(7c9ca7d5-6db7-4a38-8560-e94f3e1a1488): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:55.705355 kubelet[2774]: E1212 17:26:55.705046 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:26:56.356646 containerd[1508]: time="2025-12-12T17:26:56.356126024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:26:56.713277 containerd[1508]: time="2025-12-12T17:26:56.713231188Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:56.714934 containerd[1508]: time="2025-12-12T17:26:56.714877954Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:26:56.715061 containerd[1508]: time="2025-12-12T17:26:56.714991914Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:26:56.715321 kubelet[2774]: E1212 17:26:56.715275 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:56.715610 kubelet[2774]: E1212 17:26:56.715337 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:26:56.715675 kubelet[2774]: E1212 17:26:56.715611 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-b9d8f9566-clxs9_calico-apiserver(b8ca6b0c-01ec-442d-b526-fb52b3677452): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:56.715711 kubelet[2774]: E1212 17:26:56.715674 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:26:57.357670 containerd[1508]: time="2025-12-12T17:26:57.357156291Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:26:57.715600 containerd[1508]: time="2025-12-12T17:26:57.715530183Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:57.717111 containerd[1508]: time="2025-12-12T17:26:57.717038548Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:26:57.717245 containerd[1508]: time="2025-12-12T17:26:57.717162188Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:26:57.717500 kubelet[2774]: E1212 17:26:57.717401 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:26:57.717500 kubelet[2774]: E1212 17:26:57.717469 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:26:57.719498 kubelet[2774]: E1212 17:26:57.717601 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-nnwtm_calico-system(df83329f-2747-4a89-9a6a-7b20123df2cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:57.720999 containerd[1508]: time="2025-12-12T17:26:57.720116198Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:26:58.092927 containerd[1508]: time="2025-12-12T17:26:58.092790098Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:58.094690 containerd[1508]: time="2025-12-12T17:26:58.094599224Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:26:58.094690 containerd[1508]: time="2025-12-12T17:26:58.094657144Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:26:58.095129 kubelet[2774]: E1212 17:26:58.095070 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:26:58.095311 kubelet[2774]: E1212 17:26:58.095235 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:26:58.095422 kubelet[2774]: E1212 17:26:58.095403 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-nnwtm_calico-system(df83329f-2747-4a89-9a6a-7b20123df2cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:58.095591 kubelet[2774]: E1212 17:26:58.095470 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: 
code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:26:58.359542 containerd[1508]: time="2025-12-12T17:26:58.359019292Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:26:58.359998 kubelet[2774]: E1212 17:26:58.359935 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:26:58.714135 containerd[1508]: time="2025-12-12T17:26:58.714079657Z" level=info msg="fetch failed after status: 
404 Not Found" host=ghcr.io Dec 12 17:26:58.715947 containerd[1508]: time="2025-12-12T17:26:58.715876783Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:26:58.716289 containerd[1508]: time="2025-12-12T17:26:58.715991623Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:26:58.717984 kubelet[2774]: E1212 17:26:58.717799 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:26:58.717984 kubelet[2774]: E1212 17:26:58.717855 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:26:58.718326 kubelet[2774]: E1212 17:26:58.718066 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-58gsb_calico-system(ddab63ad-c74e-4d1e-9071-4b51ff05d58f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:58.718326 kubelet[2774]: E1212 17:26:58.718110 2774 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:26:58.719245 containerd[1508]: time="2025-12-12T17:26:58.719212274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:26:59.054966 containerd[1508]: time="2025-12-12T17:26:59.054658176Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:26:59.056520 containerd[1508]: time="2025-12-12T17:26:59.056376381Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:26:59.056520 containerd[1508]: time="2025-12-12T17:26:59.056440301Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:26:59.056924 kubelet[2774]: E1212 17:26:59.056861 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:26:59.057113 kubelet[2774]: E1212 17:26:59.057040 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:26:59.057228 kubelet[2774]: E1212 17:26:59.057208 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-7c69bf577-f29bx_calico-system(02827e28-67cb-418d-b5bc-487012be465e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:26:59.057413 kubelet[2774]: E1212 17:26:59.057364 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:27:06.356834 kubelet[2774]: E1212 17:27:06.356765 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:27:08.356155 kubelet[2774]: E1212 17:27:08.355695 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:27:10.356351 kubelet[2774]: E1212 17:27:10.356121 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:27:10.360465 kubelet[2774]: E1212 17:27:10.360403 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:27:13.366645 containerd[1508]: time="2025-12-12T17:27:13.365832959Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:27:13.367058 kubelet[2774]: E1212 17:27:13.366284 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:27:13.691642 containerd[1508]: time="2025-12-12T17:27:13.691428111Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:13.693666 containerd[1508]: time="2025-12-12T17:27:13.693392158Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:27:13.693666 containerd[1508]: time="2025-12-12T17:27:13.693430678Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 
17:27:13.693933 kubelet[2774]: E1212 17:27:13.693834 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:27:13.693985 kubelet[2774]: E1212 17:27:13.693903 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:27:13.695189 kubelet[2774]: E1212 17:27:13.694072 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-fd8bd9d8f-pgxdr_calico-system(73777a70-bcad-484a-aa29-795f5b22b302): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:13.697075 containerd[1508]: time="2025-12-12T17:27:13.696789370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:27:14.028230 containerd[1508]: time="2025-12-12T17:27:14.028093581Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:14.030123 containerd[1508]: time="2025-12-12T17:27:14.030028828Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:27:14.030270 containerd[1508]: time="2025-12-12T17:27:14.030186628Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:27:14.030957 kubelet[2774]: E1212 17:27:14.030896 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:27:14.031066 kubelet[2774]: E1212 17:27:14.030979 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:27:14.031191 kubelet[2774]: E1212 17:27:14.031118 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-fd8bd9d8f-pgxdr_calico-system(73777a70-bcad-484a-aa29-795f5b22b302): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:14.031236 kubelet[2774]: E1212 17:27:14.031198 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:27:17.375306 containerd[1508]: time="2025-12-12T17:27:17.375252020Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:27:17.715586 containerd[1508]: time="2025-12-12T17:27:17.715508111Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:17.717733 containerd[1508]: time="2025-12-12T17:27:17.717671838Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:27:17.717904 containerd[1508]: time="2025-12-12T17:27:17.717782078Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:17.718558 kubelet[2774]: E1212 17:27:17.718514 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:17.720974 kubelet[2774]: E1212 17:27:17.720685 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:17.720974 kubelet[2774]: E1212 17:27:17.720886 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-b9d8f9566-tg4kr_calico-apiserver(7c9ca7d5-6db7-4a38-8560-e94f3e1a1488): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:17.720974 kubelet[2774]: E1212 17:27:17.720926 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:27:21.361386 containerd[1508]: time="2025-12-12T17:27:21.360906096Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:27:21.695237 containerd[1508]: time="2025-12-12T17:27:21.695153934Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:21.697014 containerd[1508]: time="2025-12-12T17:27:21.696781820Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:27:21.697014 containerd[1508]: time="2025-12-12T17:27:21.696844220Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:21.697211 kubelet[2774]: E1212 17:27:21.697060 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:21.697211 kubelet[2774]: E1212 17:27:21.697111 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:27:21.697657 kubelet[2774]: E1212 17:27:21.697247 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-b9d8f9566-clxs9_calico-apiserver(b8ca6b0c-01ec-442d-b526-fb52b3677452): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:21.697657 kubelet[2774]: E1212 17:27:21.697280 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:27:23.356949 containerd[1508]: time="2025-12-12T17:27:23.356607937Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:27:23.698971 containerd[1508]: time="2025-12-12T17:27:23.698766006Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:23.700427 containerd[1508]: time="2025-12-12T17:27:23.700308291Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:27:23.700427 containerd[1508]: time="2025-12-12T17:27:23.700360091Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:27:23.702834 kubelet[2774]: E1212 17:27:23.702787 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:27:23.703841 kubelet[2774]: E1212 17:27:23.703236 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:27:23.703841 kubelet[2774]: E1212 17:27:23.703339 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod 
csi-node-driver-nnwtm_calico-system(df83329f-2747-4a89-9a6a-7b20123df2cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:23.704681 containerd[1508]: time="2025-12-12T17:27:23.704373945Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:27:24.050074 containerd[1508]: time="2025-12-12T17:27:24.049952986Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:24.051637 containerd[1508]: time="2025-12-12T17:27:24.051518751Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:27:24.052558 containerd[1508]: time="2025-12-12T17:27:24.051846393Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:27:24.052666 kubelet[2774]: E1212 17:27:24.051988 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:27:24.052666 kubelet[2774]: E1212 17:27:24.052033 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:27:24.052666 kubelet[2774]: E1212 17:27:24.052096 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-nnwtm_calico-system(df83329f-2747-4a89-9a6a-7b20123df2cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:24.052807 kubelet[2774]: E1212 17:27:24.052135 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:27:24.356058 containerd[1508]: time="2025-12-12T17:27:24.355578769Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:27:24.709350 containerd[1508]: time="2025-12-12T17:27:24.709291880Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:27:24.710899 
containerd[1508]: time="2025-12-12T17:27:24.710751525Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:27:24.710899 containerd[1508]: time="2025-12-12T17:27:24.710870725Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:27:24.711139 kubelet[2774]: E1212 17:27:24.711037 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:27:24.711139 kubelet[2774]: E1212 17:27:24.711097 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:27:24.713066 kubelet[2774]: E1212 17:27:24.711196 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-7c69bf577-f29bx_calico-system(02827e28-67cb-418d-b5bc-487012be465e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:24.713066 kubelet[2774]: E1212 17:27:24.711235 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:27:26.357940 kubelet[2774]: E1212 17:27:26.357880 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:27:27.358005 containerd[1508]: time="2025-12-12T17:27:27.357852513Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:27:27.698116 containerd[1508]: time="2025-12-12T17:27:27.697834500Z" level=info msg="fetch failed after status: 404 Not Found" 
host=ghcr.io Dec 12 17:27:27.699597 containerd[1508]: time="2025-12-12T17:27:27.699437706Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:27:27.699597 containerd[1508]: time="2025-12-12T17:27:27.699554146Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:27:27.700253 kubelet[2774]: E1212 17:27:27.700205 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:27:27.701534 kubelet[2774]: E1212 17:27:27.700795 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:27:27.702241 kubelet[2774]: E1212 17:27:27.702212 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-58gsb_calico-system(ddab63ad-c74e-4d1e-9071-4b51ff05d58f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:27:27.702500 kubelet[2774]: E1212 17:27:27.702427 2774 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:27:32.356984 kubelet[2774]: E1212 17:27:32.356908 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:27:36.358449 kubelet[2774]: E1212 17:27:36.357319 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:27:36.359766 kubelet[2774]: E1212 17:27:36.357946 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:27:36.360164 kubelet[2774]: E1212 17:27:36.360128 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:27:39.357657 kubelet[2774]: E1212 17:27:39.357580 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:27:39.368545 kubelet[2774]: E1212 17:27:39.368497 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:27:45.362646 kubelet[2774]: E1212 17:27:45.362134 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:27:49.356086 kubelet[2774]: E1212 17:27:49.355993 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:27:50.358566 kubelet[2774]: E1212 17:27:50.358487 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:27:51.357674 kubelet[2774]: E1212 17:27:51.356647 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": 
failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:27:51.971289 systemd[1]: Started sshd@8-23.88.120.93:22-182.52.236.80:59338.service - OpenSSH per-connection server daemon (182.52.236.80:59338). Dec 12 17:27:53.358205 kubelet[2774]: E1212 17:27:53.358136 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:27:53.361875 kubelet[2774]: E1212 17:27:53.361808 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" 
pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302"
Dec 12 17:27:54.178653 sshd[4901]: Invalid user support from 182.52.236.80 port 59338
Dec 12 17:27:54.845108 sshd-session[4904]: pam_faillock(sshd:auth): User unknown
Dec 12 17:27:54.850338 sshd[4901]: Postponed keyboard-interactive for invalid user support from 182.52.236.80 port 59338 ssh2 [preauth]
Dec 12 17:27:55.382134 sshd-session[4904]: pam_unix(sshd:auth): check pass; user unknown
Dec 12 17:27:55.382186 sshd-session[4904]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=182.52.236.80
Dec 12 17:27:55.382954 sshd-session[4904]: pam_faillock(sshd:auth): User unknown
Dec 12 17:27:57.608762 sshd[4901]: PAM: Permission denied for illegal user support from 182.52.236.80
Dec 12 17:27:57.609707 sshd[4901]: Failed keyboard-interactive/pam for invalid user support from 182.52.236.80 port 59338 ssh2
Dec 12 17:27:58.177489 sshd[4901]: Connection closed by invalid user support 182.52.236.80 port 59338 [preauth]
Dec 12 17:27:58.181730 systemd[1]: sshd@8-23.88.120.93:22-182.52.236.80:59338.service: Deactivated successfully.
Dec 12 17:28:00.357756 containerd[1508]: time="2025-12-12T17:28:00.357674628Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:28:00.359768 kubelet[2774]: E1212 17:28:00.359688 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:28:00.702654 containerd[1508]: time="2025-12-12T17:28:00.702481263Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:00.704652 containerd[1508]: time="2025-12-12T17:28:00.704420030Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:28:00.704652 containerd[1508]: time="2025-12-12T17:28:00.704469230Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:28:00.705315 kubelet[2774]: E1212 17:28:00.705190 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 
17:28:00.705315 kubelet[2774]: E1212 17:28:00.705280 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:28:00.705793 kubelet[2774]: E1212 17:28:00.705729 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-b9d8f9566-tg4kr_calico-apiserver(7c9ca7d5-6db7-4a38-8560-e94f3e1a1488): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:00.705793 kubelet[2774]: E1212 17:28:00.705770 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:28:02.356137 containerd[1508]: time="2025-12-12T17:28:02.356068871Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:28:02.694029 containerd[1508]: time="2025-12-12T17:28:02.693965603Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:02.695578 containerd[1508]: time="2025-12-12T17:28:02.695506128Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:28:02.695759 containerd[1508]: time="2025-12-12T17:28:02.695636809Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:28:02.696328 kubelet[2774]: E1212 17:28:02.696020 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:28:02.696328 kubelet[2774]: E1212 17:28:02.696092 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:28:02.696328 kubelet[2774]: E1212 17:28:02.696213 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-b9d8f9566-clxs9_calico-apiserver(b8ca6b0c-01ec-442d-b526-fb52b3677452): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:02.696328 kubelet[2774]: E1212 17:28:02.696269 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:28:04.357308 containerd[1508]: time="2025-12-12T17:28:04.356747928Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:28:04.692613 containerd[1508]: time="2025-12-12T17:28:04.692302693Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:04.701458 containerd[1508]: time="2025-12-12T17:28:04.701242045Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:28:04.701458 containerd[1508]: time="2025-12-12T17:28:04.701304285Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:28:04.701883 kubelet[2774]: E1212 17:28:04.701841 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:28:04.702253 kubelet[2774]: E1212 17:28:04.701897 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:28:04.708592 kubelet[2774]: E1212 17:28:04.701987 2774 
kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-nnwtm_calico-system(df83329f-2747-4a89-9a6a-7b20123df2cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:04.711969 containerd[1508]: time="2025-12-12T17:28:04.711925123Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:28:05.058400 containerd[1508]: time="2025-12-12T17:28:05.058215646Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:05.060356 containerd[1508]: time="2025-12-12T17:28:05.060268933Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:28:05.060539 containerd[1508]: time="2025-12-12T17:28:05.060411934Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:28:05.060965 kubelet[2774]: E1212 17:28:05.060885 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:28:05.060965 kubelet[2774]: E1212 17:28:05.060956 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc 
= failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:28:05.061243 kubelet[2774]: E1212 17:28:05.061043 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-nnwtm_calico-system(df83329f-2747-4a89-9a6a-7b20123df2cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:05.061243 kubelet[2774]: E1212 17:28:05.061092 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:28:07.356432 kubelet[2774]: E1212 17:28:07.356360 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:28:08.356487 containerd[1508]: time="2025-12-12T17:28:08.356199572Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:28:08.897678 containerd[1508]: time="2025-12-12T17:28:08.897387838Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:08.900081 containerd[1508]: time="2025-12-12T17:28:08.899951127Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:28:08.900081 containerd[1508]: time="2025-12-12T17:28:08.900002767Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:28:08.900271 kubelet[2774]: E1212 17:28:08.900232 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:28:08.900587 kubelet[2774]: E1212 17:28:08.900293 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:28:08.900587 kubelet[2774]: E1212 17:28:08.900375 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-fd8bd9d8f-pgxdr_calico-system(73777a70-bcad-484a-aa29-795f5b22b302): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:08.902703 containerd[1508]: time="2025-12-12T17:28:08.902608536Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:28:09.253862 containerd[1508]: time="2025-12-12T17:28:09.253799199Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:09.255579 containerd[1508]: time="2025-12-12T17:28:09.255512085Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:28:09.256533 containerd[1508]: time="2025-12-12T17:28:09.255548805Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:28:09.256774 kubelet[2774]: E1212 17:28:09.256723 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:28:09.257068 
kubelet[2774]: E1212 17:28:09.256883 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:28:09.257068 kubelet[2774]: E1212 17:28:09.256973 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-fd8bd9d8f-pgxdr_calico-system(73777a70-bcad-484a-aa29-795f5b22b302): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:09.257068 kubelet[2774]: E1212 17:28:09.257016 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:28:14.357344 kubelet[2774]: E1212 17:28:14.356991 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:28:14.358015 containerd[1508]: time="2025-12-12T17:28:14.357465202Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:28:14.705646 containerd[1508]: time="2025-12-12T17:28:14.705554855Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:14.707245 containerd[1508]: time="2025-12-12T17:28:14.707089460Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:28:14.707245 containerd[1508]: time="2025-12-12T17:28:14.707244901Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:28:14.707491 kubelet[2774]: E1212 17:28:14.707440 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:28:14.707566 kubelet[2774]: E1212 17:28:14.707501 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:28:14.707647 kubelet[2774]: E1212 17:28:14.707594 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-7c69bf577-f29bx_calico-system(02827e28-67cb-418d-b5bc-487012be465e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:14.707777 kubelet[2774]: E1212 17:28:14.707694 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:28:15.363936 kubelet[2774]: E1212 17:28:15.363880 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" 
podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:28:16.357107 kubelet[2774]: E1212 17:28:16.357053 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:28:21.364058 kubelet[2774]: E1212 17:28:21.363999 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:28:22.359350 containerd[1508]: time="2025-12-12T17:28:22.359307203Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:28:22.709649 containerd[1508]: time="2025-12-12T17:28:22.708927944Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:28:22.710800 containerd[1508]: time="2025-12-12T17:28:22.710738551Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:28:22.711029 containerd[1508]: time="2025-12-12T17:28:22.710817311Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:28:22.711222 kubelet[2774]: E1212 17:28:22.711183 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:28:22.711508 kubelet[2774]: E1212 17:28:22.711234 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:28:22.711656 kubelet[2774]: E1212 17:28:22.711575 2774 kuberuntime_manager.go:1449] 
"Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-58gsb_calico-system(ddab63ad-c74e-4d1e-9071-4b51ff05d58f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:28:22.711969 kubelet[2774]: E1212 17:28:22.711922 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:28:26.356614 kubelet[2774]: E1212 17:28:26.356391 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:28:26.486055 systemd[1]: Started sshd@9-23.88.120.93:22-139.178.89.65:47002.service - OpenSSH per-connection server daemon (139.178.89.65:47002). 
Dec 12 17:28:27.357415 kubelet[2774]: E1212 17:28:27.356782 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:28:27.501504 sshd[4967]: Accepted publickey for core from 139.178.89.65 port 47002 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:28:27.505037 sshd-session[4967]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:27.511670 systemd-logind[1487]: New session 8 of user core. Dec 12 17:28:27.517844 systemd[1]: Started session-8.scope - Session 8 of User core. 
Dec 12 17:28:28.358016 kubelet[2774]: E1212 17:28:28.357764 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:28:28.358016 kubelet[2774]: E1212 17:28:28.357917 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:28:28.369652 sshd[4972]: Connection closed by 139.178.89.65 port 47002 Dec 12 17:28:28.370830 sshd-session[4967]: pam_unix(sshd:session): session closed for user core Dec 12 17:28:28.377227 systemd[1]: sshd@9-23.88.120.93:22-139.178.89.65:47002.service: Deactivated successfully. 
Dec 12 17:28:28.382972 systemd[1]: session-8.scope: Deactivated successfully. Dec 12 17:28:28.385966 systemd-logind[1487]: Session 8 logged out. Waiting for processes to exit. Dec 12 17:28:28.394046 systemd-logind[1487]: Removed session 8. Dec 12 17:28:33.538216 systemd[1]: Started sshd@10-23.88.120.93:22-139.178.89.65:38798.service - OpenSSH per-connection server daemon (139.178.89.65:38798). Dec 12 17:28:34.515151 sshd[4988]: Accepted publickey for core from 139.178.89.65 port 38798 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:28:34.518477 sshd-session[4988]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:34.527927 systemd-logind[1487]: New session 9 of user core. Dec 12 17:28:34.533917 systemd[1]: Started session-9.scope - Session 9 of User core. Dec 12 17:28:35.333286 sshd[4992]: Connection closed by 139.178.89.65 port 38798 Dec 12 17:28:35.332530 sshd-session[4988]: pam_unix(sshd:session): session closed for user core Dec 12 17:28:35.339921 systemd[1]: sshd@10-23.88.120.93:22-139.178.89.65:38798.service: Deactivated successfully. Dec 12 17:28:35.345599 systemd[1]: session-9.scope: Deactivated successfully. Dec 12 17:28:35.349493 systemd-logind[1487]: Session 9 logged out. Waiting for processes to exit. Dec 12 17:28:35.352229 systemd-logind[1487]: Removed session 9. 
Dec 12 17:28:35.362302 kubelet[2774]: E1212 17:28:35.362092 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:28:35.504103 systemd[1]: Started sshd@11-23.88.120.93:22-139.178.89.65:38812.service - OpenSSH per-connection server daemon (139.178.89.65:38812). 
Dec 12 17:28:36.358143 kubelet[2774]: E1212 17:28:36.357600 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:28:36.498914 sshd[5005]: Accepted publickey for core from 139.178.89.65 port 38812 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:28:36.501988 sshd-session[5005]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:36.511542 systemd-logind[1487]: New session 10 of user core. Dec 12 17:28:36.526107 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 12 17:28:37.329795 sshd[5008]: Connection closed by 139.178.89.65 port 38812 Dec 12 17:28:37.330528 sshd-session[5005]: pam_unix(sshd:session): session closed for user core Dec 12 17:28:37.338909 systemd-logind[1487]: Session 10 logged out. Waiting for processes to exit. Dec 12 17:28:37.339158 systemd[1]: sshd@11-23.88.120.93:22-139.178.89.65:38812.service: Deactivated successfully. Dec 12 17:28:37.344278 systemd[1]: session-10.scope: Deactivated successfully. Dec 12 17:28:37.348802 systemd-logind[1487]: Removed session 10. Dec 12 17:28:37.498870 systemd[1]: Started sshd@12-23.88.120.93:22-139.178.89.65:38818.service - OpenSSH per-connection server daemon (139.178.89.65:38818). 
Dec 12 17:28:38.355069 kubelet[2774]: E1212 17:28:38.355010 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:28:38.507669 sshd[5018]: Accepted publickey for core from 139.178.89.65 port 38818 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:28:38.510545 sshd-session[5018]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:38.519280 systemd-logind[1487]: New session 11 of user core. Dec 12 17:28:38.524157 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 12 17:28:39.312156 sshd[5021]: Connection closed by 139.178.89.65 port 38818 Dec 12 17:28:39.312844 sshd-session[5018]: pam_unix(sshd:session): session closed for user core Dec 12 17:28:39.316937 systemd-logind[1487]: Session 11 logged out. Waiting for processes to exit. Dec 12 17:28:39.317290 systemd[1]: sshd@12-23.88.120.93:22-139.178.89.65:38818.service: Deactivated successfully. Dec 12 17:28:39.323435 systemd[1]: session-11.scope: Deactivated successfully. Dec 12 17:28:39.329326 systemd-logind[1487]: Removed session 11. 
Dec 12 17:28:42.355701 kubelet[2774]: E1212 17:28:42.355587 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:28:43.361666 kubelet[2774]: E1212 17:28:43.361509 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:28:43.363083 kubelet[2774]: E1212 17:28:43.363034 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:28:44.479160 systemd[1]: Started sshd@13-23.88.120.93:22-139.178.89.65:38518.service - OpenSSH per-connection server daemon (139.178.89.65:38518). Dec 12 17:28:45.478092 sshd[5060]: Accepted publickey for core from 139.178.89.65 port 38518 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:28:45.480164 sshd-session[5060]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:45.486678 systemd-logind[1487]: New session 12 of user core. Dec 12 17:28:45.492132 systemd[1]: Started session-12.scope - Session 12 of User core. Dec 12 17:28:46.242660 sshd[5063]: Connection closed by 139.178.89.65 port 38518 Dec 12 17:28:46.243460 sshd-session[5060]: pam_unix(sshd:session): session closed for user core Dec 12 17:28:46.248473 systemd[1]: sshd@13-23.88.120.93:22-139.178.89.65:38518.service: Deactivated successfully. Dec 12 17:28:46.252283 systemd[1]: session-12.scope: Deactivated successfully. Dec 12 17:28:46.253810 systemd-logind[1487]: Session 12 logged out. Waiting for processes to exit. Dec 12 17:28:46.257533 systemd-logind[1487]: Removed session 12. 
Dec 12 17:28:47.357452 kubelet[2774]: E1212 17:28:47.357264 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:28:49.356477 kubelet[2774]: E1212 17:28:49.356414 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:28:51.357380 kubelet[2774]: E1212 17:28:51.356828 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:28:51.414955 systemd[1]: Started sshd@14-23.88.120.93:22-139.178.89.65:32818.service - OpenSSH per-connection server daemon (139.178.89.65:32818). Dec 12 17:28:52.420766 sshd[5077]: Accepted publickey for core from 139.178.89.65 port 32818 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:28:52.423546 sshd-session[5077]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:52.432771 systemd-logind[1487]: New session 13 of user core. Dec 12 17:28:52.437923 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 12 17:28:53.198241 sshd[5080]: Connection closed by 139.178.89.65 port 32818 Dec 12 17:28:53.199849 sshd-session[5077]: pam_unix(sshd:session): session closed for user core Dec 12 17:28:53.205458 systemd-logind[1487]: Session 13 logged out. Waiting for processes to exit. Dec 12 17:28:53.207268 systemd[1]: sshd@14-23.88.120.93:22-139.178.89.65:32818.service: Deactivated successfully. Dec 12 17:28:53.212199 systemd[1]: session-13.scope: Deactivated successfully. Dec 12 17:28:53.216910 systemd-logind[1487]: Removed session 13. 
Dec 12 17:28:56.355535 kubelet[2774]: E1212 17:28:56.355420 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:28:56.356362 kubelet[2774]: E1212 17:28:56.355919 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:28:58.359765 kubelet[2774]: E1212 17:28:58.359694 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:28:58.366113 systemd[1]: Started sshd@15-23.88.120.93:22-139.178.89.65:32832.service - OpenSSH per-connection server daemon (139.178.89.65:32832). Dec 12 17:28:59.342942 sshd[5094]: Accepted publickey for core from 139.178.89.65 port 32832 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:28:59.345477 sshd-session[5094]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:28:59.354419 systemd-logind[1487]: New session 14 of user core. Dec 12 17:28:59.359861 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 12 17:28:59.369340 kubelet[2774]: E1212 17:28:59.368798 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" 
podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:29:00.092551 sshd[5097]: Connection closed by 139.178.89.65 port 32832 Dec 12 17:29:00.094027 sshd-session[5094]: pam_unix(sshd:session): session closed for user core Dec 12 17:29:00.100131 systemd[1]: sshd@15-23.88.120.93:22-139.178.89.65:32832.service: Deactivated successfully. Dec 12 17:29:00.103635 systemd[1]: session-14.scope: Deactivated successfully. Dec 12 17:29:00.109439 systemd-logind[1487]: Session 14 logged out. Waiting for processes to exit. Dec 12 17:29:00.111007 systemd-logind[1487]: Removed session 14. Dec 12 17:29:00.261727 systemd[1]: Started sshd@16-23.88.120.93:22-139.178.89.65:32842.service - OpenSSH per-connection server daemon (139.178.89.65:32842). Dec 12 17:29:01.234772 sshd[5109]: Accepted publickey for core from 139.178.89.65 port 32842 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:29:01.236894 sshd-session[5109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:29:01.245538 systemd-logind[1487]: New session 15 of user core. Dec 12 17:29:01.251872 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 12 17:29:02.137442 sshd[5112]: Connection closed by 139.178.89.65 port 32842 Dec 12 17:29:02.140638 sshd-session[5109]: pam_unix(sshd:session): session closed for user core Dec 12 17:29:02.146485 systemd[1]: sshd@16-23.88.120.93:22-139.178.89.65:32842.service: Deactivated successfully. Dec 12 17:29:02.150598 systemd[1]: session-15.scope: Deactivated successfully. Dec 12 17:29:02.153106 systemd-logind[1487]: Session 15 logged out. Waiting for processes to exit. Dec 12 17:29:02.155169 systemd-logind[1487]: Removed session 15. Dec 12 17:29:02.304493 systemd[1]: Started sshd@17-23.88.120.93:22-139.178.89.65:38872.service - OpenSSH per-connection server daemon (139.178.89.65:38872). 
Dec 12 17:29:03.293700 sshd[5122]: Accepted publickey for core from 139.178.89.65 port 38872 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:29:03.295336 sshd-session[5122]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:29:03.304297 systemd-logind[1487]: New session 16 of user core. Dec 12 17:29:03.310896 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 12 17:29:03.355481 kubelet[2774]: E1212 17:29:03.355421 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:29:04.355543 kubelet[2774]: E1212 17:29:04.355474 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:29:04.903590 sshd[5125]: Connection closed by 139.178.89.65 port 38872 Dec 12 17:29:04.903433 sshd-session[5122]: pam_unix(sshd:session): session closed for user core Dec 12 17:29:04.910482 systemd-logind[1487]: Session 16 logged out. Waiting for processes to exit. 
Dec 12 17:29:04.911089 systemd[1]: sshd@17-23.88.120.93:22-139.178.89.65:38872.service: Deactivated successfully. Dec 12 17:29:04.916470 systemd[1]: session-16.scope: Deactivated successfully. Dec 12 17:29:04.918969 systemd-logind[1487]: Removed session 16. Dec 12 17:29:05.076304 systemd[1]: Started sshd@18-23.88.120.93:22-139.178.89.65:38886.service - OpenSSH per-connection server daemon (139.178.89.65:38886). Dec 12 17:29:06.102990 sshd[5140]: Accepted publickey for core from 139.178.89.65 port 38886 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:29:06.105230 sshd-session[5140]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:29:06.119056 systemd-logind[1487]: New session 17 of user core. Dec 12 17:29:06.124874 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 12 17:29:07.062822 sshd[5145]: Connection closed by 139.178.89.65 port 38886 Dec 12 17:29:07.063856 sshd-session[5140]: pam_unix(sshd:session): session closed for user core Dec 12 17:29:07.072386 systemd[1]: sshd@18-23.88.120.93:22-139.178.89.65:38886.service: Deactivated successfully. Dec 12 17:29:07.078561 systemd[1]: session-17.scope: Deactivated successfully. Dec 12 17:29:07.081877 systemd-logind[1487]: Session 17 logged out. Waiting for processes to exit. Dec 12 17:29:07.087429 systemd-logind[1487]: Removed session 17. Dec 12 17:29:07.236273 systemd[1]: Started sshd@19-23.88.120.93:22-139.178.89.65:38888.service - OpenSSH per-connection server daemon (139.178.89.65:38888). 
Dec 12 17:29:07.357567 kubelet[2774]: E1212 17:29:07.355979 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:29:08.225799 sshd[5155]: Accepted publickey for core from 139.178.89.65 port 38888 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:29:08.228653 sshd-session[5155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:29:08.234140 systemd-logind[1487]: New session 18 of user core. Dec 12 17:29:08.240887 systemd[1]: Started session-18.scope - Session 18 of User core. Dec 12 17:29:08.360390 kubelet[2774]: E1212 17:29:08.359211 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:29:08.994274 sshd[5158]: Connection closed by 139.178.89.65 port 38888 Dec 12 17:29:08.996426 sshd-session[5155]: pam_unix(sshd:session): session closed for user core Dec 12 17:29:09.001475 systemd-logind[1487]: Session 18 logged out. 
Waiting for processes to exit. Dec 12 17:29:09.004057 systemd[1]: sshd@19-23.88.120.93:22-139.178.89.65:38888.service: Deactivated successfully. Dec 12 17:29:09.008607 systemd[1]: session-18.scope: Deactivated successfully. Dec 12 17:29:09.013513 systemd-logind[1487]: Removed session 18. Dec 12 17:29:10.357636 kubelet[2774]: E1212 17:29:10.357565 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:29:11.358429 kubelet[2774]: E1212 17:29:11.358137 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:29:14.173999 systemd[1]: Started sshd@20-23.88.120.93:22-139.178.89.65:43348.service - OpenSSH per-connection server daemon (139.178.89.65:43348). Dec 12 17:29:15.190690 sshd[5196]: Accepted publickey for core from 139.178.89.65 port 43348 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:29:15.192164 sshd-session[5196]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:29:15.199402 systemd-logind[1487]: New session 19 of user core. Dec 12 17:29:15.209762 systemd[1]: Started session-19.scope - Session 19 of User core. Dec 12 17:29:15.976818 sshd[5199]: Connection closed by 139.178.89.65 port 43348 Dec 12 17:29:15.977885 sshd-session[5196]: pam_unix(sshd:session): session closed for user core Dec 12 17:29:15.984137 systemd[1]: sshd@20-23.88.120.93:22-139.178.89.65:43348.service: Deactivated successfully. Dec 12 17:29:15.989132 systemd[1]: session-19.scope: Deactivated successfully. Dec 12 17:29:15.991498 systemd-logind[1487]: Session 19 logged out. Waiting for processes to exit. Dec 12 17:29:15.995711 systemd-logind[1487]: Removed session 19. 
Dec 12 17:29:16.355708 kubelet[2774]: E1212 17:29:16.355565 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:29:17.357209 kubelet[2774]: E1212 17:29:17.357152 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:29:19.357099 kubelet[2774]: E1212 17:29:19.356916 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:29:21.150053 systemd[1]: Started 
sshd@21-23.88.120.93:22-139.178.89.65:52804.service - OpenSSH per-connection server daemon (139.178.89.65:52804). Dec 12 17:29:22.149036 sshd[5217]: Accepted publickey for core from 139.178.89.65 port 52804 ssh2: RSA SHA256:iFtGnG2WH9XVjjUjszxJhaCaYvl4oOJ7+tJOMAqvDiA Dec 12 17:29:22.151679 sshd-session[5217]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:29:22.159976 systemd-logind[1487]: New session 20 of user core. Dec 12 17:29:22.164909 systemd[1]: Started session-20.scope - Session 20 of User core. Dec 12 17:29:22.958081 sshd[5220]: Connection closed by 139.178.89.65 port 52804 Dec 12 17:29:22.958598 sshd-session[5217]: pam_unix(sshd:session): session closed for user core Dec 12 17:29:22.965592 systemd[1]: sshd@21-23.88.120.93:22-139.178.89.65:52804.service: Deactivated successfully. Dec 12 17:29:22.969443 systemd[1]: session-20.scope: Deactivated successfully. Dec 12 17:29:22.973023 systemd-logind[1487]: Session 20 logged out. Waiting for processes to exit. Dec 12 17:29:22.975592 systemd-logind[1487]: Removed session 20. 
Dec 12 17:29:23.360705 containerd[1508]: time="2025-12-12T17:29:23.358863383Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:29:23.361306 kubelet[2774]: E1212 17:29:23.360296 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:29:23.365284 kubelet[2774]: E1212 17:29:23.365219 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302" Dec 12 17:29:23.693232 containerd[1508]: time="2025-12-12T17:29:23.693163956Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:23.695731 containerd[1508]: time="2025-12-12T17:29:23.694585481Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:29:23.697911 containerd[1508]: time="2025-12-12T17:29:23.694641481Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:29:23.698251 kubelet[2774]: E1212 17:29:23.698119 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:23.698251 kubelet[2774]: E1212 17:29:23.698182 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:23.698429 kubelet[2774]: E1212 17:29:23.698383 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start 
failed in pod calico-apiserver-b9d8f9566-clxs9_calico-apiserver(b8ca6b0c-01ec-442d-b526-fb52b3677452): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:23.698525 kubelet[2774]: E1212 17:29:23.698485 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452" Dec 12 17:29:27.357072 kubelet[2774]: E1212 17:29:27.356941 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f" Dec 12 17:29:30.356465 containerd[1508]: time="2025-12-12T17:29:30.356412777Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:29:30.901554 containerd[1508]: time="2025-12-12T17:29:30.901478915Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:30.903081 containerd[1508]: time="2025-12-12T17:29:30.902942000Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" 
failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:29:30.903081 containerd[1508]: time="2025-12-12T17:29:30.903007361Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:29:30.903325 kubelet[2774]: E1212 17:29:30.903247 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:30.903325 kubelet[2774]: E1212 17:29:30.903308 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:29:30.903900 kubelet[2774]: E1212 17:29:30.903431 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-b9d8f9566-tg4kr_calico-apiserver(7c9ca7d5-6db7-4a38-8560-e94f3e1a1488): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:30.903900 kubelet[2774]: E1212 17:29:30.903483 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code 
= NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488" Dec 12 17:29:31.361261 kubelet[2774]: E1212 17:29:31.360635 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c69bf577-f29bx" podUID="02827e28-67cb-418d-b5bc-487012be465e" Dec 12 17:29:35.357508 containerd[1508]: time="2025-12-12T17:29:35.357353088Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:29:35.701222 containerd[1508]: time="2025-12-12T17:29:35.701114176Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:35.702837 containerd[1508]: time="2025-12-12T17:29:35.702778022Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:29:35.702953 containerd[1508]: time="2025-12-12T17:29:35.702823742Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:29:35.703125 kubelet[2774]: E1212 17:29:35.703078 2774 log.go:32] "PullImage from image service failed" err="rpc 
error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:29:35.703443 kubelet[2774]: E1212 17:29:35.703137 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:29:35.703443 kubelet[2774]: E1212 17:29:35.703232 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-nnwtm_calico-system(df83329f-2747-4a89-9a6a-7b20123df2cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:35.710513 containerd[1508]: time="2025-12-12T17:29:35.710437089Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:29:36.031282 containerd[1508]: time="2025-12-12T17:29:36.031025253Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:29:36.033778 containerd[1508]: time="2025-12-12T17:29:36.033471422Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:29:36.033933 containerd[1508]: time="2025-12-12T17:29:36.033515702Z" level=info msg="stop 
pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:29:36.034408 kubelet[2774]: E1212 17:29:36.034295 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:29:36.034408 kubelet[2774]: E1212 17:29:36.034346 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:29:36.034570 kubelet[2774]: E1212 17:29:36.034415 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-nnwtm_calico-system(df83329f-2747-4a89-9a6a-7b20123df2cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:29:36.034570 kubelet[2774]: E1212 17:29:36.034463 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", 
failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nnwtm" podUID="df83329f-2747-4a89-9a6a-7b20123df2cf" Dec 12 17:29:36.920615 systemd[1]: cri-containerd-8ff72935087b35adc09bf37af1301f752d5e9a327ab4a48b55843118faee4fea.scope: Deactivated successfully. Dec 12 17:29:36.922848 systemd[1]: cri-containerd-8ff72935087b35adc09bf37af1301f752d5e9a327ab4a48b55843118faee4fea.scope: Consumed 43.216s CPU time, 100.5M memory peak. Dec 12 17:29:36.925363 containerd[1508]: time="2025-12-12T17:29:36.924819617Z" level=info msg="received container exit event container_id:\"8ff72935087b35adc09bf37af1301f752d5e9a327ab4a48b55843118faee4fea\" id:\"8ff72935087b35adc09bf37af1301f752d5e9a327ab4a48b55843118faee4fea\" pid:3096 exit_status:1 exited_at:{seconds:1765560576 nanos:922197648}" Dec 12 17:29:36.953586 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8ff72935087b35adc09bf37af1301f752d5e9a327ab4a48b55843118faee4fea-rootfs.mount: Deactivated successfully. Dec 12 17:29:37.159744 systemd[1]: cri-containerd-fe72a36bac981fec69246dda02f90d12531ee2fa4665edf92d6db32f72828f47.scope: Deactivated successfully. Dec 12 17:29:37.160047 systemd[1]: cri-containerd-fe72a36bac981fec69246dda02f90d12531ee2fa4665edf92d6db32f72828f47.scope: Consumed 6.916s CPU time, 64.2M memory peak, 2.3M read from disk. 
Dec 12 17:29:37.166546 containerd[1508]: time="2025-12-12T17:29:37.165198650Z" level=info msg="received container exit event container_id:\"fe72a36bac981fec69246dda02f90d12531ee2fa4665edf92d6db32f72828f47\" id:\"fe72a36bac981fec69246dda02f90d12531ee2fa4665edf92d6db32f72828f47\" pid:2596 exit_status:1 exited_at:{seconds:1765560577 nanos:159188268}" Dec 12 17:29:37.211745 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fe72a36bac981fec69246dda02f90d12531ee2fa4665edf92d6db32f72828f47-rootfs.mount: Deactivated successfully. Dec 12 17:29:37.306579 kubelet[2774]: I1212 17:29:37.306536 2774 scope.go:117] "RemoveContainer" containerID="fe72a36bac981fec69246dda02f90d12531ee2fa4665edf92d6db32f72828f47" Dec 12 17:29:37.307061 kubelet[2774]: I1212 17:29:37.306659 2774 scope.go:117] "RemoveContainer" containerID="8ff72935087b35adc09bf37af1301f752d5e9a327ab4a48b55843118faee4fea" Dec 12 17:29:37.309659 containerd[1508]: time="2025-12-12T17:29:37.309392373Z" level=info msg="CreateContainer within sandbox \"35bb6039cd1b88d45ba0f0547c86e65d4b39d8129308c49ac77ec059803a1070\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Dec 12 17:29:37.309930 containerd[1508]: time="2025-12-12T17:29:37.309894255Z" level=info msg="CreateContainer within sandbox \"0074d04f6639beb1334d27bc410aa3885e6b76f9b645f4e4007b49b114b089b6\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Dec 12 17:29:37.324510 containerd[1508]: time="2025-12-12T17:29:37.324457148Z" level=info msg="Container cdcee87d7fc38dad151b717f7e4a332c3841dfe12ede33cfa22c622b9a53b030: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:29:37.327705 containerd[1508]: time="2025-12-12T17:29:37.326831757Z" level=info msg="Container 1a9a8754655a8c7211c0c35a89001059e9a19d8dfab049797069ecd337b3531d: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:29:37.336590 containerd[1508]: time="2025-12-12T17:29:37.336540432Z" level=info msg="CreateContainer within sandbox 
\"35bb6039cd1b88d45ba0f0547c86e65d4b39d8129308c49ac77ec059803a1070\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"cdcee87d7fc38dad151b717f7e4a332c3841dfe12ede33cfa22c622b9a53b030\"" Dec 12 17:29:37.337120 containerd[1508]: time="2025-12-12T17:29:37.337091314Z" level=info msg="StartContainer for \"cdcee87d7fc38dad151b717f7e4a332c3841dfe12ede33cfa22c622b9a53b030\"" Dec 12 17:29:37.338691 containerd[1508]: time="2025-12-12T17:29:37.338612839Z" level=info msg="connecting to shim cdcee87d7fc38dad151b717f7e4a332c3841dfe12ede33cfa22c622b9a53b030" address="unix:///run/containerd/s/3291ae1379f0a86b0febe6052b090227db92aeb039eef110003af8b6194f9a8f" protocol=ttrpc version=3 Dec 12 17:29:37.342491 containerd[1508]: time="2025-12-12T17:29:37.342369533Z" level=info msg="CreateContainer within sandbox \"0074d04f6639beb1334d27bc410aa3885e6b76f9b645f4e4007b49b114b089b6\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"1a9a8754655a8c7211c0c35a89001059e9a19d8dfab049797069ecd337b3531d\"" Dec 12 17:29:37.343864 containerd[1508]: time="2025-12-12T17:29:37.343820338Z" level=info msg="StartContainer for \"1a9a8754655a8c7211c0c35a89001059e9a19d8dfab049797069ecd337b3531d\"" Dec 12 17:29:37.345665 containerd[1508]: time="2025-12-12T17:29:37.345591785Z" level=info msg="connecting to shim 1a9a8754655a8c7211c0c35a89001059e9a19d8dfab049797069ecd337b3531d" address="unix:///run/containerd/s/8a5297e54ee8ea3691ed049294a02c410e332db2e4dbd0e0215823e461a0d359" protocol=ttrpc version=3 Dec 12 17:29:37.355131 kubelet[2774]: E1212 17:29:37.354701 2774 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:37246->10.0.0.2:2379: read: connection timed out" Dec 12 17:29:37.374890 systemd[1]: Started cri-containerd-cdcee87d7fc38dad151b717f7e4a332c3841dfe12ede33cfa22c622b9a53b030.scope - libcontainer container 
cdcee87d7fc38dad151b717f7e4a332c3841dfe12ede33cfa22c622b9a53b030.
Dec 12 17:29:37.384863 systemd[1]: Started cri-containerd-1a9a8754655a8c7211c0c35a89001059e9a19d8dfab049797069ecd337b3531d.scope - libcontainer container 1a9a8754655a8c7211c0c35a89001059e9a19d8dfab049797069ecd337b3531d.
Dec 12 17:29:37.429881 containerd[1508]: time="2025-12-12T17:29:37.429812531Z" level=info msg="StartContainer for \"cdcee87d7fc38dad151b717f7e4a332c3841dfe12ede33cfa22c622b9a53b030\" returns successfully"
Dec 12 17:29:37.448776 containerd[1508]: time="2025-12-12T17:29:37.448729159Z" level=info msg="StartContainer for \"1a9a8754655a8c7211c0c35a89001059e9a19d8dfab049797069ecd337b3531d\" returns successfully"
Dec 12 17:29:38.356534 containerd[1508]: time="2025-12-12T17:29:38.356483494Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Dec 12 17:29:38.703397 containerd[1508]: time="2025-12-12T17:29:38.703277713Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:29:38.704672 containerd[1508]: time="2025-12-12T17:29:38.704588358Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Dec 12 17:29:38.704772 containerd[1508]: time="2025-12-12T17:29:38.704691398Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73"
Dec 12 17:29:38.705271 kubelet[2774]: E1212 17:29:38.705230 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 17:29:38.706380 kubelet[2774]: E1212 17:29:38.705278 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 17:29:38.706380 kubelet[2774]: E1212 17:29:38.705373 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-fd8bd9d8f-pgxdr_calico-system(73777a70-bcad-484a-aa29-795f5b22b302): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:29:38.707475 containerd[1508]: time="2025-12-12T17:29:38.707421768Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\""
Dec 12 17:29:39.046024 containerd[1508]: time="2025-12-12T17:29:39.045882277Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:29:39.047602 containerd[1508]: time="2025-12-12T17:29:39.047542123Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found"
Dec 12 17:29:39.047738 containerd[1508]: time="2025-12-12T17:29:39.047674323Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85"
Dec 12 17:29:39.047915 kubelet[2774]: E1212 17:29:39.047848 2774 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 12 17:29:39.047965 kubelet[2774]: E1212 17:29:39.047913 2774 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 12 17:29:39.048718 kubelet[2774]: E1212 17:29:39.048006 2774 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-fd8bd9d8f-pgxdr_calico-system(73777a70-bcad-484a-aa29-795f5b22b302): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:29:39.048718 kubelet[2774]: E1212 17:29:39.048107 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-fd8bd9d8f-pgxdr" podUID="73777a70-bcad-484a-aa29-795f5b22b302"
Dec 12 17:29:39.357417 kubelet[2774]: E1212 17:29:39.356878 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-clxs9" podUID="b8ca6b0c-01ec-442d-b526-fb52b3677452"
Dec 12 17:29:40.356277 kubelet[2774]: E1212 17:29:40.356218 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-58gsb" podUID="ddab63ad-c74e-4d1e-9071-4b51ff05d58f"
Dec 12 17:29:42.356514 kubelet[2774]: E1212 17:29:42.356435 2774 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-b9d8f9566-tg4kr" podUID="7c9ca7d5-6db7-4a38-8560-e94f3e1a1488"
Dec 12 17:29:43.157141 systemd[1]: cri-containerd-add69298466784047f1a9dc34d511845b3aba4295e8a4e9eadd05b6dd2d47f9a.scope: Deactivated successfully.
Dec 12 17:29:43.157478 systemd[1]: cri-containerd-add69298466784047f1a9dc34d511845b3aba4295e8a4e9eadd05b6dd2d47f9a.scope: Consumed 4.184s CPU time, 24M memory peak, 2.4M read from disk.
Dec 12 17:29:43.162123 containerd[1508]: time="2025-12-12T17:29:43.161610017Z" level=info msg="received container exit event container_id:\"add69298466784047f1a9dc34d511845b3aba4295e8a4e9eadd05b6dd2d47f9a\" id:\"add69298466784047f1a9dc34d511845b3aba4295e8a4e9eadd05b6dd2d47f9a\" pid:2636 exit_status:1 exited_at:{seconds:1765560583 nanos:161238256}"
Dec 12 17:29:43.189664 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-add69298466784047f1a9dc34d511845b3aba4295e8a4e9eadd05b6dd2d47f9a-rootfs.mount: Deactivated successfully.
Dec 12 17:29:43.335565 kubelet[2774]: I1212 17:29:43.335528 2774 scope.go:117] "RemoveContainer" containerID="add69298466784047f1a9dc34d511845b3aba4295e8a4e9eadd05b6dd2d47f9a"
Dec 12 17:29:43.338223 containerd[1508]: time="2025-12-12T17:29:43.338175738Z" level=info msg="CreateContainer within sandbox \"42c0b604f33ecdc7519e9cd5a448e884a2bc3fca453c3b7c12c8168356317e05\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Dec 12 17:29:43.353670 containerd[1508]: time="2025-12-12T17:29:43.351704227Z" level=info msg="Container 70d44396665bec0009cbab7af3a1c0e97ab6f06fbd9d91762155f3be5fbe2196: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:29:43.362137 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4164812388.mount: Deactivated successfully.
Dec 12 17:29:43.368476 containerd[1508]: time="2025-12-12T17:29:43.368384688Z" level=info msg="CreateContainer within sandbox \"42c0b604f33ecdc7519e9cd5a448e884a2bc3fca453c3b7c12c8168356317e05\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"70d44396665bec0009cbab7af3a1c0e97ab6f06fbd9d91762155f3be5fbe2196\""
Dec 12 17:29:43.369267 containerd[1508]: time="2025-12-12T17:29:43.369219411Z" level=info msg="StartContainer for \"70d44396665bec0009cbab7af3a1c0e97ab6f06fbd9d91762155f3be5fbe2196\""
Dec 12 17:29:43.371278 containerd[1508]: time="2025-12-12T17:29:43.371197178Z" level=info msg="connecting to shim 70d44396665bec0009cbab7af3a1c0e97ab6f06fbd9d91762155f3be5fbe2196" address="unix:///run/containerd/s/3aed40cb7be7b94f9fbf27f470f02e8a522115d370928d4be0d34330977059f7" protocol=ttrpc version=3
Dec 12 17:29:43.398129 systemd[1]: Started cri-containerd-70d44396665bec0009cbab7af3a1c0e97ab6f06fbd9d91762155f3be5fbe2196.scope - libcontainer container 70d44396665bec0009cbab7af3a1c0e97ab6f06fbd9d91762155f3be5fbe2196.