Dec 12 17:39:35.168794 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Dec 12 17:39:35.168812 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Dec 12 15:20:48 -00 2025
Dec 12 17:39:35.168819 kernel: KASLR enabled
Dec 12 17:39:35.168823 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Dec 12 17:39:35.168826 kernel: printk: legacy bootconsole [pl11] enabled
Dec 12 17:39:35.168831 kernel: efi: EFI v2.7 by EDK II
Dec 12 17:39:35.168836 kernel: efi: ACPI 2.0=0x3f979018 SMBIOS=0x3f8a0000 SMBIOS 3.0=0x3f880000 MEMATTR=0x3e89c018 RNG=0x3f979998 MEMRESERVE=0x3db7d598
Dec 12 17:39:35.168840 kernel: random: crng init done
Dec 12 17:39:35.168844 kernel: secureboot: Secure boot disabled
Dec 12 17:39:35.168848 kernel: ACPI: Early table checksum verification disabled
Dec 12 17:39:35.168852 kernel: ACPI: RSDP 0x000000003F979018 000024 (v02 VRTUAL)
Dec 12 17:39:35.168856 kernel: ACPI: XSDT 0x000000003F979F18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:39:35.168860 kernel: ACPI: FACP 0x000000003F979C18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:39:35.168864 kernel: ACPI: DSDT 0x000000003F95A018 01E046 (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Dec 12 17:39:35.168870 kernel: ACPI: DBG2 0x000000003F979B18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:39:35.168874 kernel: ACPI: GTDT 0x000000003F979D98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:39:35.168878 kernel: ACPI: OEM0 0x000000003F979098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:39:35.168882 kernel: ACPI: SPCR 0x000000003F979A98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:39:35.168887 kernel: ACPI: APIC 0x000000003F979818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:39:35.168892 kernel: ACPI: SRAT 0x000000003F979198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:39:35.168896 kernel: ACPI: PPTT 0x000000003F979418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Dec 12 17:39:35.168900 kernel: ACPI: BGRT 0x000000003F979E98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:39:35.168904 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Dec 12 17:39:35.168908 kernel: ACPI: Use ACPI SPCR as default console: Yes
Dec 12 17:39:35.168913 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Dec 12 17:39:35.168917 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Dec 12 17:39:35.168921 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Dec 12 17:39:35.168925 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Dec 12 17:39:35.168930 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Dec 12 17:39:35.168934 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Dec 12 17:39:35.168939 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Dec 12 17:39:35.168943 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Dec 12 17:39:35.168947 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Dec 12 17:39:35.168952 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Dec 12 17:39:35.168956 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Dec 12 17:39:35.168960 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Dec 12 17:39:35.168964 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Dec 12 17:39:35.168969 kernel: NODE_DATA(0) allocated [mem 0x1bf7ffa00-0x1bf806fff]
Dec 12 17:39:35.168973 kernel: Zone ranges:
Dec 12 17:39:35.168977 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Dec 12 17:39:35.168984 kernel: DMA32 empty
Dec 12 17:39:35.168989 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Dec 12 17:39:35.168993 kernel: Device empty
Dec 12 17:39:35.168998 kernel: Movable zone start for each node
Dec 12 17:39:35.169002 kernel: Early memory node ranges
Dec 12 17:39:35.169006 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Dec 12 17:39:35.169012 kernel: node 0: [mem 0x0000000000824000-0x000000003f38ffff]
Dec 12 17:39:35.169016 kernel: node 0: [mem 0x000000003f390000-0x000000003f93ffff]
Dec 12 17:39:35.169020 kernel: node 0: [mem 0x000000003f940000-0x000000003f9effff]
Dec 12 17:39:35.169025 kernel: node 0: [mem 0x000000003f9f0000-0x000000003fdeffff]
Dec 12 17:39:35.169029 kernel: node 0: [mem 0x000000003fdf0000-0x000000003fffffff]
Dec 12 17:39:35.169033 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Dec 12 17:39:35.169038 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Dec 12 17:39:35.169042 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Dec 12 17:39:35.169046 kernel: cma: Reserved 16 MiB at 0x000000003ca00000 on node -1
Dec 12 17:39:35.169051 kernel: psci: probing for conduit method from ACPI.
Dec 12 17:39:35.169055 kernel: psci: PSCIv1.3 detected in firmware.
Dec 12 17:39:35.169060 kernel: psci: Using standard PSCI v0.2 function IDs
Dec 12 17:39:35.169065 kernel: psci: MIGRATE_INFO_TYPE not supported.
Dec 12 17:39:35.169069 kernel: psci: SMC Calling Convention v1.4
Dec 12 17:39:35.169073 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Dec 12 17:39:35.169078 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Dec 12 17:39:35.169082 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Dec 12 17:39:35.169087 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Dec 12 17:39:35.169091 kernel: pcpu-alloc: [0] 0 [0] 1
Dec 12 17:39:35.169095 kernel: Detected PIPT I-cache on CPU0
Dec 12 17:39:35.169100 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Dec 12 17:39:35.169104 kernel: CPU features: detected: GIC system register CPU interface
Dec 12 17:39:35.169109 kernel: CPU features: detected: Spectre-v4
Dec 12 17:39:35.169113 kernel: CPU features: detected: Spectre-BHB
Dec 12 17:39:35.169118 kernel: CPU features: kernel page table isolation forced ON by KASLR
Dec 12 17:39:35.169123 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Dec 12 17:39:35.169127 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Dec 12 17:39:35.169132 kernel: CPU features: detected: SSBS not fully self-synchronizing
Dec 12 17:39:35.169136 kernel: alternatives: applying boot alternatives
Dec 12 17:39:35.169141 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52
Dec 12 17:39:35.169146 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 12 17:39:35.169150 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 12 17:39:35.169155 kernel: Fallback order for Node 0: 0
Dec 12 17:39:35.169159 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Dec 12 17:39:35.169164 kernel: Policy zone: Normal
Dec 12 17:39:35.169169 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 12 17:39:35.169183 kernel: software IO TLB: area num 2.
Dec 12 17:39:35.169187 kernel: software IO TLB: mapped [mem 0x0000000035900000-0x0000000039900000] (64MB)
Dec 12 17:39:35.169192 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Dec 12 17:39:35.169196 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 12 17:39:35.169201 kernel: rcu: RCU event tracing is enabled.
Dec 12 17:39:35.169206 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Dec 12 17:39:35.169210 kernel: Trampoline variant of Tasks RCU enabled.
Dec 12 17:39:35.169215 kernel: Tracing variant of Tasks RCU enabled.
Dec 12 17:39:35.169219 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 12 17:39:35.169224 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Dec 12 17:39:35.169229 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 12 17:39:35.169234 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 12 17:39:35.169238 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Dec 12 17:39:35.169243 kernel: GICv3: 960 SPIs implemented
Dec 12 17:39:35.169247 kernel: GICv3: 0 Extended SPIs implemented
Dec 12 17:39:35.169251 kernel: Root IRQ handler: gic_handle_irq
Dec 12 17:39:35.169256 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Dec 12 17:39:35.169260 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Dec 12 17:39:35.169264 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Dec 12 17:39:35.169269 kernel: ITS: No ITS available, not enabling LPIs
Dec 12 17:39:35.169273 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 12 17:39:35.169279 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Dec 12 17:39:35.169283 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 12 17:39:35.169288 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Dec 12 17:39:35.169292 kernel: Console: colour dummy device 80x25
Dec 12 17:39:35.169297 kernel: printk: legacy console [tty1] enabled
Dec 12 17:39:35.169302 kernel: ACPI: Core revision 20240827
Dec 12 17:39:35.169306 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Dec 12 17:39:35.169311 kernel: pid_max: default: 32768 minimum: 301
Dec 12 17:39:35.169316 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 12 17:39:35.169320 kernel: landlock: Up and running.
Dec 12 17:39:35.169325 kernel: SELinux: Initializing.
Dec 12 17:39:35.169330 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Dec 12 17:39:35.169335 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Dec 12 17:39:35.169339 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0xa0000e, misc 0x31e1
Dec 12 17:39:35.169344 kernel: Hyper-V: Host Build 10.0.26102.1172-1-0
Dec 12 17:39:35.169352 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Dec 12 17:39:35.169358 kernel: rcu: Hierarchical SRCU implementation.
Dec 12 17:39:35.169363 kernel: rcu: Max phase no-delay instances is 400.
Dec 12 17:39:35.169368 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 12 17:39:35.169372 kernel: Remapping and enabling EFI services.
Dec 12 17:39:35.169377 kernel: smp: Bringing up secondary CPUs ...
Dec 12 17:39:35.169382 kernel: Detected PIPT I-cache on CPU1
Dec 12 17:39:35.169388 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Dec 12 17:39:35.169393 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Dec 12 17:39:35.169397 kernel: smp: Brought up 1 node, 2 CPUs
Dec 12 17:39:35.169402 kernel: SMP: Total of 2 processors activated.
Dec 12 17:39:35.169407 kernel: CPU: All CPU(s) started at EL1
Dec 12 17:39:35.169413 kernel: CPU features: detected: 32-bit EL0 Support
Dec 12 17:39:35.169418 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Dec 12 17:39:35.169423 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Dec 12 17:39:35.169427 kernel: CPU features: detected: Common not Private translations
Dec 12 17:39:35.169432 kernel: CPU features: detected: CRC32 instructions
Dec 12 17:39:35.169437 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Dec 12 17:39:35.169442 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Dec 12 17:39:35.169447 kernel: CPU features: detected: LSE atomic instructions
Dec 12 17:39:35.169451 kernel: CPU features: detected: Privileged Access Never
Dec 12 17:39:35.169457 kernel: CPU features: detected: Speculation barrier (SB)
Dec 12 17:39:35.169462 kernel: CPU features: detected: TLB range maintenance instructions
Dec 12 17:39:35.169467 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Dec 12 17:39:35.169471 kernel: CPU features: detected: Scalable Vector Extension
Dec 12 17:39:35.169476 kernel: alternatives: applying system-wide alternatives
Dec 12 17:39:35.169481 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Dec 12 17:39:35.169486 kernel: SVE: maximum available vector length 16 bytes per vector
Dec 12 17:39:35.169491 kernel: SVE: default vector length 16 bytes per vector
Dec 12 17:39:35.169496 kernel: Memory: 3952828K/4194160K available (11200K kernel code, 2456K rwdata, 9084K rodata, 39552K init, 1038K bss, 220144K reserved, 16384K cma-reserved)
Dec 12 17:39:35.169502 kernel: devtmpfs: initialized
Dec 12 17:39:35.169507 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 12 17:39:35.169511 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Dec 12 17:39:35.169516 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Dec 12 17:39:35.169521 kernel: 0 pages in range for non-PLT usage
Dec 12 17:39:35.169526 kernel: 508400 pages in range for PLT usage
Dec 12 17:39:35.169530 kernel: pinctrl core: initialized pinctrl subsystem
Dec 12 17:39:35.169535 kernel: SMBIOS 3.1.0 present.
Dec 12 17:39:35.169541 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 06/10/2025
Dec 12 17:39:35.169546 kernel: DMI: Memory slots populated: 2/2
Dec 12 17:39:35.169551 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 12 17:39:35.169556 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Dec 12 17:39:35.169560 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 12 17:39:35.169565 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 12 17:39:35.169570 kernel: audit: initializing netlink subsys (disabled)
Dec 12 17:39:35.169575 kernel: audit: type=2000 audit(0.059:1): state=initialized audit_enabled=0 res=1
Dec 12 17:39:35.169580 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 12 17:39:35.169585 kernel: cpuidle: using governor menu
Dec 12 17:39:35.169590 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Dec 12 17:39:35.169595 kernel: ASID allocator initialised with 32768 entries
Dec 12 17:39:35.169600 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 12 17:39:35.169604 kernel: Serial: AMBA PL011 UART driver
Dec 12 17:39:35.169609 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 12 17:39:35.169614 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Dec 12 17:39:35.169619 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Dec 12 17:39:35.169623 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Dec 12 17:39:35.169629 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 12 17:39:35.169634 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Dec 12 17:39:35.169639 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Dec 12 17:39:35.169643 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Dec 12 17:39:35.169648 kernel: ACPI: Added _OSI(Module Device)
Dec 12 17:39:35.169653 kernel: ACPI: Added _OSI(Processor Device)
Dec 12 17:39:35.169658 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 12 17:39:35.169663 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 12 17:39:35.169667 kernel: ACPI: Interpreter enabled
Dec 12 17:39:35.169673 kernel: ACPI: Using GIC for interrupt routing
Dec 12 17:39:35.169678 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Dec 12 17:39:35.169683 kernel: printk: legacy console [ttyAMA0] enabled
Dec 12 17:39:35.169687 kernel: printk: legacy bootconsole [pl11] disabled
Dec 12 17:39:35.169692 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Dec 12 17:39:35.169697 kernel: ACPI: CPU0 has been hot-added
Dec 12 17:39:35.169702 kernel: ACPI: CPU1 has been hot-added
Dec 12 17:39:35.169707 kernel: iommu: Default domain type: Translated
Dec 12 17:39:35.169711 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Dec 12 17:39:35.169717 kernel: efivars: Registered efivars operations
Dec 12 17:39:35.169722 kernel: vgaarb: loaded
Dec 12 17:39:35.169727 kernel: clocksource: Switched to clocksource arch_sys_counter
Dec 12 17:39:35.169732 kernel: VFS: Disk quotas dquot_6.6.0
Dec 12 17:39:35.169736 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 12 17:39:35.169741 kernel: pnp: PnP ACPI init
Dec 12 17:39:35.169746 kernel: pnp: PnP ACPI: found 0 devices
Dec 12 17:39:35.169750 kernel: NET: Registered PF_INET protocol family
Dec 12 17:39:35.169755 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 12 17:39:35.169760 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Dec 12 17:39:35.169766 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 12 17:39:35.169771 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 12 17:39:35.169776 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Dec 12 17:39:35.169780 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Dec 12 17:39:35.169785 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Dec 12 17:39:35.169790 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Dec 12 17:39:35.169795 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 12 17:39:35.169800 kernel: PCI: CLS 0 bytes, default 64
Dec 12 17:39:35.169804 kernel: kvm [1]: HYP mode not available
Dec 12 17:39:35.169810 kernel: Initialise system trusted keyrings
Dec 12 17:39:35.169815 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Dec 12 17:39:35.169819 kernel: Key type asymmetric registered
Dec 12 17:39:35.169824 kernel: Asymmetric key parser 'x509' registered
Dec 12 17:39:35.169829 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Dec 12 17:39:35.169834 kernel: io scheduler mq-deadline registered
Dec 12 17:39:35.169838 kernel: io scheduler kyber registered
Dec 12 17:39:35.169843 kernel: io scheduler bfq registered
Dec 12 17:39:35.169848 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 12 17:39:35.169853 kernel: thunder_xcv, ver 1.0
Dec 12 17:39:35.169858 kernel: thunder_bgx, ver 1.0
Dec 12 17:39:35.169863 kernel: nicpf, ver 1.0
Dec 12 17:39:35.169868 kernel: nicvf, ver 1.0
Dec 12 17:39:35.169977 kernel: rtc-efi rtc-efi.0: registered as rtc0
Dec 12 17:39:35.170027 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-12T17:39:34 UTC (1765561174)
Dec 12 17:39:35.170034 kernel: efifb: probing for efifb
Dec 12 17:39:35.170040 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Dec 12 17:39:35.170045 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Dec 12 17:39:35.170050 kernel: efifb: scrolling: redraw
Dec 12 17:39:35.170054 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Dec 12 17:39:35.170059 kernel: Console: switching to colour frame buffer device 128x48
Dec 12 17:39:35.170064 kernel: fb0: EFI VGA frame buffer device
Dec 12 17:39:35.170069 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Dec 12 17:39:35.170074 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 12 17:39:35.170079 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Dec 12 17:39:35.170084 kernel: watchdog: NMI not fully supported
Dec 12 17:39:35.170089 kernel: watchdog: Hard watchdog permanently disabled
Dec 12 17:39:35.170094 kernel: NET: Registered PF_INET6 protocol family
Dec 12 17:39:35.170099 kernel: Segment Routing with IPv6
Dec 12 17:39:35.170103 kernel: In-situ OAM (IOAM) with IPv6
Dec 12 17:39:35.170108 kernel: NET: Registered PF_PACKET protocol family
Dec 12 17:39:35.170113 kernel: Key type dns_resolver registered
Dec 12 17:39:35.170118 kernel: registered taskstats version 1
Dec 12 17:39:35.170123 kernel: Loading compiled-in X.509 certificates
Dec 12 17:39:35.170127 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 92f3a94fb747a7ba7cbcfde1535be91b86f9429a'
Dec 12 17:39:35.170133 kernel: Demotion targets for Node 0: null
Dec 12 17:39:35.170138 kernel: Key type .fscrypt registered
Dec 12 17:39:35.170143 kernel: Key type fscrypt-provisioning registered
Dec 12 17:39:35.170147 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 12 17:39:35.170152 kernel: ima: Allocated hash algorithm: sha1
Dec 12 17:39:35.170157 kernel: ima: No architecture policies found
Dec 12 17:39:35.170162 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Dec 12 17:39:35.170167 kernel: clk: Disabling unused clocks
Dec 12 17:39:35.170180 kernel: PM: genpd: Disabling unused power domains
Dec 12 17:39:35.170186 kernel: Warning: unable to open an initial console.
Dec 12 17:39:35.170191 kernel: Freeing unused kernel memory: 39552K
Dec 12 17:39:35.170196 kernel: Run /init as init process
Dec 12 17:39:35.170200 kernel: with arguments:
Dec 12 17:39:35.170205 kernel: /init
Dec 12 17:39:35.170210 kernel: with environment:
Dec 12 17:39:35.170214 kernel: HOME=/
Dec 12 17:39:35.170219 kernel: TERM=linux
Dec 12 17:39:35.170225 systemd[1]: Successfully made /usr/ read-only.
Dec 12 17:39:35.170232 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 12 17:39:35.170238 systemd[1]: Detected virtualization microsoft.
Dec 12 17:39:35.170243 systemd[1]: Detected architecture arm64.
Dec 12 17:39:35.170248 systemd[1]: Running in initrd.
Dec 12 17:39:35.170253 systemd[1]: No hostname configured, using default hostname.
Dec 12 17:39:35.170259 systemd[1]: Hostname set to .
Dec 12 17:39:35.170264 systemd[1]: Initializing machine ID from random generator.
Dec 12 17:39:35.170270 systemd[1]: Queued start job for default target initrd.target.
Dec 12 17:39:35.170275 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 12 17:39:35.170281 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 12 17:39:35.170286 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Dec 12 17:39:35.170292 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 12 17:39:35.170297 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Dec 12 17:39:35.170303 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Dec 12 17:39:35.170310 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Dec 12 17:39:35.170315 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Dec 12 17:39:35.170320 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 12 17:39:35.170325 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 12 17:39:35.170331 systemd[1]: Reached target paths.target - Path Units.
Dec 12 17:39:35.170336 systemd[1]: Reached target slices.target - Slice Units.
Dec 12 17:39:35.170341 systemd[1]: Reached target swap.target - Swaps.
Dec 12 17:39:35.170346 systemd[1]: Reached target timers.target - Timer Units.
Dec 12 17:39:35.170352 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Dec 12 17:39:35.170357 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 12 17:39:35.170362 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Dec 12 17:39:35.170368 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Dec 12 17:39:35.170373 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 12 17:39:35.170378 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 12 17:39:35.170384 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 12 17:39:35.170389 systemd[1]: Reached target sockets.target - Socket Units.
Dec 12 17:39:35.170394 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Dec 12 17:39:35.170400 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 12 17:39:35.170405 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Dec 12 17:39:35.170411 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Dec 12 17:39:35.170416 systemd[1]: Starting systemd-fsck-usr.service...
Dec 12 17:39:35.170421 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 12 17:39:35.170427 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 12 17:39:35.170443 systemd-journald[225]: Collecting audit messages is disabled.
Dec 12 17:39:35.170458 systemd-journald[225]: Journal started
Dec 12 17:39:35.170471 systemd-journald[225]: Runtime Journal (/run/log/journal/0787659af0cb4689a0bbb16d82c07d06) is 8M, max 78.3M, 70.3M free.
Dec 12 17:39:35.181228 systemd-modules-load[227]: Inserted module 'overlay'
Dec 12 17:39:35.188539 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:39:35.203187 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 12 17:39:35.206994 systemd-modules-load[227]: Inserted module 'br_netfilter'
Dec 12 17:39:35.217278 kernel: Bridge firewalling registered
Dec 12 17:39:35.217308 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 12 17:39:35.224097 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Dec 12 17:39:35.229504 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 12 17:39:35.241452 systemd[1]: Finished systemd-fsck-usr.service.
Dec 12 17:39:35.250845 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 12 17:39:35.258952 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:39:35.270978 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Dec 12 17:39:35.291677 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 12 17:39:35.304295 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 12 17:39:35.318622 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 12 17:39:35.329055 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 12 17:39:35.340630 systemd-tmpfiles[256]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Dec 12 17:39:35.344635 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 12 17:39:35.357664 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 12 17:39:35.368706 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 17:39:35.383354 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Dec 12 17:39:35.402993 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 12 17:39:35.415325 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 12 17:39:35.433216 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 12 17:39:35.449511 dracut-cmdline[263]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52
Dec 12 17:39:35.481305 systemd-resolved[264]: Positive Trust Anchors:
Dec 12 17:39:35.481316 systemd-resolved[264]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 12 17:39:35.481336 systemd-resolved[264]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 12 17:39:35.483303 systemd-resolved[264]: Defaulting to hostname 'linux'.
Dec 12 17:39:35.484958 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 12 17:39:35.490800 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:39:35.588199 kernel: SCSI subsystem initialized
Dec 12 17:39:35.594186 kernel: Loading iSCSI transport class v2.0-870.
Dec 12 17:39:35.602200 kernel: iscsi: registered transport (tcp)
Dec 12 17:39:35.615823 kernel: iscsi: registered transport (qla4xxx)
Dec 12 17:39:35.615835 kernel: QLogic iSCSI HBA Driver
Dec 12 17:39:35.629404 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 12 17:39:35.649717 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 12 17:39:35.656094 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 12 17:39:35.706498 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Dec 12 17:39:35.712455 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Dec 12 17:39:35.780195 kernel: raid6: neonx8 gen() 18541 MB/s
Dec 12 17:39:35.797182 kernel: raid6: neonx4 gen() 18530 MB/s
Dec 12 17:39:35.817180 kernel: raid6: neonx2 gen() 17077 MB/s
Dec 12 17:39:35.838183 kernel: raid6: neonx1 gen() 15007 MB/s
Dec 12 17:39:35.857181 kernel: raid6: int64x8 gen() 10526 MB/s
Dec 12 17:39:35.876267 kernel: raid6: int64x4 gen() 10611 MB/s
Dec 12 17:39:35.897182 kernel: raid6: int64x2 gen() 8982 MB/s
Dec 12 17:39:35.919894 kernel: raid6: int64x1 gen() 7007 MB/s
Dec 12 17:39:35.919903 kernel: raid6: using algorithm neonx8 gen() 18541 MB/s
Dec 12 17:39:35.943085 kernel: raid6: .... xor() 14899 MB/s, rmw enabled
Dec 12 17:39:35.943156 kernel: raid6: using neon recovery algorithm
Dec 12 17:39:35.953842 kernel: xor: measuring software checksum speed
Dec 12 17:39:35.953914 kernel: 8regs : 28665 MB/sec
Dec 12 17:39:35.957071 kernel: 32regs : 28795 MB/sec
Dec 12 17:39:35.960604 kernel: arm64_neon : 37501 MB/sec
Dec 12 17:39:35.964548 kernel: xor: using function: arm64_neon (37501 MB/sec)
Dec 12 17:39:36.004210 kernel: Btrfs loaded, zoned=no, fsverity=no
Dec 12 17:39:36.011222 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Dec 12 17:39:36.023307 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 17:39:36.053266 systemd-udevd[476]: Using default interface naming scheme 'v255'.
Dec 12 17:39:36.061601 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 17:39:36.068602 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Dec 12 17:39:36.118465 dracut-pre-trigger[489]: rd.md=0: removing MD RAID activation
Dec 12 17:39:36.139957 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 12 17:39:36.147297 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 12 17:39:36.200206 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 17:39:36.214729 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Dec 12 17:39:36.262200 kernel: hv_vmbus: Vmbus version:5.3
Dec 12 17:39:36.282192 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 12 17:39:36.304881 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 12 17:39:36.304904 kernel: hv_vmbus: registering driver hid_hyperv
Dec 12 17:39:36.304911 kernel: hv_vmbus: registering driver hyperv_keyboard
Dec 12 17:39:36.304917 kernel: hv_vmbus: registering driver hv_storvsc
Dec 12 17:39:36.304934 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Dec 12 17:39:36.304941 kernel: hv_vmbus: registering driver hv_netvsc
Dec 12 17:39:36.282618 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:39:36.331227 kernel: scsi host1: storvsc_host_t
Dec 12 17:39:36.331917 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input0
Dec 12 17:39:36.331928 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input1
Dec 12 17:39:36.332330 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:39:36.344774 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Dec 12 17:39:36.352208 kernel: scsi host0: storvsc_host_t
Dec 12 17:39:36.359274 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Dec 12 17:39:36.352991 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:39:36.365275 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Dec 12 17:39:36.366086 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 12 17:39:36.395084 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5
Dec 12 17:39:36.366155 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:39:36.378283 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:39:36.429194 kernel: PTP clock support registered
Dec 12 17:39:36.429721 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:39:36.479093 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Dec 12 17:39:36.721731 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Dec 12 17:39:36.721836 kernel: hv_utils: Registering HyperV Utility Driver
Dec 12 17:39:36.721844 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Dec 12 17:39:36.721921 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 12 17:39:36.721927 kernel: hv_vmbus: registering driver hv_utils
Dec 12 17:39:36.721941 kernel: hv_utils: Heartbeat IC version 3.0
Dec 12 17:39:36.721947 kernel: sd 0:0:0:0: [sda] Write Protect is off
Dec 12 17:39:36.722009 kernel: hv_utils: Shutdown IC version 3.2
Dec 12 17:39:36.722016 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Dec 12 17:39:36.722077 kernel: hv_utils: TimeSync IC version 4.0
Dec 12 17:39:36.722083 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Dec 12 17:39:36.721690 systemd-resolved[264]: Clock change detected. Flushing caches.
Dec 12 17:39:36.739839 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Dec 12 17:39:36.739971 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#80 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Dec 12 17:39:36.740044 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#87 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Dec 12 17:39:36.761698 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Dec 12 17:39:36.761746 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Dec 12 17:39:36.782553 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#143 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Dec 12 17:39:36.806562 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#165 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Dec 12 17:39:37.236531 kernel: hv_netvsc 000d3ac5-f17f-000d-3ac5-f17f000d3ac5 eth0: VF slot 1 added
Dec 12 17:39:37.242523 kernel: hv_vmbus: registering driver hv_pci
Dec 12 17:39:37.250076 kernel: hv_pci 3ab26d95-573e-4609-8348-74383a883c8a: PCI VMBus probing: Using version 0x10004
Dec 12 17:39:37.250278 kernel: hv_pci 3ab26d95-573e-4609-8348-74383a883c8a: PCI host bridge to bus 573e:00
Dec 12 17:39:37.261083 kernel: pci_bus 573e:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Dec 12 17:39:37.266790 kernel: pci_bus 573e:00: No busn resource found for root bus, will use [bus 00-ff]
Dec 12 17:39:37.508914 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Dec 12 17:39:37.541980 kernel: pci 573e:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint
Dec 12 17:39:37.542059 kernel: pci 573e:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]
Dec 12 17:39:37.535835 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Dec 12 17:39:37.557302 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Dec 12 17:39:37.571631 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Dec 12 17:39:37.602418 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Dec 12 17:39:37.669766 kernel: pci 573e:00:02.0: enabling Extended Tags
Dec 12 17:39:37.784612 kernel: pci 573e:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 573e:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link)
Dec 12 17:39:37.795636 kernel: pci_bus 573e:00: busn_res: [bus 00-ff] end is updated to 00
Dec 12 17:39:37.795804 kernel: pci 573e:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned
Dec 12 17:39:37.855164 kernel: mlx5_core 573e:00:02.0: enabling device (0000 -> 0002)
Dec 12 17:39:37.866092 kernel: mlx5_core 573e:00:02.0: PTM is not supported by PCIe
Dec 12 17:39:37.866255 kernel: mlx5_core 573e:00:02.0: firmware version: 16.30.5006
Dec 12 17:39:37.870776 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Dec 12 17:39:38.085350 kernel: hv_netvsc 000d3ac5-f17f-000d-3ac5-f17f000d3ac5 eth0: VF registering: eth1
Dec 12 17:39:38.085604 kernel: mlx5_core 573e:00:02.0 eth1: joined to eth0
Dec 12 17:39:38.091554 kernel: mlx5_core 573e:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Dec 12 17:39:38.103532 kernel: mlx5_core 573e:00:02.0 enP22334s1: renamed from eth1
Dec 12 17:39:38.115798 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Dec 12 17:39:38.120920 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 12 17:39:38.132171 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 12 17:39:38.143831 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 12 17:39:38.154308 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Dec 12 17:39:38.181132 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Dec 12 17:39:38.784530 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#77 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Dec 12 17:39:38.798146 disk-uuid[629]: The operation has completed successfully.
Dec 12 17:39:38.802081 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Dec 12 17:39:38.867445 systemd[1]: disk-uuid.service: Deactivated successfully.
Dec 12 17:39:38.869527 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Dec 12 17:39:38.902571 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Dec 12 17:39:38.919815 sh[825]: Success
Dec 12 17:39:38.954908 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 12 17:39:38.954970 kernel: device-mapper: uevent: version 1.0.3
Dec 12 17:39:38.960118 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Dec 12 17:39:38.970527 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Dec 12 17:39:39.265833 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Dec 12 17:39:39.278676 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Dec 12 17:39:39.284893 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Dec 12 17:39:39.315517 kernel: BTRFS: device fsid 6d6d314d-b8a1-4727-8a34-8525e276a248 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (843)
Dec 12 17:39:39.327487 kernel: BTRFS info (device dm-0): first mount of filesystem 6d6d314d-b8a1-4727-8a34-8525e276a248
Dec 12 17:39:39.327532 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:39:39.621738 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Dec 12 17:39:39.621828 kernel: BTRFS info (device dm-0): enabling free space tree
Dec 12 17:39:39.657611 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Dec 12 17:39:39.661930 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Dec 12 17:39:39.670758 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Dec 12 17:39:39.671407 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Dec 12 17:39:39.696164 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Dec 12 17:39:39.733553 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (871)
Dec 12 17:39:39.746936 kernel: BTRFS info (device sda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:39:39.747001 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:39:39.774099 kernel: BTRFS info (device sda6): turning on async discard
Dec 12 17:39:39.774165 kernel: BTRFS info (device sda6): enabling free space tree
Dec 12 17:39:39.783553 kernel: BTRFS info (device sda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:39:39.784322 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Dec 12 17:39:39.791061 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Dec 12 17:39:39.832116 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 12 17:39:39.844867 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 12 17:39:39.879588 systemd-networkd[1012]: lo: Link UP
Dec 12 17:39:39.879599 systemd-networkd[1012]: lo: Gained carrier
Dec 12 17:39:39.880317 systemd-networkd[1012]: Enumeration completed
Dec 12 17:39:39.882713 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 12 17:39:39.887800 systemd[1]: Reached target network.target - Network.
Dec 12 17:39:39.889579 systemd-networkd[1012]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:39:39.889583 systemd-networkd[1012]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 12 17:39:39.965525 kernel: mlx5_core 573e:00:02.0 enP22334s1: Link up
Dec 12 17:39:40.003779 kernel: hv_netvsc 000d3ac5-f17f-000d-3ac5-f17f000d3ac5 eth0: Data path switched to VF: enP22334s1
Dec 12 17:39:40.003609 systemd-networkd[1012]: enP22334s1: Link UP
Dec 12 17:39:40.003666 systemd-networkd[1012]: eth0: Link UP
Dec 12 17:39:40.003762 systemd-networkd[1012]: eth0: Gained carrier
Dec 12 17:39:40.003777 systemd-networkd[1012]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 17:39:40.024720 systemd-networkd[1012]: enP22334s1: Gained carrier
Dec 12 17:39:40.045545 systemd-networkd[1012]: eth0: DHCPv4 address 10.200.20.11/24, gateway 10.200.20.1 acquired from 168.63.129.16
Dec 12 17:39:41.400261 ignition[965]: Ignition 2.22.0
Dec 12 17:39:41.400277 ignition[965]: Stage: fetch-offline
Dec 12 17:39:41.406539 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 12 17:39:41.400368 ignition[965]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:39:41.414088 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Dec 12 17:39:41.400376 ignition[965]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Dec 12 17:39:41.400444 ignition[965]: parsed url from cmdline: ""
Dec 12 17:39:41.400447 ignition[965]: no config URL provided
Dec 12 17:39:41.400450 ignition[965]: reading system config file "/usr/lib/ignition/user.ign"
Dec 12 17:39:41.400455 ignition[965]: no config at "/usr/lib/ignition/user.ign"
Dec 12 17:39:41.400459 ignition[965]: failed to fetch config: resource requires networking
Dec 12 17:39:41.400597 ignition[965]: Ignition finished successfully
Dec 12 17:39:41.459395 ignition[1023]: Ignition 2.22.0
Dec 12 17:39:41.459411 ignition[1023]: Stage: fetch
Dec 12 17:39:41.459589 ignition[1023]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:39:41.459596 ignition[1023]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Dec 12 17:39:41.459652 ignition[1023]: parsed url from cmdline: ""
Dec 12 17:39:41.459654 ignition[1023]: no config URL provided
Dec 12 17:39:41.459657 ignition[1023]: reading system config file "/usr/lib/ignition/user.ign"
Dec 12 17:39:41.459663 ignition[1023]: no config at "/usr/lib/ignition/user.ign"
Dec 12 17:39:41.459678 ignition[1023]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Dec 12 17:39:41.541642 ignition[1023]: GET result: OK
Dec 12 17:39:41.544578 ignition[1023]: config has been read from IMDS userdata
Dec 12 17:39:41.544601 ignition[1023]: parsing config with SHA512: 74e5032f4d4effcc646b7aac8937016cf384b2a5ee348ee4889f45bcda00fe44415b7681c3f5ef05b2f16f442bf78b0f8a6cda3a664599b1bd06f343c455a348
Dec 12 17:39:41.547310 unknown[1023]: fetched base config from "system"
Dec 12 17:39:41.547672 ignition[1023]: fetch: fetch complete
Dec 12 17:39:41.547320 unknown[1023]: fetched base config from "system"
Dec 12 17:39:41.547676 ignition[1023]: fetch: fetch passed
Dec 12 17:39:41.547323 unknown[1023]: fetched user config from "azure"
Dec 12 17:39:41.547716 ignition[1023]: Ignition finished successfully
Dec 12 17:39:41.551347 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Dec 12 17:39:41.557964 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Dec 12 17:39:41.596049 ignition[1029]: Ignition 2.22.0
Dec 12 17:39:41.596062 ignition[1029]: Stage: kargs
Dec 12 17:39:41.596248 ignition[1029]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:39:41.607166 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Dec 12 17:39:41.596260 ignition[1029]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Dec 12 17:39:41.596794 ignition[1029]: kargs: kargs passed
Dec 12 17:39:41.616799 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Dec 12 17:39:41.596835 ignition[1029]: Ignition finished successfully
Dec 12 17:39:41.649619 ignition[1035]: Ignition 2.22.0
Dec 12 17:39:41.649629 ignition[1035]: Stage: disks
Dec 12 17:39:41.652748 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Dec 12 17:39:41.649797 ignition[1035]: no configs at "/usr/lib/ignition/base.d"
Dec 12 17:39:41.657780 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Dec 12 17:39:41.649805 ignition[1035]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Dec 12 17:39:41.664918 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Dec 12 17:39:41.650264 ignition[1035]: disks: disks passed
Dec 12 17:39:41.674076 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 12 17:39:41.650303 ignition[1035]: Ignition finished successfully
Dec 12 17:39:41.683336 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 12 17:39:41.692120 systemd[1]: Reached target basic.target - Basic System.
Dec 12 17:39:41.701750 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Dec 12 17:39:41.779965 systemd-fsck[1043]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks
Dec 12 17:39:41.784922 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Dec 12 17:39:41.791757 systemd[1]: Mounting sysroot.mount - /sysroot...
Dec 12 17:39:42.032527 kernel: EXT4-fs (sda9): mounted filesystem 895d7845-d0e8-43ae-a778-7804b473b868 r/w with ordered data mode. Quota mode: none.
Dec 12 17:39:42.032811 systemd[1]: Mounted sysroot.mount - /sysroot.
Dec 12 17:39:42.037190 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Dec 12 17:39:42.060731 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 12 17:39:42.065413 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Dec 12 17:39:42.083780 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Dec 12 17:39:42.089025 systemd-networkd[1012]: eth0: Gained IPv6LL
Dec 12 17:39:42.089709 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Dec 12 17:39:42.089740 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 12 17:39:42.107020 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Dec 12 17:39:42.123065 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Dec 12 17:39:42.148555 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1057)
Dec 12 17:39:42.158510 kernel: BTRFS info (device sda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:39:42.158543 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:39:42.169892 kernel: BTRFS info (device sda6): turning on async discard
Dec 12 17:39:42.169937 kernel: BTRFS info (device sda6): enabling free space tree
Dec 12 17:39:42.172090 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 12 17:39:42.990517 coreos-metadata[1059]: Dec 12 17:39:42.990 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Dec 12 17:39:42.999679 coreos-metadata[1059]: Dec 12 17:39:42.999 INFO Fetch successful
Dec 12 17:39:43.004046 coreos-metadata[1059]: Dec 12 17:39:43.003 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Dec 12 17:39:43.012428 coreos-metadata[1059]: Dec 12 17:39:43.012 INFO Fetch successful
Dec 12 17:39:43.024849 coreos-metadata[1059]: Dec 12 17:39:43.024 INFO wrote hostname ci-4459.2.2-a-9f5170e2ca to /sysroot/etc/hostname
Dec 12 17:39:43.033319 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Dec 12 17:39:43.730412 initrd-setup-root[1087]: cut: /sysroot/etc/passwd: No such file or directory
Dec 12 17:39:43.768390 initrd-setup-root[1094]: cut: /sysroot/etc/group: No such file or directory
Dec 12 17:39:43.787755 initrd-setup-root[1101]: cut: /sysroot/etc/shadow: No such file or directory
Dec 12 17:39:43.807132 initrd-setup-root[1108]: cut: /sysroot/etc/gshadow: No such file or directory
Dec 12 17:39:44.781096 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Dec 12 17:39:44.787250 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Dec 12 17:39:44.801954 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Dec 12 17:39:44.815400 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Dec 12 17:39:44.827568 kernel: BTRFS info (device sda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:39:44.841272 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Dec 12 17:39:44.858577 ignition[1177]: INFO : Ignition 2.22.0
Dec 12 17:39:44.858577 ignition[1177]: INFO : Stage: mount
Dec 12 17:39:44.866558 ignition[1177]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 12 17:39:44.866558 ignition[1177]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Dec 12 17:39:44.866558 ignition[1177]: INFO : mount: mount passed
Dec 12 17:39:44.866558 ignition[1177]: INFO : Ignition finished successfully
Dec 12 17:39:44.863379 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Dec 12 17:39:44.872745 systemd[1]: Starting ignition-files.service - Ignition (files)...
Dec 12 17:39:44.904611 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 12 17:39:44.939527 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1188)
Dec 12 17:39:44.950509 kernel: BTRFS info (device sda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f
Dec 12 17:39:44.950556 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Dec 12 17:39:44.960187 kernel: BTRFS info (device sda6): turning on async discard
Dec 12 17:39:44.960243 kernel: BTRFS info (device sda6): enabling free space tree
Dec 12 17:39:44.961604 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 12 17:39:44.987266 ignition[1205]: INFO : Ignition 2.22.0
Dec 12 17:39:44.987266 ignition[1205]: INFO : Stage: files
Dec 12 17:39:44.993431 ignition[1205]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 12 17:39:44.993431 ignition[1205]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Dec 12 17:39:44.993431 ignition[1205]: DEBUG : files: compiled without relabeling support, skipping
Dec 12 17:39:45.007985 ignition[1205]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Dec 12 17:39:45.007985 ignition[1205]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Dec 12 17:39:45.047930 ignition[1205]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Dec 12 17:39:45.054312 ignition[1205]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Dec 12 17:39:45.060621 ignition[1205]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Dec 12 17:39:45.054485 unknown[1205]: wrote ssh authorized keys file for user: core
Dec 12 17:39:45.136903 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Dec 12 17:39:45.146290 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Dec 12 17:39:45.166809 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Dec 12 17:39:45.244547 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Dec 12 17:39:45.244547 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Dec 12 17:39:45.262755 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Dec 12 17:39:45.262755 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Dec 12 17:39:45.262755 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Dec 12 17:39:45.262755 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 12 17:39:45.262755 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 12 17:39:45.262755 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 12 17:39:45.262755 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 12 17:39:45.321184 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Dec 12 17:39:45.321184 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Dec 12 17:39:45.321184 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw"
Dec 12 17:39:45.321184 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw"
Dec 12 17:39:45.321184 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw"
Dec 12 17:39:45.321184 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-arm64.raw: attempt #1
Dec 12 17:39:45.902084 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Dec 12 17:39:46.099219 ignition[1205]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw"
Dec 12 17:39:46.110825 ignition[1205]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Dec 12 17:39:46.144330 ignition[1205]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 12 17:39:46.160130 ignition[1205]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 12 17:39:46.160130 ignition[1205]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Dec 12 17:39:46.160130 ignition[1205]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Dec 12 17:39:46.184777 ignition[1205]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Dec 12 17:39:46.184777 ignition[1205]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Dec 12 17:39:46.184777 ignition[1205]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Dec 12 17:39:46.184777 ignition[1205]: INFO : files: files passed
Dec 12 17:39:46.184777 ignition[1205]: INFO : Ignition finished successfully
Dec 12 17:39:46.176125 systemd[1]: Finished ignition-files.service - Ignition (files).
Dec 12 17:39:46.191838 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Dec 12 17:39:46.226643 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Dec 12 17:39:46.242240 systemd[1]: ignition-quench.service: Deactivated successfully.
Dec 12 17:39:46.242333 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Dec 12 17:39:46.271902 initrd-setup-root-after-ignition[1238]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 12 17:39:46.280523 initrd-setup-root-after-ignition[1234]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 12 17:39:46.280523 initrd-setup-root-after-ignition[1234]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Dec 12 17:39:46.273436 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 12 17:39:46.287642 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Dec 12 17:39:46.301057 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Dec 12 17:39:46.347148 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 12 17:39:46.347255 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Dec 12 17:39:46.358164 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Dec 12 17:39:46.368576 systemd[1]: Reached target initrd.target - Initrd Default Target.
Dec 12 17:39:46.378293 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Dec 12 17:39:46.379093 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Dec 12 17:39:46.411843 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 12 17:39:46.419718 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Dec 12 17:39:46.443162 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:39:46.449205 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 12 17:39:46.459669 systemd[1]: Stopped target timers.target - Timer Units.
Dec 12 17:39:46.469430 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 12 17:39:46.469551 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 12 17:39:46.483991 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Dec 12 17:39:46.488972 systemd[1]: Stopped target basic.target - Basic System.
Dec 12 17:39:46.498855 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Dec 12 17:39:46.509149 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 12 17:39:46.519187 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Dec 12 17:39:46.529880 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Dec 12 17:39:46.541354 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Dec 12 17:39:46.550920 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 12 17:39:46.562083 systemd[1]: Stopped target sysinit.target - System Initialization.
Dec 12 17:39:46.572055 systemd[1]: Stopped target local-fs.target - Local File Systems.
Dec 12 17:39:46.582601 systemd[1]: Stopped target swap.target - Swaps.
Dec 12 17:39:46.591099 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 12 17:39:46.591225 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Dec 12 17:39:46.605017 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Dec 12 17:39:46.611713 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 12 17:39:46.623001 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Dec 12 17:39:46.627913 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 12 17:39:46.633649 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 12 17:39:46.633756 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Dec 12 17:39:46.649719 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Dec 12 17:39:46.649810 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 12 17:39:46.656522 systemd[1]: ignition-files.service: Deactivated successfully.
Dec 12 17:39:46.656594 systemd[1]: Stopped ignition-files.service - Ignition (files).
Dec 12 17:39:46.664997 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Dec 12 17:39:46.665064 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Dec 12 17:39:46.743574 ignition[1258]: INFO : Ignition 2.22.0
Dec 12 17:39:46.743574 ignition[1258]: INFO : Stage: umount
Dec 12 17:39:46.743574 ignition[1258]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 12 17:39:46.743574 ignition[1258]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Dec 12 17:39:46.743574 ignition[1258]: INFO : umount: umount passed
Dec 12 17:39:46.743574 ignition[1258]: INFO : Ignition finished successfully
Dec 12 17:39:46.682681 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Dec 12 17:39:46.707479 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Dec 12 17:39:46.715906 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 12 17:39:46.716052 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 17:39:46.729207 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 12 17:39:46.729298 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 12 17:39:46.753713 systemd[1]: ignition-mount.service: Deactivated successfully.
Dec 12 17:39:46.753809 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Dec 12 17:39:46.763914 systemd[1]: ignition-disks.service: Deactivated successfully.
Dec 12 17:39:46.764137 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 12 17:39:46.776671 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 12 17:39:46.776722 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 12 17:39:46.787350 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 12 17:39:46.787393 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 12 17:39:46.792893 systemd[1]: Stopped target network.target - Network. Dec 12 17:39:46.801370 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 12 17:39:46.801425 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 12 17:39:46.811606 systemd[1]: Stopped target paths.target - Path Units. Dec 12 17:39:46.820042 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 12 17:39:46.825666 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 17:39:46.832920 systemd[1]: Stopped target slices.target - Slice Units. Dec 12 17:39:46.838270 systemd[1]: Stopped target sockets.target - Socket Units. Dec 12 17:39:46.848961 systemd[1]: iscsid.socket: Deactivated successfully. Dec 12 17:39:46.849003 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 12 17:39:46.860016 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 12 17:39:46.860040 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 12 17:39:46.871120 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 12 17:39:46.871170 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 12 17:39:46.881306 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 12 17:39:46.881335 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 12 17:39:46.887265 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
Dec 12 17:39:46.896520 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 12 17:39:46.912873 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 12 17:39:46.913359 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 12 17:39:46.913447 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 12 17:39:47.146912 kernel: hv_netvsc 000d3ac5-f17f-000d-3ac5-f17f000d3ac5 eth0: Data path switched from VF: enP22334s1 Dec 12 17:39:46.926264 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Dec 12 17:39:46.926499 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 12 17:39:46.926601 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 12 17:39:46.941859 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Dec 12 17:39:46.942060 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 12 17:39:46.942142 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 12 17:39:46.955116 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 12 17:39:46.964148 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 12 17:39:46.964191 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 12 17:39:46.985623 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 12 17:39:47.001078 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 12 17:39:47.001164 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 12 17:39:47.014952 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 12 17:39:47.014994 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 12 17:39:47.028296 systemd[1]: systemd-modules-load.service: Deactivated successfully. 
Dec 12 17:39:47.028338 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 12 17:39:47.033550 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 12 17:39:47.033581 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 17:39:47.049683 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 17:39:47.064309 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 12 17:39:47.064375 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Dec 12 17:39:47.074143 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 12 17:39:47.084916 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 17:39:47.091992 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 12 17:39:47.092054 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 12 17:39:47.101086 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 12 17:39:47.101117 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 17:39:47.111927 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 12 17:39:47.111974 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 12 17:39:47.125323 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 12 17:39:47.125373 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 12 17:39:47.141388 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 12 17:39:47.141442 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 12 17:39:47.153564 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Dec 12 17:39:47.170522 systemd[1]: systemd-network-generator.service: Deactivated successfully. 
Dec 12 17:39:47.170600 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 17:39:47.176462 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 12 17:39:47.176519 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 17:39:47.194222 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Dec 12 17:39:47.194288 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 12 17:39:47.206248 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Dec 12 17:39:47.427607 systemd-journald[225]: Received SIGTERM from PID 1 (systemd). Dec 12 17:39:47.206303 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 17:39:47.212126 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 17:39:47.212174 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:39:47.228055 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Dec 12 17:39:47.228100 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Dec 12 17:39:47.228123 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Dec 12 17:39:47.228149 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 12 17:39:47.228449 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 12 17:39:47.228572 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 12 17:39:47.237762 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 12 17:39:47.237839 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 12 17:39:47.247477 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. 
Dec 12 17:39:47.247564 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 12 17:39:47.257636 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 12 17:39:47.267666 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 12 17:39:47.267773 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 12 17:39:47.277217 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 12 17:39:47.305608 systemd[1]: Switching root. Dec 12 17:39:47.520485 systemd-journald[225]: Journal stopped Dec 12 17:39:51.696932 kernel: SELinux: policy capability network_peer_controls=1 Dec 12 17:39:51.696963 kernel: SELinux: policy capability open_perms=1 Dec 12 17:39:51.696971 kernel: SELinux: policy capability extended_socket_class=1 Dec 12 17:39:51.696978 kernel: SELinux: policy capability always_check_network=0 Dec 12 17:39:51.696983 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 12 17:39:51.696991 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 12 17:39:51.696997 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 12 17:39:51.697002 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 12 17:39:51.697008 kernel: SELinux: policy capability userspace_initial_context=0 Dec 12 17:39:51.697013 kernel: audit: type=1403 audit(1765561188.409:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Dec 12 17:39:51.697020 systemd[1]: Successfully loaded SELinux policy in 179.926ms. Dec 12 17:39:51.697028 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.437ms. 
Dec 12 17:39:51.697035 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 12 17:39:51.697041 systemd[1]: Detected virtualization microsoft. Dec 12 17:39:51.697048 systemd[1]: Detected architecture arm64. Dec 12 17:39:51.697054 systemd[1]: Detected first boot. Dec 12 17:39:51.697062 systemd[1]: Hostname set to . Dec 12 17:39:51.697069 systemd[1]: Initializing machine ID from random generator. Dec 12 17:39:51.697075 zram_generator::config[1303]: No configuration found. Dec 12 17:39:51.697082 kernel: NET: Registered PF_VSOCK protocol family Dec 12 17:39:51.697087 systemd[1]: Populated /etc with preset unit settings. Dec 12 17:39:51.697094 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Dec 12 17:39:51.697100 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 12 17:39:51.697107 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 12 17:39:51.697113 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 12 17:39:51.697119 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 12 17:39:51.697126 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 12 17:39:51.697132 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 12 17:39:51.697138 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 12 17:39:51.697144 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 12 17:39:51.697151 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
Dec 12 17:39:51.697157 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 12 17:39:51.697163 systemd[1]: Created slice user.slice - User and Session Slice. Dec 12 17:39:51.697169 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 17:39:51.697175 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 17:39:51.697181 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 12 17:39:51.697188 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 12 17:39:51.697194 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 12 17:39:51.697202 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 12 17:39:51.697208 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Dec 12 17:39:51.697216 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 17:39:51.697222 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 12 17:39:51.697228 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 12 17:39:51.697234 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 12 17:39:51.697240 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 12 17:39:51.697247 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 12 17:39:51.697254 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 17:39:51.697260 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 12 17:39:51.697266 systemd[1]: Reached target slices.target - Slice Units. Dec 12 17:39:51.697272 systemd[1]: Reached target swap.target - Swaps. 
Dec 12 17:39:51.697278 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Dec 12 17:39:51.697284 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 12 17:39:51.697292 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 12 17:39:51.697298 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 12 17:39:51.697304 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 12 17:39:51.697310 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 17:39:51.697317 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 12 17:39:51.697323 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 12 17:39:51.697329 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 12 17:39:51.697337 systemd[1]: Mounting media.mount - External Media Directory... Dec 12 17:39:51.697343 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 12 17:39:51.697349 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 12 17:39:51.697355 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 12 17:39:51.697362 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 12 17:39:51.697368 systemd[1]: Reached target machines.target - Containers. Dec 12 17:39:51.697374 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 12 17:39:51.697381 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:39:51.697388 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Dec 12 17:39:51.697394 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 12 17:39:51.697401 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:39:51.697407 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 12 17:39:51.697413 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 17:39:51.697419 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 12 17:39:51.697425 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 17:39:51.697432 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 12 17:39:51.697438 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 12 17:39:51.697446 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 12 17:39:51.697452 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 12 17:39:51.697459 systemd[1]: Stopped systemd-fsck-usr.service. Dec 12 17:39:51.697465 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:39:51.697472 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 12 17:39:51.697478 kernel: loop: module loaded Dec 12 17:39:51.697484 kernel: fuse: init (API version 7.41) Dec 12 17:39:51.697489 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 12 17:39:51.697496 kernel: ACPI: bus type drm_connector registered Dec 12 17:39:51.697527 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 12 17:39:51.697534 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... 
Dec 12 17:39:51.697541 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 12 17:39:51.697576 systemd-journald[1393]: Collecting audit messages is disabled. Dec 12 17:39:51.697593 systemd-journald[1393]: Journal started Dec 12 17:39:51.697609 systemd-journald[1393]: Runtime Journal (/run/log/journal/1925a678ca9f47be8df53d8c45e61c1e) is 8M, max 78.3M, 70.3M free. Dec 12 17:39:50.835518 systemd[1]: Queued start job for default target multi-user.target. Dec 12 17:39:50.839963 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Dec 12 17:39:50.840370 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 12 17:39:50.840659 systemd[1]: systemd-journald.service: Consumed 2.819s CPU time. Dec 12 17:39:51.706544 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 12 17:39:51.722350 systemd[1]: verity-setup.service: Deactivated successfully. Dec 12 17:39:51.722403 systemd[1]: Stopped verity-setup.service. Dec 12 17:39:51.740721 systemd[1]: Started systemd-journald.service - Journal Service. Dec 12 17:39:51.741606 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 12 17:39:51.748744 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 12 17:39:51.754964 systemd[1]: Mounted media.mount - External Media Directory. Dec 12 17:39:51.760587 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 12 17:39:51.766482 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 12 17:39:51.772786 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 12 17:39:51.778582 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 12 17:39:51.786555 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 17:39:51.793418 systemd[1]: modprobe@configfs.service: Deactivated successfully. 
Dec 12 17:39:51.793694 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 12 17:39:51.799544 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 17:39:51.799751 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 17:39:51.805850 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 12 17:39:51.806053 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 12 17:39:51.811201 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 17:39:51.811327 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 17:39:51.817011 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 12 17:39:51.817137 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 12 17:39:51.823017 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 17:39:51.823137 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 17:39:51.828794 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 12 17:39:51.834638 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 17:39:51.841257 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 12 17:39:51.847152 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 12 17:39:51.854023 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 12 17:39:51.867875 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 12 17:39:51.874612 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 12 17:39:51.890599 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
Dec 12 17:39:51.895984 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 12 17:39:51.896015 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 12 17:39:51.902246 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 12 17:39:51.909867 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 12 17:39:51.914872 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:39:51.923630 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 12 17:39:51.930176 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 12 17:39:51.936036 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 12 17:39:51.936915 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 12 17:39:51.942319 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 17:39:51.943083 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 12 17:39:51.948316 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 12 17:39:51.956623 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 12 17:39:51.964146 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 12 17:39:51.971846 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 12 17:39:51.981630 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. 
Dec 12 17:39:51.987623 systemd-journald[1393]: Time spent on flushing to /var/log/journal/1925a678ca9f47be8df53d8c45e61c1e is 8.563ms for 936 entries. Dec 12 17:39:51.987623 systemd-journald[1393]: System Journal (/var/log/journal/1925a678ca9f47be8df53d8c45e61c1e) is 8M, max 2.6G, 2.6G free. Dec 12 17:39:52.035369 systemd-journald[1393]: Received client request to flush runtime journal. Dec 12 17:39:52.035422 kernel: loop0: detected capacity change from 0 to 100632 Dec 12 17:39:51.993317 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 12 17:39:52.001405 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Dec 12 17:39:52.036949 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 12 17:39:52.051380 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 12 17:39:52.077262 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 12 17:39:52.079635 systemd-tmpfiles[1444]: ACLs are not supported, ignoring. Dec 12 17:39:52.080184 systemd-tmpfiles[1444]: ACLs are not supported, ignoring. Dec 12 17:39:52.080635 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Dec 12 17:39:52.089547 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 12 17:39:52.100491 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 12 17:39:52.239262 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 12 17:39:52.245724 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 12 17:39:52.263259 systemd-tmpfiles[1459]: ACLs are not supported, ignoring. Dec 12 17:39:52.263575 systemd-tmpfiles[1459]: ACLs are not supported, ignoring. Dec 12 17:39:52.266108 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Dec 12 17:39:52.466728 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 12 17:39:52.520524 kernel: loop1: detected capacity change from 0 to 27936 Dec 12 17:39:52.543826 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 12 17:39:52.551135 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 17:39:52.577750 systemd-udevd[1465]: Using default interface naming scheme 'v255'. Dec 12 17:39:52.771704 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 17:39:52.787705 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 12 17:39:52.829613 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Dec 12 17:39:52.861750 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 12 17:39:52.935768 kernel: loop2: detected capacity change from 0 to 200800 Dec 12 17:39:52.935861 kernel: mousedev: PS/2 mouse device common for all mice Dec 12 17:39:52.936847 systemd[1]: Started systemd-userdbd.service - User Database Manager. 
Dec 12 17:39:52.949530 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#189 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Dec 12 17:39:52.987423 kernel: hv_vmbus: registering driver hv_balloon Dec 12 17:39:52.987522 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0 Dec 12 17:39:52.992495 kernel: hv_balloon: Memory hot add disabled on ARM64 Dec 12 17:39:53.008525 kernel: loop3: detected capacity change from 0 to 119840 Dec 12 17:39:53.042525 kernel: hv_vmbus: registering driver hyperv_fb Dec 12 17:39:53.053343 kernel: hyperv_fb: Synthvid Version major 3, minor 5 Dec 12 17:39:53.053426 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608 Dec 12 17:39:53.060131 kernel: Console: switching to colour dummy device 80x25 Dec 12 17:39:53.068559 kernel: Console: switching to colour frame buffer device 128x48 Dec 12 17:39:53.094214 systemd-networkd[1495]: lo: Link UP Dec 12 17:39:53.094540 systemd-networkd[1495]: lo: Gained carrier Dec 12 17:39:53.095496 systemd-networkd[1495]: Enumeration completed Dec 12 17:39:53.095732 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:39:53.096989 systemd-networkd[1495]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:39:53.097054 systemd-networkd[1495]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 17:39:53.105330 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 12 17:39:53.113749 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 12 17:39:53.123589 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 12 17:39:53.150532 kernel: mlx5_core 573e:00:02.0 enP22334s1: Link up Dec 12 17:39:53.171480 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Dec 12 17:39:53.173543 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:39:53.181698 kernel: hv_netvsc 000d3ac5-f17f-000d-3ac5-f17f000d3ac5 eth0: Data path switched to VF: enP22334s1 Dec 12 17:39:53.184484 systemd-networkd[1495]: enP22334s1: Link UP Dec 12 17:39:53.184631 systemd-networkd[1495]: eth0: Link UP Dec 12 17:39:53.184634 systemd-networkd[1495]: eth0: Gained carrier Dec 12 17:39:53.184652 systemd-networkd[1495]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:39:53.186076 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 12 17:39:53.187756 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:39:53.195783 systemd-networkd[1495]: enP22334s1: Gained carrier Dec 12 17:39:53.199727 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 12 17:39:53.208924 systemd-networkd[1495]: eth0: DHCPv4 address 10.200.20.11/24, gateway 10.200.20.1 acquired from 168.63.129.16 Dec 12 17:39:53.234074 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Dec 12 17:39:53.242377 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 12 17:39:53.259522 kernel: MACsec IEEE 802.1AE Dec 12 17:39:53.298145 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. 
Dec 12 17:39:53.394532 kernel: loop4: detected capacity change from 0 to 100632 Dec 12 17:39:53.408535 kernel: loop5: detected capacity change from 0 to 27936 Dec 12 17:39:53.423540 kernel: loop6: detected capacity change from 0 to 200800 Dec 12 17:39:53.447603 kernel: loop7: detected capacity change from 0 to 119840 Dec 12 17:39:53.467749 (sd-merge)[1611]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'. Dec 12 17:39:53.468143 (sd-merge)[1611]: Merged extensions into '/usr'. Dec 12 17:39:53.471432 systemd[1]: Reload requested from client PID 1442 ('systemd-sysext') (unit systemd-sysext.service)... Dec 12 17:39:53.471449 systemd[1]: Reloading... Dec 12 17:39:53.525523 zram_generator::config[1641]: No configuration found. Dec 12 17:39:53.693776 systemd[1]: Reloading finished in 221 ms. Dec 12 17:39:53.718563 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:39:53.724136 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 12 17:39:53.736453 systemd[1]: Starting ensure-sysext.service... Dec 12 17:39:53.743654 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 12 17:39:53.759539 systemd[1]: Reload requested from client PID 1699 ('systemctl') (unit ensure-sysext.service)... Dec 12 17:39:53.759553 systemd[1]: Reloading... Dec 12 17:39:53.805308 systemd-tmpfiles[1700]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Dec 12 17:39:53.805331 systemd-tmpfiles[1700]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 12 17:39:53.805906 systemd-tmpfiles[1700]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 12 17:39:53.806091 systemd-tmpfiles[1700]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. 
Dec 12 17:39:53.807251 systemd-tmpfiles[1700]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Dec 12 17:39:53.807559 systemd-tmpfiles[1700]: ACLs are not supported, ignoring. Dec 12 17:39:53.807677 systemd-tmpfiles[1700]: ACLs are not supported, ignoring. Dec 12 17:39:53.816567 zram_generator::config[1733]: No configuration found. Dec 12 17:39:53.827563 systemd-tmpfiles[1700]: Detected autofs mount point /boot during canonicalization of boot. Dec 12 17:39:53.827574 systemd-tmpfiles[1700]: Skipping /boot Dec 12 17:39:53.834091 systemd-tmpfiles[1700]: Detected autofs mount point /boot during canonicalization of boot. Dec 12 17:39:53.834100 systemd-tmpfiles[1700]: Skipping /boot Dec 12 17:39:53.978905 systemd[1]: Reloading finished in 219 ms. Dec 12 17:39:53.992371 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 17:39:54.005935 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 17:39:54.018165 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 12 17:39:54.025948 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:39:54.027801 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:39:54.038923 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 17:39:54.047735 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 17:39:54.052295 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:39:54.052397 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
Dec 12 17:39:54.053710 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Dec 12 17:39:54.066325 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 12 17:39:54.074824 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Dec 12 17:39:54.082278 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 12 17:39:54.083833 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 12 17:39:54.091149 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 12 17:39:54.091298 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 12 17:39:54.098677 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 12 17:39:54.098914 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 12 17:39:54.108749 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 17:39:54.116920 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 12 17:39:54.126876 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 12 17:39:54.138185 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 12 17:39:54.145290 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 17:39:54.145641 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 17:39:54.149171 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 12 17:39:54.149632 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 12 17:39:54.156008 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 12 17:39:54.156150 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 12 17:39:54.161963 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 12 17:39:54.162093 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 12 17:39:54.170131 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Dec 12 17:39:54.176492 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Dec 12 17:39:54.188322 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 17:39:54.189657 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 12 17:39:54.206466 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 12 17:39:54.212489 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 12 17:39:54.213841 systemd-resolved[1795]: Positive Trust Anchors:
Dec 12 17:39:54.213851 systemd-resolved[1795]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 12 17:39:54.213871 systemd-resolved[1795]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 12 17:39:54.219760 systemd-resolved[1795]: Using system hostname 'ci-4459.2.2-a-9f5170e2ca'.
Dec 12 17:39:54.221722 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 12 17:39:54.227715 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 17:39:54.227833 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 17:39:54.227953 systemd[1]: Reached target time-set.target - System Time Set.
Dec 12 17:39:54.234820 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 12 17:39:54.240963 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 12 17:39:54.241263 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 12 17:39:54.247418 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 12 17:39:54.247571 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 12 17:39:54.252916 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 12 17:39:54.253051 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 12 17:39:54.258876 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 12 17:39:54.258997 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 12 17:39:54.262058 augenrules[1831]: No rules
Dec 12 17:39:54.265203 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 12 17:39:54.265355 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 12 17:39:54.273675 systemd[1]: Finished ensure-sysext.service.
Dec 12 17:39:54.278892 systemd[1]: Reached target network.target - Network.
Dec 12 17:39:54.283481 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:39:54.288852 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 12 17:39:54.288912 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 12 17:39:54.437712 systemd-networkd[1495]: eth0: Gained IPv6LL
Dec 12 17:39:54.440375 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Dec 12 17:39:54.446979 systemd[1]: Reached target network-online.target - Network is Online.
Dec 12 17:39:54.837397 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Dec 12 17:39:54.843235 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Dec 12 17:39:57.719142 ldconfig[1437]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Dec 12 17:39:57.730002 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Dec 12 17:39:57.737727 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Dec 12 17:39:57.754104 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Dec 12 17:39:57.760274 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 12 17:39:57.765318 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Dec 12 17:39:57.771851 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Dec 12 17:39:57.777883 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Dec 12 17:39:57.783426 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Dec 12 17:39:57.789122 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Dec 12 17:39:57.794795 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Dec 12 17:39:57.794827 systemd[1]: Reached target paths.target - Path Units.
Dec 12 17:39:57.799555 systemd[1]: Reached target timers.target - Timer Units.
Dec 12 17:39:57.823587 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Dec 12 17:39:57.830113 systemd[1]: Starting docker.socket - Docker Socket for the API...
Dec 12 17:39:57.835945 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Dec 12 17:39:57.842014 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Dec 12 17:39:57.848405 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Dec 12 17:39:57.854809 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Dec 12 17:39:57.859710 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Dec 12 17:39:57.866349 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Dec 12 17:39:57.871330 systemd[1]: Reached target sockets.target - Socket Units.
Dec 12 17:39:57.876336 systemd[1]: Reached target basic.target - Basic System.
Dec 12 17:39:57.880846 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Dec 12 17:39:57.880868 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Dec 12 17:39:57.883098 systemd[1]: Starting chronyd.service - NTP client/server...
Dec 12 17:39:57.897242 systemd[1]: Starting containerd.service - containerd container runtime...
Dec 12 17:39:57.904635 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Dec 12 17:39:57.912666 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Dec 12 17:39:57.920002 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Dec 12 17:39:57.928626 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Dec 12 17:39:57.935755 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Dec 12 17:39:57.941231 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Dec 12 17:39:57.944603 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Dec 12 17:39:57.950362 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Dec 12 17:39:57.951793 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:39:57.959262 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Dec 12 17:39:57.960386 KVP[1859]: KVP starting; pid is:1859
Dec 12 17:39:57.963406 jq[1857]: false
Dec 12 17:39:57.970347 kernel: hv_utils: KVP IC version 4.0
Dec 12 17:39:57.967861 chronyd[1849]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Dec 12 17:39:57.968106 KVP[1859]: KVP LIC Version: 3.1
Dec 12 17:39:57.970921 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Dec 12 17:39:57.980301 extend-filesystems[1858]: Found /dev/sda6
Dec 12 17:39:57.980640 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Dec 12 17:39:57.992731 chronyd[1849]: Timezone right/UTC failed leap second check, ignoring
Dec 12 17:39:57.992936 chronyd[1849]: Loaded seccomp filter (level 2)
Dec 12 17:39:57.997762 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Dec 12 17:39:58.005071 extend-filesystems[1858]: Found /dev/sda9
Dec 12 17:39:58.014776 extend-filesystems[1858]: Checking size of /dev/sda9
Dec 12 17:39:58.008960 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Dec 12 17:39:58.025672 systemd[1]: Starting systemd-logind.service - User Login Management...
Dec 12 17:39:58.032106 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Dec 12 17:39:58.032645 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Dec 12 17:39:58.036736 systemd[1]: Starting update-engine.service - Update Engine...
Dec 12 17:39:58.049877 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Dec 12 17:39:58.058827 systemd[1]: Started chronyd.service - NTP client/server.
Dec 12 17:39:58.070044 extend-filesystems[1858]: Old size kept for /dev/sda9
Dec 12 17:39:58.078891 jq[1888]: true
Dec 12 17:39:58.081536 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Dec 12 17:39:58.089084 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Dec 12 17:39:58.090523 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Dec 12 17:39:58.090847 systemd[1]: extend-filesystems.service: Deactivated successfully.
Dec 12 17:39:58.095548 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Dec 12 17:39:58.105846 systemd[1]: motdgen.service: Deactivated successfully.
Dec 12 17:39:58.106027 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Dec 12 17:39:58.117269 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Dec 12 17:39:58.125900 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Dec 12 17:39:58.126062 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Dec 12 17:39:58.126821 systemd-logind[1877]: New seat seat0.
Dec 12 17:39:58.137694 systemd-logind[1877]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 12 17:39:58.139022 systemd[1]: Started systemd-logind.service - User Login Management.
Dec 12 17:39:58.170154 update_engine[1881]: I20251212 17:39:58.169687 1881 main.cc:92] Flatcar Update Engine starting
Dec 12 17:39:58.182113 (ntainerd)[1916]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Dec 12 17:39:58.184571 jq[1915]: true
Dec 12 17:39:58.230899 tar[1910]: linux-arm64/LICENSE
Dec 12 17:39:58.231129 tar[1910]: linux-arm64/helm
Dec 12 17:39:58.289644 dbus-daemon[1852]: [system] SELinux support is enabled
Dec 12 17:39:58.290200 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Dec 12 17:39:58.298976 bash[1971]: Updated "/home/core/.ssh/authorized_keys"
Dec 12 17:39:58.301420 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Dec 12 17:39:58.301448 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Dec 12 17:39:58.315409 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Dec 12 17:39:58.315431 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Dec 12 17:39:58.324123 update_engine[1881]: I20251212 17:39:58.324065 1881 update_check_scheduler.cc:74] Next update check in 5m8s
Dec 12 17:39:58.329574 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Dec 12 17:39:58.343923 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Dec 12 17:39:58.345625 systemd[1]: Started update-engine.service - Update Engine.
Dec 12 17:39:58.350913 dbus-daemon[1852]: [system] Successfully activated service 'org.freedesktop.systemd1'
Dec 12 17:39:58.356413 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Dec 12 17:39:58.400925 coreos-metadata[1851]: Dec 12 17:39:58.400 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Dec 12 17:39:58.407090 coreos-metadata[1851]: Dec 12 17:39:58.407 INFO Fetch successful
Dec 12 17:39:58.407467 coreos-metadata[1851]: Dec 12 17:39:58.407 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Dec 12 17:39:58.412798 coreos-metadata[1851]: Dec 12 17:39:58.412 INFO Fetch successful
Dec 12 17:39:58.412798 coreos-metadata[1851]: Dec 12 17:39:58.412 INFO Fetching http://168.63.129.16/machine/ff732f79-0d94-4275-a3cc-e68070f58fee/0d209bf1%2Ddd43%2D442a%2D9bb0%2D88892f65ceb6.%5Fci%2D4459.2.2%2Da%2D9f5170e2ca?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Dec 12 17:39:58.417092 coreos-metadata[1851]: Dec 12 17:39:58.416 INFO Fetch successful
Dec 12 17:39:58.417092 coreos-metadata[1851]: Dec 12 17:39:58.416 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Dec 12 17:39:58.427355 coreos-metadata[1851]: Dec 12 17:39:58.427 INFO Fetch successful
Dec 12 17:39:58.468969 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Dec 12 17:39:58.478725 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Dec 12 17:39:58.675488 sshd_keygen[1880]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Dec 12 17:39:58.678621 tar[1910]: linux-arm64/README.md
Dec 12 17:39:58.698548 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Dec 12 17:39:58.706157 locksmithd[1997]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Dec 12 17:39:58.708598 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Dec 12 17:39:58.719531 systemd[1]: Starting issuegen.service - Generate /run/issue...
Dec 12 17:39:58.729307 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Dec 12 17:39:58.736959 systemd[1]: issuegen.service: Deactivated successfully.
Dec 12 17:39:58.737144 systemd[1]: Finished issuegen.service - Generate /run/issue.
Dec 12 17:39:58.747798 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Dec 12 17:39:58.761653 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Dec 12 17:39:58.767720 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Dec 12 17:39:58.777730 systemd[1]: Started getty@tty1.service - Getty on tty1.
Dec 12 17:39:58.782827 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Dec 12 17:39:58.788525 systemd[1]: Reached target getty.target - Login Prompts.
Dec 12 17:39:58.931129 containerd[1916]: time="2025-12-12T17:39:58Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Dec 12 17:39:58.933091 containerd[1916]: time="2025-12-12T17:39:58.933054632Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7
Dec 12 17:39:58.943819 containerd[1916]: time="2025-12-12T17:39:58.943775640Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.872µs"
Dec 12 17:39:58.943945 containerd[1916]: time="2025-12-12T17:39:58.943928800Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Dec 12 17:39:58.943990 containerd[1916]: time="2025-12-12T17:39:58.943979040Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Dec 12 17:39:58.944169 containerd[1916]: time="2025-12-12T17:39:58.944152000Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Dec 12 17:39:58.944230 containerd[1916]: time="2025-12-12T17:39:58.944217344Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Dec 12 17:39:58.944302 containerd[1916]: time="2025-12-12T17:39:58.944289440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 12 17:39:58.944406 containerd[1916]: time="2025-12-12T17:39:58.944387688Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 12 17:39:58.944449 containerd[1916]: time="2025-12-12T17:39:58.944437328Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 12 17:39:58.944744 containerd[1916]: time="2025-12-12T17:39:58.944720512Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 12 17:39:58.944819 containerd[1916]: time="2025-12-12T17:39:58.944805736Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 12 17:39:58.944879 containerd[1916]: time="2025-12-12T17:39:58.944867456Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 12 17:39:58.944932 containerd[1916]: time="2025-12-12T17:39:58.944919872Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Dec 12 17:39:58.945060 containerd[1916]: time="2025-12-12T17:39:58.945043224Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Dec 12 17:39:58.945321 containerd[1916]: time="2025-12-12T17:39:58.945296784Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 12 17:39:58.945409 containerd[1916]: time="2025-12-12T17:39:58.945397128Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 12 17:39:58.945449 containerd[1916]: time="2025-12-12T17:39:58.945439376Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Dec 12 17:39:58.945542 containerd[1916]: time="2025-12-12T17:39:58.945529896Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Dec 12 17:39:58.945778 containerd[1916]: time="2025-12-12T17:39:58.945760072Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Dec 12 17:39:58.945893 containerd[1916]: time="2025-12-12T17:39:58.945879656Z" level=info msg="metadata content store policy set" policy=shared
Dec 12 17:39:58.963599 containerd[1916]: time="2025-12-12T17:39:58.963544232Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Dec 12 17:39:58.963687 containerd[1916]: time="2025-12-12T17:39:58.963624312Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Dec 12 17:39:58.963687 containerd[1916]: time="2025-12-12T17:39:58.963636896Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Dec 12 17:39:58.963687 containerd[1916]: time="2025-12-12T17:39:58.963646688Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Dec 12 17:39:58.963687 containerd[1916]: time="2025-12-12T17:39:58.963655472Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Dec 12 17:39:58.963687 containerd[1916]: time="2025-12-12T17:39:58.963662032Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Dec 12 17:39:58.963687 containerd[1916]: time="2025-12-12T17:39:58.963669936Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Dec 12 17:39:58.963687 containerd[1916]: time="2025-12-12T17:39:58.963677472Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Dec 12 17:39:58.963687 containerd[1916]: time="2025-12-12T17:39:58.963685048Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Dec 12 17:39:58.963819 containerd[1916]: time="2025-12-12T17:39:58.963703112Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Dec 12 17:39:58.963819 containerd[1916]: time="2025-12-12T17:39:58.963709920Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Dec 12 17:39:58.963819 containerd[1916]: time="2025-12-12T17:39:58.963718400Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.963855632Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.963874768Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.963885096Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.963892208Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.963899200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.963905808Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.963913888Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.963920104Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.963930592Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.963937992Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.963944288Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.963986744Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.963996360Z" level=info msg="Start snapshots syncer"
Dec 12 17:39:58.964213 containerd[1916]: time="2025-12-12T17:39:58.964014784Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Dec 12 17:39:58.964416 containerd[1916]: time="2025-12-12T17:39:58.964192824Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Dec 12 17:39:58.964416 containerd[1916]: time="2025-12-12T17:39:58.964225504Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964255384Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964339656Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964353432Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964361360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964367592Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964375288Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964381936Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964389152Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964407360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964414648Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964421256Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964442760Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964453952Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Dec 12 17:39:58.965835 containerd[1916]: time="2025-12-12T17:39:58.964459352Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Dec 12 17:39:58.966038 containerd[1916]: time="2025-12-12T17:39:58.964465584Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Dec 12 17:39:58.966038 containerd[1916]: time="2025-12-12T17:39:58.964470368Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Dec 12 17:39:58.966038 containerd[1916]: time="2025-12-12T17:39:58.964497048Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Dec 12 17:39:58.966038 containerd[1916]: time="2025-12-12T17:39:58.964521032Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Dec 12 17:39:58.966038 containerd[1916]: time="2025-12-12T17:39:58.964532736Z" level=info msg="runtime interface created"
Dec 12 17:39:58.966038 containerd[1916]: time="2025-12-12T17:39:58.964536128Z" level=info msg="created NRI interface"
Dec 12 17:39:58.966038 containerd[1916]: time="2025-12-12T17:39:58.964541016Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Dec 12 17:39:58.966038 containerd[1916]: time="2025-12-12T17:39:58.964549160Z" level=info msg="Connect containerd service"
Dec 12 17:39:58.966038 containerd[1916]: time="2025-12-12T17:39:58.964564360Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Dec 12 17:39:58.966038 containerd[1916]: time="2025-12-12T17:39:58.965565120Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Dec 12 17:39:59.009000 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:39:59.078016 (kubelet)[2051]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:39:59.399259 kubelet[2051]: E1212 17:39:59.399144 2051 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:39:59.401295 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:39:59.401408 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:39:59.403588 systemd[1]: kubelet.service: Consumed 510ms CPU time, 246.1M memory peak.
Dec 12 17:39:59.476326 containerd[1916]: time="2025-12-12T17:39:59.476266016Z" level=info msg="Start subscribing containerd event" Dec 12 17:39:59.476326 containerd[1916]: time="2025-12-12T17:39:59.476334984Z" level=info msg="Start recovering state" Dec 12 17:39:59.476448 containerd[1916]: time="2025-12-12T17:39:59.476416104Z" level=info msg="Start event monitor" Dec 12 17:39:59.476448 containerd[1916]: time="2025-12-12T17:39:59.476426376Z" level=info msg="Start cni network conf syncer for default" Dec 12 17:39:59.476448 containerd[1916]: time="2025-12-12T17:39:59.476430864Z" level=info msg="Start streaming server" Dec 12 17:39:59.476448 containerd[1916]: time="2025-12-12T17:39:59.476437600Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 12 17:39:59.476448 containerd[1916]: time="2025-12-12T17:39:59.476442560Z" level=info msg="runtime interface starting up..." Dec 12 17:39:59.476448 containerd[1916]: time="2025-12-12T17:39:59.476448752Z" level=info msg="starting plugins..." Dec 12 17:39:59.476555 containerd[1916]: time="2025-12-12T17:39:59.476462400Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 12 17:39:59.477017 containerd[1916]: time="2025-12-12T17:39:59.476991544Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 12 17:39:59.477048 containerd[1916]: time="2025-12-12T17:39:59.477042592Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 12 17:39:59.477337 systemd[1]: Started containerd.service - containerd container runtime. Dec 12 17:39:59.482634 containerd[1916]: time="2025-12-12T17:39:59.482601392Z" level=info msg="containerd successfully booted in 0.551853s" Dec 12 17:39:59.486131 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 12 17:39:59.491566 systemd[1]: Startup finished in 1.665s (kernel) + 13.344s (initrd) + 11.259s (userspace) = 26.269s. 
Dec 12 17:39:59.741482 login[2038]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying Dec 12 17:39:59.742404 login[2037]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:39:59.748948 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 12 17:39:59.750392 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 12 17:39:59.755963 systemd-logind[1877]: New session 2 of user core. Dec 12 17:39:59.763555 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 12 17:39:59.767128 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 12 17:39:59.775290 (systemd)[2070]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 12 17:39:59.777231 systemd-logind[1877]: New session c1 of user core. Dec 12 17:39:59.897755 systemd[2070]: Queued start job for default target default.target. Dec 12 17:39:59.902685 systemd[2070]: Created slice app.slice - User Application Slice. Dec 12 17:39:59.902713 systemd[2070]: Reached target paths.target - Paths. Dec 12 17:39:59.902743 systemd[2070]: Reached target timers.target - Timers. Dec 12 17:39:59.903737 systemd[2070]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 12 17:39:59.911647 systemd[2070]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 12 17:39:59.911694 systemd[2070]: Reached target sockets.target - Sockets. Dec 12 17:39:59.911728 systemd[2070]: Reached target basic.target - Basic System. Dec 12 17:39:59.911748 systemd[2070]: Reached target default.target - Main User Target. Dec 12 17:39:59.911770 systemd[2070]: Startup finished in 129ms. Dec 12 17:39:59.911827 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 12 17:39:59.912734 systemd[1]: Started session-2.scope - Session 2 of User core. 
Dec 12 17:40:00.279725 waagent[2035]: 2025-12-12T17:40:00.279659Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Dec 12 17:40:00.284456 waagent[2035]: 2025-12-12T17:40:00.284414Z INFO Daemon Daemon OS: flatcar 4459.2.2 Dec 12 17:40:00.288572 waagent[2035]: 2025-12-12T17:40:00.288538Z INFO Daemon Daemon Python: 3.11.13 Dec 12 17:40:00.291978 waagent[2035]: 2025-12-12T17:40:00.291926Z INFO Daemon Daemon Run daemon Dec 12 17:40:00.295049 waagent[2035]: 2025-12-12T17:40:00.295012Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4459.2.2' Dec 12 17:40:00.301909 waagent[2035]: 2025-12-12T17:40:00.301880Z INFO Daemon Daemon Using waagent for provisioning Dec 12 17:40:00.306508 waagent[2035]: 2025-12-12T17:40:00.306467Z INFO Daemon Daemon Activate resource disk Dec 12 17:40:00.310540 waagent[2035]: 2025-12-12T17:40:00.310507Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Dec 12 17:40:00.319842 waagent[2035]: 2025-12-12T17:40:00.319800Z INFO Daemon Daemon Found device: None Dec 12 17:40:00.323844 waagent[2035]: 2025-12-12T17:40:00.323814Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Dec 12 17:40:00.330491 waagent[2035]: 2025-12-12T17:40:00.330464Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Dec 12 17:40:00.339841 waagent[2035]: 2025-12-12T17:40:00.339806Z INFO Daemon Daemon Clean protocol and wireserver endpoint Dec 12 17:40:00.344408 waagent[2035]: 2025-12-12T17:40:00.344379Z INFO Daemon Daemon Running default provisioning handler Dec 12 17:40:00.353603 waagent[2035]: 2025-12-12T17:40:00.353567Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. 
Dec 12 17:40:00.364119 waagent[2035]: 2025-12-12T17:40:00.364082Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Dec 12 17:40:00.371457 waagent[2035]: 2025-12-12T17:40:00.371424Z INFO Daemon Daemon cloud-init is enabled: False Dec 12 17:40:00.375256 waagent[2035]: 2025-12-12T17:40:00.375232Z INFO Daemon Daemon Copying ovf-env.xml Dec 12 17:40:00.425154 waagent[2035]: 2025-12-12T17:40:00.425076Z INFO Daemon Daemon Successfully mounted dvd Dec 12 17:40:00.453866 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Dec 12 17:40:00.456213 waagent[2035]: 2025-12-12T17:40:00.456156Z INFO Daemon Daemon Detect protocol endpoint Dec 12 17:40:00.460204 waagent[2035]: 2025-12-12T17:40:00.460165Z INFO Daemon Daemon Clean protocol and wireserver endpoint Dec 12 17:40:00.464386 waagent[2035]: 2025-12-12T17:40:00.464358Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler Dec 12 17:40:00.469803 waagent[2035]: 2025-12-12T17:40:00.469777Z INFO Daemon Daemon Test for route to 168.63.129.16 Dec 12 17:40:00.474180 waagent[2035]: 2025-12-12T17:40:00.474151Z INFO Daemon Daemon Route to 168.63.129.16 exists Dec 12 17:40:00.478882 waagent[2035]: 2025-12-12T17:40:00.478847Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Dec 12 17:40:00.525340 waagent[2035]: 2025-12-12T17:40:00.525297Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Dec 12 17:40:00.530988 waagent[2035]: 2025-12-12T17:40:00.530930Z INFO Daemon Daemon Wire protocol version:2012-11-30 Dec 12 17:40:00.535191 waagent[2035]: 2025-12-12T17:40:00.535165Z INFO Daemon Daemon Server preferred version:2015-04-05 Dec 12 17:40:00.700732 waagent[2035]: 2025-12-12T17:40:00.700641Z INFO Daemon Daemon Initializing goal state during protocol detection Dec 12 17:40:00.706309 waagent[2035]: 2025-12-12T17:40:00.706262Z INFO Daemon Daemon Forcing an update of the goal state. 
Dec 12 17:40:00.714721 waagent[2035]: 2025-12-12T17:40:00.714681Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Dec 12 17:40:00.733047 waagent[2035]: 2025-12-12T17:40:00.733014Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.177 Dec 12 17:40:00.737934 waagent[2035]: 2025-12-12T17:40:00.737900Z INFO Daemon Dec 12 17:40:00.740312 waagent[2035]: 2025-12-12T17:40:00.740283Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 118ec0d4-6d07-4ddc-b408-a63d541834c7 eTag: 2346852255265084758 source: Fabric] Dec 12 17:40:00.742571 login[2038]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:40:00.750355 waagent[2035]: 2025-12-12T17:40:00.750322Z INFO Daemon The vmSettings originated via Fabric; will ignore them. Dec 12 17:40:00.756629 waagent[2035]: 2025-12-12T17:40:00.756332Z INFO Daemon Dec 12 17:40:00.758366 systemd-logind[1877]: New session 1 of user core. Dec 12 17:40:00.759517 waagent[2035]: 2025-12-12T17:40:00.758726Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Dec 12 17:40:00.768636 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 12 17:40:00.769579 waagent[2035]: 2025-12-12T17:40:00.769548Z INFO Daemon Daemon Downloading artifacts profile blob Dec 12 17:40:00.836057 waagent[2035]: 2025-12-12T17:40:00.835948Z INFO Daemon Downloaded certificate {'thumbprint': 'BF75443AA25D201275F6DEA2D03F86FBC518979E', 'hasPrivateKey': True} Dec 12 17:40:00.844629 waagent[2035]: 2025-12-12T17:40:00.844590Z INFO Daemon Fetch goal state completed Dec 12 17:40:00.854940 waagent[2035]: 2025-12-12T17:40:00.854907Z INFO Daemon Daemon Starting provisioning Dec 12 17:40:00.859052 waagent[2035]: 2025-12-12T17:40:00.859020Z INFO Daemon Daemon Handle ovf-env.xml. 
Dec 12 17:40:00.862763 waagent[2035]: 2025-12-12T17:40:00.862738Z INFO Daemon Daemon Set hostname [ci-4459.2.2-a-9f5170e2ca] Dec 12 17:40:00.886287 waagent[2035]: 2025-12-12T17:40:00.886229Z INFO Daemon Daemon Publish hostname [ci-4459.2.2-a-9f5170e2ca] Dec 12 17:40:00.891525 waagent[2035]: 2025-12-12T17:40:00.891473Z INFO Daemon Daemon Examine /proc/net/route for primary interface Dec 12 17:40:00.896445 waagent[2035]: 2025-12-12T17:40:00.896412Z INFO Daemon Daemon Primary interface is [eth0] Dec 12 17:40:00.906986 systemd-networkd[1495]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:40:00.906992 systemd-networkd[1495]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 17:40:00.907023 systemd-networkd[1495]: eth0: DHCP lease lost Dec 12 17:40:00.908077 waagent[2035]: 2025-12-12T17:40:00.908025Z INFO Daemon Daemon Create user account if not exists Dec 12 17:40:00.912423 waagent[2035]: 2025-12-12T17:40:00.912391Z INFO Daemon Daemon User core already exists, skip useradd Dec 12 17:40:00.917358 waagent[2035]: 2025-12-12T17:40:00.917327Z INFO Daemon Daemon Configure sudoer Dec 12 17:40:00.925702 waagent[2035]: 2025-12-12T17:40:00.925650Z INFO Daemon Daemon Configure sshd Dec 12 17:40:00.933742 waagent[2035]: 2025-12-12T17:40:00.933693Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Dec 12 17:40:00.944640 waagent[2035]: 2025-12-12T17:40:00.944603Z INFO Daemon Daemon Deploy ssh public key. 
Dec 12 17:40:00.951352 systemd-networkd[1495]: eth0: DHCPv4 address 10.200.20.11/24, gateway 10.200.20.1 acquired from 168.63.129.16 Dec 12 17:40:02.021528 waagent[2035]: 2025-12-12T17:40:02.021350Z INFO Daemon Daemon Provisioning complete Dec 12 17:40:02.036349 waagent[2035]: 2025-12-12T17:40:02.036312Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Dec 12 17:40:02.041491 waagent[2035]: 2025-12-12T17:40:02.041458Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. Dec 12 17:40:02.049617 waagent[2035]: 2025-12-12T17:40:02.049588Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Dec 12 17:40:02.148536 waagent[2120]: 2025-12-12T17:40:02.148396Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Dec 12 17:40:02.149537 waagent[2120]: 2025-12-12T17:40:02.148870Z INFO ExtHandler ExtHandler OS: flatcar 4459.2.2 Dec 12 17:40:02.149537 waagent[2120]: 2025-12-12T17:40:02.148945Z INFO ExtHandler ExtHandler Python: 3.11.13 Dec 12 17:40:02.149537 waagent[2120]: 2025-12-12T17:40:02.148987Z INFO ExtHandler ExtHandler CPU Arch: aarch64 Dec 12 17:40:02.189235 waagent[2120]: 2025-12-12T17:40:02.189163Z INFO ExtHandler ExtHandler Distro: flatcar-4459.2.2; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Dec 12 17:40:02.189756 waagent[2120]: 2025-12-12T17:40:02.189721Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Dec 12 17:40:02.189879 waagent[2120]: 2025-12-12T17:40:02.189858Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Dec 12 17:40:02.196127 waagent[2120]: 2025-12-12T17:40:02.196082Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Dec 12 17:40:02.201772 waagent[2120]: 2025-12-12T17:40:02.201737Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.177 Dec 12 
17:40:02.202224 waagent[2120]: 2025-12-12T17:40:02.202192Z INFO ExtHandler Dec 12 17:40:02.202337 waagent[2120]: 2025-12-12T17:40:02.202317Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 97c7f002-a72b-4632-aaa0-1fbe5aa231b2 eTag: 2346852255265084758 source: Fabric] Dec 12 17:40:02.202702 waagent[2120]: 2025-12-12T17:40:02.202671Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Dec 12 17:40:02.203194 waagent[2120]: 2025-12-12T17:40:02.203163Z INFO ExtHandler Dec 12 17:40:02.203304 waagent[2120]: 2025-12-12T17:40:02.203282Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Dec 12 17:40:02.207008 waagent[2120]: 2025-12-12T17:40:02.206980Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Dec 12 17:40:02.257526 waagent[2120]: 2025-12-12T17:40:02.256771Z INFO ExtHandler Downloaded certificate {'thumbprint': 'BF75443AA25D201275F6DEA2D03F86FBC518979E', 'hasPrivateKey': True} Dec 12 17:40:02.257526 waagent[2120]: 2025-12-12T17:40:02.257188Z INFO ExtHandler Fetch goal state completed Dec 12 17:40:02.270436 waagent[2120]: 2025-12-12T17:40:02.270382Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.2 1 Jul 2025 (Library: OpenSSL 3.4.2 1 Jul 2025) Dec 12 17:40:02.273798 waagent[2120]: 2025-12-12T17:40:02.273709Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2120 Dec 12 17:40:02.273855 waagent[2120]: 2025-12-12T17:40:02.273833Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Dec 12 17:40:02.274086 waagent[2120]: 2025-12-12T17:40:02.274055Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Dec 12 17:40:02.275166 waagent[2120]: 2025-12-12T17:40:02.275132Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4459.2.2', '', 'Flatcar Container Linux by Kinvolk'] Dec 12 17:40:02.275476 waagent[2120]: 
2025-12-12T17:40:02.275447Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4459.2.2', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Dec 12 17:40:02.275618 waagent[2120]: 2025-12-12T17:40:02.275593Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Dec 12 17:40:02.276025 waagent[2120]: 2025-12-12T17:40:02.275995Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Dec 12 17:40:02.311758 waagent[2120]: 2025-12-12T17:40:02.311720Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Dec 12 17:40:02.311931 waagent[2120]: 2025-12-12T17:40:02.311903Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Dec 12 17:40:02.316234 waagent[2120]: 2025-12-12T17:40:02.316209Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Dec 12 17:40:02.321002 systemd[1]: Reload requested from client PID 2135 ('systemctl') (unit waagent.service)... Dec 12 17:40:02.321017 systemd[1]: Reloading... Dec 12 17:40:02.381582 zram_generator::config[2177]: No configuration found. Dec 12 17:40:02.529159 systemd[1]: Reloading finished in 207 ms. Dec 12 17:40:02.543118 waagent[2120]: 2025-12-12T17:40:02.540600Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Dec 12 17:40:02.543118 waagent[2120]: 2025-12-12T17:40:02.540736Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Dec 12 17:40:02.725541 waagent[2120]: 2025-12-12T17:40:02.725247Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. Dec 12 17:40:02.725654 waagent[2120]: 2025-12-12T17:40:02.725577Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. 
configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Dec 12 17:40:02.726240 waagent[2120]: 2025-12-12T17:40:02.726197Z INFO ExtHandler ExtHandler Starting env monitor service. Dec 12 17:40:02.726545 waagent[2120]: 2025-12-12T17:40:02.726467Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Dec 12 17:40:02.727275 waagent[2120]: 2025-12-12T17:40:02.726709Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Dec 12 17:40:02.727275 waagent[2120]: 2025-12-12T17:40:02.726779Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Dec 12 17:40:02.727275 waagent[2120]: 2025-12-12T17:40:02.726931Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Dec 12 17:40:02.727275 waagent[2120]: 2025-12-12T17:40:02.727056Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Dec 12 17:40:02.727275 waagent[2120]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Dec 12 17:40:02.727275 waagent[2120]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0 Dec 12 17:40:02.727275 waagent[2120]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Dec 12 17:40:02.727275 waagent[2120]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Dec 12 17:40:02.727275 waagent[2120]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Dec 12 17:40:02.727275 waagent[2120]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Dec 12 17:40:02.727617 waagent[2120]: 2025-12-12T17:40:02.727578Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Dec 12 17:40:02.727739 waagent[2120]: 2025-12-12T17:40:02.727704Z INFO ExtHandler ExtHandler Start Extension Telemetry service. 
Dec 12 17:40:02.728051 waagent[2120]: 2025-12-12T17:40:02.728009Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Dec 12 17:40:02.728086 waagent[2120]: 2025-12-12T17:40:02.728071Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Dec 12 17:40:02.728201 waagent[2120]: 2025-12-12T17:40:02.728175Z INFO EnvHandler ExtHandler Configure routes Dec 12 17:40:02.728242 waagent[2120]: 2025-12-12T17:40:02.728223Z INFO EnvHandler ExtHandler Gateway:None Dec 12 17:40:02.728269 waagent[2120]: 2025-12-12T17:40:02.728255Z INFO EnvHandler ExtHandler Routes:None Dec 12 17:40:02.728399 waagent[2120]: 2025-12-12T17:40:02.728361Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Dec 12 17:40:02.728557 waagent[2120]: 2025-12-12T17:40:02.728528Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. Dec 12 17:40:02.728817 waagent[2120]: 2025-12-12T17:40:02.728761Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Dec 12 17:40:02.734794 waagent[2120]: 2025-12-12T17:40:02.734755Z INFO ExtHandler ExtHandler Dec 12 17:40:02.734841 waagent[2120]: 2025-12-12T17:40:02.734821Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 5fa93463-8e6e-4b3b-bd3d-1fbcf8de494f correlation 133f691e-7375-409d-a41d-5345c686dcba created: 2025-12-12T17:39:02.175331Z] Dec 12 17:40:02.735108 waagent[2120]: 2025-12-12T17:40:02.735076Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Dec 12 17:40:02.735504 waagent[2120]: 2025-12-12T17:40:02.735476Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms] Dec 12 17:40:02.757574 waagent[2120]: 2025-12-12T17:40:02.757458Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Dec 12 17:40:02.757574 waagent[2120]: Try `iptables -h' or 'iptables --help' for more information.) Dec 12 17:40:02.757828 waagent[2120]: 2025-12-12T17:40:02.757797Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 6FC5347C-FA8D-42C8-AD5D-7406DBA5110C;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Dec 12 17:40:02.809239 waagent[2120]: 2025-12-12T17:40:02.809119Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Dec 12 17:40:02.809239 waagent[2120]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Dec 12 17:40:02.809239 waagent[2120]: pkts bytes target prot opt in out source destination Dec 12 17:40:02.809239 waagent[2120]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Dec 12 17:40:02.809239 waagent[2120]: pkts bytes target prot opt in out source destination Dec 12 17:40:02.809239 waagent[2120]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Dec 12 17:40:02.809239 waagent[2120]: pkts bytes target prot opt in out source destination Dec 12 17:40:02.809239 waagent[2120]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Dec 12 17:40:02.809239 waagent[2120]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Dec 12 17:40:02.809239 waagent[2120]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Dec 12 17:40:02.811597 waagent[2120]: 2025-12-12T17:40:02.811452Z INFO EnvHandler ExtHandler Current Firewall rules: Dec 12 17:40:02.811597 waagent[2120]: Chain INPUT (policy ACCEPT 0 
packets, 0 bytes) Dec 12 17:40:02.811597 waagent[2120]: pkts bytes target prot opt in out source destination Dec 12 17:40:02.811597 waagent[2120]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Dec 12 17:40:02.811597 waagent[2120]: pkts bytes target prot opt in out source destination Dec 12 17:40:02.811597 waagent[2120]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Dec 12 17:40:02.811597 waagent[2120]: pkts bytes target prot opt in out source destination Dec 12 17:40:02.811597 waagent[2120]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Dec 12 17:40:02.811597 waagent[2120]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Dec 12 17:40:02.811597 waagent[2120]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Dec 12 17:40:02.811769 waagent[2120]: 2025-12-12T17:40:02.811697Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Dec 12 17:40:02.815169 waagent[2120]: 2025-12-12T17:40:02.814878Z INFO MonitorHandler ExtHandler Network interfaces: Dec 12 17:40:02.815169 waagent[2120]: Executing ['ip', '-a', '-o', 'link']: Dec 12 17:40:02.815169 waagent[2120]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Dec 12 17:40:02.815169 waagent[2120]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:c5:f1:7f brd ff:ff:ff:ff:ff:ff Dec 12 17:40:02.815169 waagent[2120]: 3: enP22334s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:c5:f1:7f brd ff:ff:ff:ff:ff:ff\ altname enP22334p0s2 Dec 12 17:40:02.815169 waagent[2120]: Executing ['ip', '-4', '-a', '-o', 'address']: Dec 12 17:40:02.815169 waagent[2120]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Dec 12 17:40:02.815169 waagent[2120]: 2: eth0 inet 10.200.20.11/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever Dec 12 
17:40:02.815169 waagent[2120]: Executing ['ip', '-6', '-a', '-o', 'address']: Dec 12 17:40:02.815169 waagent[2120]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Dec 12 17:40:02.815169 waagent[2120]: 2: eth0 inet6 fe80::20d:3aff:fec5:f17f/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Dec 12 17:40:09.652128 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 12 17:40:09.653874 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:40:09.813117 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:40:09.816138 (kubelet)[2269]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:40:09.899723 kubelet[2269]: E1212 17:40:09.899655 2269 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:40:09.902484 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:40:09.902757 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:40:09.903282 systemd[1]: kubelet.service: Consumed 111ms CPU time, 107.5M memory peak. Dec 12 17:40:18.627540 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 12 17:40:18.629172 systemd[1]: Started sshd@0-10.200.20.11:22-10.200.16.10:36356.service - OpenSSH per-connection server daemon (10.200.16.10:36356). 
Dec 12 17:40:19.234914 sshd[2277]: Accepted publickey for core from 10.200.16.10 port 36356 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:40:19.235993 sshd-session[2277]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:40:19.239812 systemd-logind[1877]: New session 3 of user core. Dec 12 17:40:19.246629 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 12 17:40:19.642855 systemd[1]: Started sshd@1-10.200.20.11:22-10.200.16.10:36368.service - OpenSSH per-connection server daemon (10.200.16.10:36368). Dec 12 17:40:19.913411 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 12 17:40:19.915156 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:40:20.095849 sshd[2283]: Accepted publickey for core from 10.200.16.10 port 36368 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:40:20.097435 sshd-session[2283]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:40:20.101398 systemd-logind[1877]: New session 4 of user core. Dec 12 17:40:20.108955 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 12 17:40:20.252444 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:40:20.255120 (kubelet)[2295]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:40:20.283588 kubelet[2295]: E1212 17:40:20.283536 2295 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:40:20.285907 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:40:20.286121 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:40:20.286708 systemd[1]: kubelet.service: Consumed 108ms CPU time, 106.2M memory peak. Dec 12 17:40:20.422152 sshd[2289]: Connection closed by 10.200.16.10 port 36368 Dec 12 17:40:20.422974 sshd-session[2283]: pam_unix(sshd:session): session closed for user core Dec 12 17:40:20.426237 systemd[1]: sshd@1-10.200.20.11:22-10.200.16.10:36368.service: Deactivated successfully. Dec 12 17:40:20.427663 systemd[1]: session-4.scope: Deactivated successfully. Dec 12 17:40:20.428835 systemd-logind[1877]: Session 4 logged out. Waiting for processes to exit. Dec 12 17:40:20.430037 systemd-logind[1877]: Removed session 4. Dec 12 17:40:20.527349 systemd[1]: Started sshd@2-10.200.20.11:22-10.200.16.10:40478.service - OpenSSH per-connection server daemon (10.200.16.10:40478). Dec 12 17:40:21.018036 sshd[2307]: Accepted publickey for core from 10.200.16.10 port 40478 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:40:21.019127 sshd-session[2307]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:40:21.022699 systemd-logind[1877]: New session 5 of user core. Dec 12 17:40:21.029800 systemd[1]: Started session-5.scope - Session 5 of User core. 
Dec 12 17:40:21.377830 sshd[2310]: Connection closed by 10.200.16.10 port 40478 Dec 12 17:40:21.378280 sshd-session[2307]: pam_unix(sshd:session): session closed for user core Dec 12 17:40:21.381343 systemd[1]: sshd@2-10.200.20.11:22-10.200.16.10:40478.service: Deactivated successfully. Dec 12 17:40:21.382983 systemd[1]: session-5.scope: Deactivated successfully. Dec 12 17:40:21.383655 systemd-logind[1877]: Session 5 logged out. Waiting for processes to exit. Dec 12 17:40:21.385093 systemd-logind[1877]: Removed session 5. Dec 12 17:40:21.465714 systemd[1]: Started sshd@3-10.200.20.11:22-10.200.16.10:40488.service - OpenSSH per-connection server daemon (10.200.16.10:40488). Dec 12 17:40:21.783814 chronyd[1849]: Selected source PHC0 Dec 12 17:40:21.970392 sshd[2316]: Accepted publickey for core from 10.200.16.10 port 40488 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:40:21.971486 sshd-session[2316]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:40:21.975338 systemd-logind[1877]: New session 6 of user core. Dec 12 17:40:21.986633 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 12 17:40:22.323519 sshd[2319]: Connection closed by 10.200.16.10 port 40488 Dec 12 17:40:22.324102 sshd-session[2316]: pam_unix(sshd:session): session closed for user core Dec 12 17:40:22.327459 systemd-logind[1877]: Session 6 logged out. Waiting for processes to exit. Dec 12 17:40:22.328099 systemd[1]: sshd@3-10.200.20.11:22-10.200.16.10:40488.service: Deactivated successfully. Dec 12 17:40:22.331047 systemd[1]: session-6.scope: Deactivated successfully. Dec 12 17:40:22.332665 systemd-logind[1877]: Removed session 6. Dec 12 17:40:22.404730 systemd[1]: Started sshd@4-10.200.20.11:22-10.200.16.10:40504.service - OpenSSH per-connection server daemon (10.200.16.10:40504). 
Dec 12 17:40:22.858120 sshd[2325]: Accepted publickey for core from 10.200.16.10 port 40504 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:40:22.859269 sshd-session[2325]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:40:22.863082 systemd-logind[1877]: New session 7 of user core. Dec 12 17:40:22.871631 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 12 17:40:23.256711 sudo[2329]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 12 17:40:23.256939 sudo[2329]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:40:23.285981 sudo[2329]: pam_unix(sudo:session): session closed for user root Dec 12 17:40:23.362489 sshd[2328]: Connection closed by 10.200.16.10 port 40504 Dec 12 17:40:23.361780 sshd-session[2325]: pam_unix(sshd:session): session closed for user core Dec 12 17:40:23.365425 systemd-logind[1877]: Session 7 logged out. Waiting for processes to exit. Dec 12 17:40:23.366120 systemd[1]: sshd@4-10.200.20.11:22-10.200.16.10:40504.service: Deactivated successfully. Dec 12 17:40:23.367688 systemd[1]: session-7.scope: Deactivated successfully. Dec 12 17:40:23.368980 systemd-logind[1877]: Removed session 7. Dec 12 17:40:23.456664 systemd[1]: Started sshd@5-10.200.20.11:22-10.200.16.10:40518.service - OpenSSH per-connection server daemon (10.200.16.10:40518). Dec 12 17:40:23.953328 sshd[2335]: Accepted publickey for core from 10.200.16.10 port 40518 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:40:23.954498 sshd-session[2335]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:40:23.957922 systemd-logind[1877]: New session 8 of user core. Dec 12 17:40:23.970628 systemd[1]: Started session-8.scope - Session 8 of User core. 
Dec 12 17:40:24.231194 sudo[2340]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 12 17:40:24.231845 sudo[2340]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:40:24.238417 sudo[2340]: pam_unix(sudo:session): session closed for user root Dec 12 17:40:24.243119 sudo[2339]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 12 17:40:24.243335 sudo[2339]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:40:24.251143 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 17:40:24.280444 augenrules[2362]: No rules Dec 12 17:40:24.281927 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 17:40:24.282176 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 17:40:24.283736 sudo[2339]: pam_unix(sudo:session): session closed for user root Dec 12 17:40:24.356542 sshd[2338]: Connection closed by 10.200.16.10 port 40518 Dec 12 17:40:24.356854 sshd-session[2335]: pam_unix(sshd:session): session closed for user core Dec 12 17:40:24.361693 systemd[1]: sshd@5-10.200.20.11:22-10.200.16.10:40518.service: Deactivated successfully. Dec 12 17:40:24.364975 systemd[1]: session-8.scope: Deactivated successfully. Dec 12 17:40:24.365987 systemd-logind[1877]: Session 8 logged out. Waiting for processes to exit. Dec 12 17:40:24.367173 systemd-logind[1877]: Removed session 8. Dec 12 17:40:24.447264 systemd[1]: Started sshd@6-10.200.20.11:22-10.200.16.10:40526.service - OpenSSH per-connection server daemon (10.200.16.10:40526). 
Dec 12 17:40:24.939577 sshd[2371]: Accepted publickey for core from 10.200.16.10 port 40526 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:40:24.940335 sshd-session[2371]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:40:24.944223 systemd-logind[1877]: New session 9 of user core. Dec 12 17:40:24.951624 systemd[1]: Started session-9.scope - Session 9 of User core. Dec 12 17:40:25.212590 sudo[2375]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 12 17:40:25.212803 sudo[2375]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:40:26.873358 systemd[1]: Starting docker.service - Docker Application Container Engine... Dec 12 17:40:26.881768 (dockerd)[2393]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 12 17:40:27.681098 dockerd[2393]: time="2025-12-12T17:40:27.680824898Z" level=info msg="Starting up" Dec 12 17:40:27.681727 dockerd[2393]: time="2025-12-12T17:40:27.681698954Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 12 17:40:27.689835 dockerd[2393]: time="2025-12-12T17:40:27.689804738Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 12 17:40:27.776964 dockerd[2393]: time="2025-12-12T17:40:27.776920714Z" level=info msg="Loading containers: start." Dec 12 17:40:27.790553 kernel: Initializing XFRM netlink socket Dec 12 17:40:28.076828 systemd-networkd[1495]: docker0: Link UP Dec 12 17:40:28.093133 dockerd[2393]: time="2025-12-12T17:40:28.093088170Z" level=info msg="Loading containers: done." 
Dec 12 17:40:28.112775 dockerd[2393]: time="2025-12-12T17:40:28.112729538Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 12 17:40:28.112934 dockerd[2393]: time="2025-12-12T17:40:28.112816354Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 12 17:40:28.112934 dockerd[2393]: time="2025-12-12T17:40:28.112902418Z" level=info msg="Initializing buildkit" Dec 12 17:40:28.157664 dockerd[2393]: time="2025-12-12T17:40:28.157622066Z" level=info msg="Completed buildkit initialization" Dec 12 17:40:28.163090 dockerd[2393]: time="2025-12-12T17:40:28.163049282Z" level=info msg="Daemon has completed initialization" Dec 12 17:40:28.163361 dockerd[2393]: time="2025-12-12T17:40:28.163133362Z" level=info msg="API listen on /run/docker.sock" Dec 12 17:40:28.163579 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 12 17:40:28.818048 containerd[1916]: time="2025-12-12T17:40:28.818007810Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Dec 12 17:40:29.700442 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2735894130.mount: Deactivated successfully. Dec 12 17:40:30.413433 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 12 17:40:30.414932 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:40:30.521220 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:40:30.528784 (kubelet)[2669]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:40:30.656790 kubelet[2669]: E1212 17:40:30.656741 2669 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:40:30.660279 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:40:30.660529 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:40:30.661152 systemd[1]: kubelet.service: Consumed 112ms CPU time, 105.7M memory peak. Dec 12 17:40:30.996539 containerd[1916]: time="2025-12-12T17:40:30.996180233Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:30.998968 containerd[1916]: time="2025-12-12T17:40:30.998941113Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=24571040" Dec 12 17:40:31.001906 containerd[1916]: time="2025-12-12T17:40:31.001882462Z" level=info msg="ImageCreate event name:\"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:31.006364 containerd[1916]: time="2025-12-12T17:40:31.006331186Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:31.007997 containerd[1916]: time="2025-12-12T17:40:31.007969894Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id 
\"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"24567639\" in 2.18992382s" Dec 12 17:40:31.008017 containerd[1916]: time="2025-12-12T17:40:31.008005719Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\"" Dec 12 17:40:31.010153 containerd[1916]: time="2025-12-12T17:40:31.010086113Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Dec 12 17:40:32.139341 containerd[1916]: time="2025-12-12T17:40:32.139269992Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:32.142183 containerd[1916]: time="2025-12-12T17:40:32.142154835Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=19135477" Dec 12 17:40:32.145093 containerd[1916]: time="2025-12-12T17:40:32.145069319Z" level=info msg="ImageCreate event name:\"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:32.150359 containerd[1916]: time="2025-12-12T17:40:32.150328606Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:32.151008 containerd[1916]: time="2025-12-12T17:40:32.150708010Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest 
\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"20719958\" in 1.140597712s" Dec 12 17:40:32.151008 containerd[1916]: time="2025-12-12T17:40:32.150734259Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\"" Dec 12 17:40:32.151163 containerd[1916]: time="2025-12-12T17:40:32.151138391Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Dec 12 17:40:33.183897 containerd[1916]: time="2025-12-12T17:40:33.183828185Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:33.186530 containerd[1916]: time="2025-12-12T17:40:33.186346448Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=14191716" Dec 12 17:40:33.189731 containerd[1916]: time="2025-12-12T17:40:33.189706027Z" level=info msg="ImageCreate event name:\"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:33.193545 containerd[1916]: time="2025-12-12T17:40:33.193477682Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:33.194290 containerd[1916]: time="2025-12-12T17:40:33.194101470Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"15776215\" in 1.042937053s" Dec 12 17:40:33.194290 
containerd[1916]: time="2025-12-12T17:40:33.194127975Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\"" Dec 12 17:40:33.194513 containerd[1916]: time="2025-12-12T17:40:33.194490682Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Dec 12 17:40:34.151555 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2346483668.mount: Deactivated successfully. Dec 12 17:40:34.929818 containerd[1916]: time="2025-12-12T17:40:34.929757849Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:34.932989 containerd[1916]: time="2025-12-12T17:40:34.932953014Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=22805253" Dec 12 17:40:34.936145 containerd[1916]: time="2025-12-12T17:40:34.936102297Z" level=info msg="ImageCreate event name:\"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:34.940414 containerd[1916]: time="2025-12-12T17:40:34.940353536Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:34.940794 containerd[1916]: time="2025-12-12T17:40:34.940655681Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"22804272\" in 1.746081621s" Dec 12 17:40:34.940794 containerd[1916]: time="2025-12-12T17:40:34.940680698Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\"" Dec 12 17:40:34.941164 containerd[1916]: time="2025-12-12T17:40:34.941127016Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Dec 12 17:40:35.688736 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1653635978.mount: Deactivated successfully. Dec 12 17:40:36.632536 containerd[1916]: time="2025-12-12T17:40:36.632195337Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:36.637249 containerd[1916]: time="2025-12-12T17:40:36.637201664Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=20395406" Dec 12 17:40:36.640268 containerd[1916]: time="2025-12-12T17:40:36.640240016Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:36.645341 containerd[1916]: time="2025-12-12T17:40:36.645066953Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:36.645740 containerd[1916]: time="2025-12-12T17:40:36.645712389Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.704240682s" Dec 12 17:40:36.645740 containerd[1916]: time="2025-12-12T17:40:36.645740950Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference 
\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\"" Dec 12 17:40:36.646274 containerd[1916]: time="2025-12-12T17:40:36.646233702Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Dec 12 17:40:37.195762 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount297178401.mount: Deactivated successfully. Dec 12 17:40:37.220068 containerd[1916]: time="2025-12-12T17:40:37.220016056Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:37.223004 containerd[1916]: time="2025-12-12T17:40:37.222967061Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268709" Dec 12 17:40:37.225966 containerd[1916]: time="2025-12-12T17:40:37.225938819Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:37.230659 containerd[1916]: time="2025-12-12T17:40:37.230600462Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:37.231144 containerd[1916]: time="2025-12-12T17:40:37.230876807Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 584.343816ms" Dec 12 17:40:37.231144 containerd[1916]: time="2025-12-12T17:40:37.230903712Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Dec 12 17:40:37.231451 containerd[1916]: 
time="2025-12-12T17:40:37.231412792Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Dec 12 17:40:37.861047 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3516678640.mount: Deactivated successfully. Dec 12 17:40:40.223612 containerd[1916]: time="2025-12-12T17:40:40.223554213Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:40.334371 containerd[1916]: time="2025-12-12T17:40:40.334091788Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=98062987" Dec 12 17:40:40.337574 containerd[1916]: time="2025-12-12T17:40:40.337540973Z" level=info msg="ImageCreate event name:\"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:40.342454 containerd[1916]: time="2025-12-12T17:40:40.341815250Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:40.342454 containerd[1916]: time="2025-12-12T17:40:40.342345211Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"98207481\" in 3.110907506s" Dec 12 17:40:40.342454 containerd[1916]: time="2025-12-12T17:40:40.342371812Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\"" Dec 12 17:40:40.663611 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. 
Dec 12 17:40:40.665046 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:40:40.823977 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:40:40.832790 (kubelet)[2829]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:40:40.898210 kubelet[2829]: E1212 17:40:40.898159 2829 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:40:40.901600 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:40:40.901829 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:40:40.902344 systemd[1]: kubelet.service: Consumed 103ms CPU time, 107.3M memory peak. Dec 12 17:40:41.150037 kernel: hv_balloon: Max. dynamic memory size: 4096 MB Dec 12 17:40:42.984553 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:40:42.984663 systemd[1]: kubelet.service: Consumed 103ms CPU time, 107.3M memory peak. Dec 12 17:40:42.986592 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:40:43.008421 systemd[1]: Reload requested from client PID 2848 ('systemctl') (unit session-9.scope)... Dec 12 17:40:43.008437 systemd[1]: Reloading... Dec 12 17:40:43.092676 zram_generator::config[2901]: No configuration found. Dec 12 17:40:43.235911 update_engine[1881]: I20251212 17:40:43.235541 1881 update_attempter.cc:509] Updating boot flags... Dec 12 17:40:43.245250 systemd[1]: Reloading finished in 236 ms. 
Dec 12 17:40:43.286018 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 12 17:40:43.286098 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 12 17:40:43.286317 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:40:43.286377 systemd[1]: kubelet.service: Consumed 55ms CPU time, 75.3M memory peak. Dec 12 17:40:43.289715 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:40:43.768398 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:40:43.775015 (kubelet)[2985]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 17:40:43.833933 kubelet[2985]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 17:40:43.833933 kubelet[2985]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:40:43.834590 kubelet[2985]: I1212 17:40:43.834424 2985 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 17:40:44.206933 kubelet[2985]: I1212 17:40:44.205120 2985 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 12 17:40:44.206933 kubelet[2985]: I1212 17:40:44.205154 2985 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 17:40:44.206933 kubelet[2985]: I1212 17:40:44.206518 2985 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 12 17:40:44.206933 kubelet[2985]: I1212 17:40:44.206540 2985 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Dec 12 17:40:44.206933 kubelet[2985]: I1212 17:40:44.206783 2985 server.go:956] "Client rotation is on, will bootstrap in background" Dec 12 17:40:44.218356 kubelet[2985]: E1212 17:40:44.218320 2985 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.11:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.11:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Dec 12 17:40:44.219427 kubelet[2985]: I1212 17:40:44.219403 2985 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 17:40:44.224526 kubelet[2985]: I1212 17:40:44.223991 2985 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 17:40:44.226718 kubelet[2985]: I1212 17:40:44.226700 2985 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Dec 12 17:40:44.226873 kubelet[2985]: I1212 17:40:44.226851 2985 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 17:40:44.226981 kubelet[2985]: I1212 17:40:44.226870 2985 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.2.2-a-9f5170e2ca","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 17:40:44.226981 kubelet[2985]: I1212 17:40:44.226979 2985 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 
17:40:44.227070 kubelet[2985]: I1212 17:40:44.226986 2985 container_manager_linux.go:306] "Creating device plugin manager" Dec 12 17:40:44.227087 kubelet[2985]: I1212 17:40:44.227078 2985 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 12 17:40:44.234601 kubelet[2985]: I1212 17:40:44.234577 2985 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:40:44.235634 kubelet[2985]: I1212 17:40:44.235618 2985 kubelet.go:475] "Attempting to sync node with API server" Dec 12 17:40:44.235676 kubelet[2985]: I1212 17:40:44.235638 2985 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 17:40:44.236206 kubelet[2985]: I1212 17:40:44.236114 2985 kubelet.go:387] "Adding apiserver pod source" Dec 12 17:40:44.236206 kubelet[2985]: I1212 17:40:44.236132 2985 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 17:40:44.236206 kubelet[2985]: E1212 17:40:44.236121 2985 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.11:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.2.2-a-9f5170e2ca&limit=500&resourceVersion=0\": dial tcp 10.200.20.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 12 17:40:44.236740 kubelet[2985]: E1212 17:40:44.236673 2985 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.11:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 12 17:40:44.236949 kubelet[2985]: I1212 17:40:44.236928 2985 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 17:40:44.237298 kubelet[2985]: I1212 17:40:44.237279 2985 kubelet.go:940] "Not 
starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 12 17:40:44.237298 kubelet[2985]: I1212 17:40:44.237301 2985 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 12 17:40:44.237355 kubelet[2985]: W1212 17:40:44.237331 2985 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Dec 12 17:40:44.240539 kubelet[2985]: I1212 17:40:44.240522 2985 server.go:1262] "Started kubelet" Dec 12 17:40:44.240696 kubelet[2985]: I1212 17:40:44.240663 2985 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 12 17:40:44.241277 kubelet[2985]: I1212 17:40:44.241261 2985 server.go:310] "Adding debug handlers to kubelet server" Dec 12 17:40:44.242914 kubelet[2985]: I1212 17:40:44.242860 2985 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 17:40:44.242984 kubelet[2985]: I1212 17:40:44.242919 2985 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 12 17:40:44.243159 kubelet[2985]: I1212 17:40:44.243138 2985 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 17:40:44.244448 kubelet[2985]: E1212 17:40:44.243247 2985 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.11:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.11:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459.2.2-a-9f5170e2ca.18808898e826a1ce default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459.2.2-a-9f5170e2ca,UID:ci-4459.2.2-a-9f5170e2ca,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459.2.2-a-9f5170e2ca,},FirstTimestamp:2025-12-12 17:40:44.240486862 +0000 UTC m=+0.457991036,LastTimestamp:2025-12-12 17:40:44.240486862 +0000 UTC m=+0.457991036,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459.2.2-a-9f5170e2ca,}" Dec 12 17:40:44.245883 kubelet[2985]: E1212 17:40:44.245863 2985 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 17:40:44.246418 kubelet[2985]: I1212 17:40:44.246263 2985 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 17:40:44.246644 kubelet[2985]: I1212 17:40:44.246626 2985 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 17:40:44.248603 kubelet[2985]: E1212 17:40:44.248580 2985 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.2-a-9f5170e2ca\" not found" Dec 12 17:40:44.248725 kubelet[2985]: I1212 17:40:44.248715 2985 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 12 17:40:44.249093 kubelet[2985]: I1212 17:40:44.249076 2985 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 12 17:40:44.249188 kubelet[2985]: I1212 17:40:44.249179 2985 reconciler.go:29] "Reconciler: start to sync state" Dec 12 17:40:44.249564 kubelet[2985]: E1212 17:40:44.249545 2985 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.11:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 12 17:40:44.250191 kubelet[2985]: I1212 17:40:44.250169 2985 factory.go:223] Registration of 
the systemd container factory successfully Dec 12 17:40:44.250515 kubelet[2985]: I1212 17:40:44.250396 2985 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 17:40:44.251767 kubelet[2985]: E1212 17:40:44.251743 2985 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.11:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.2-a-9f5170e2ca?timeout=10s\": dial tcp 10.200.20.11:6443: connect: connection refused" interval="200ms" Dec 12 17:40:44.251944 kubelet[2985]: I1212 17:40:44.251931 2985 factory.go:223] Registration of the containerd container factory successfully Dec 12 17:40:44.279224 kubelet[2985]: I1212 17:40:44.278830 2985 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 12 17:40:44.280333 kubelet[2985]: I1212 17:40:44.280300 2985 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Dec 12 17:40:44.280333 kubelet[2985]: I1212 17:40:44.280322 2985 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 12 17:40:44.280428 kubelet[2985]: I1212 17:40:44.280356 2985 kubelet.go:2427] "Starting kubelet main sync loop" Dec 12 17:40:44.280428 kubelet[2985]: E1212 17:40:44.280389 2985 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 17:40:44.281763 kubelet[2985]: E1212 17:40:44.281721 2985 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.11:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 12 17:40:44.283876 kubelet[2985]: I1212 17:40:44.283859 2985 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 17:40:44.283876 kubelet[2985]: I1212 17:40:44.283871 2985 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 17:40:44.283957 kubelet[2985]: I1212 17:40:44.283886 2985 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:40:44.289636 kubelet[2985]: I1212 17:40:44.289612 2985 policy_none.go:49] "None policy: Start" Dec 12 17:40:44.289636 kubelet[2985]: I1212 17:40:44.289634 2985 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 12 17:40:44.289730 kubelet[2985]: I1212 17:40:44.289645 2985 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 12 17:40:44.294205 kubelet[2985]: I1212 17:40:44.294179 2985 policy_none.go:47] "Start" Dec 12 17:40:44.298632 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 12 17:40:44.306188 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Dec 12 17:40:44.308697 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 12 17:40:44.319519 kubelet[2985]: E1212 17:40:44.319378 2985 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 12 17:40:44.319709 kubelet[2985]: I1212 17:40:44.319695 2985 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 17:40:44.319912 kubelet[2985]: I1212 17:40:44.319881 2985 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 17:40:44.321255 kubelet[2985]: E1212 17:40:44.321238 2985 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 12 17:40:44.321357 kubelet[2985]: E1212 17:40:44.321347 2985 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459.2.2-a-9f5170e2ca\" not found" Dec 12 17:40:44.321494 kubelet[2985]: I1212 17:40:44.321471 2985 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 17:40:44.392561 systemd[1]: Created slice kubepods-burstable-pode101b43f11f6291e032d2e63399f3a96.slice - libcontainer container kubepods-burstable-pode101b43f11f6291e032d2e63399f3a96.slice. Dec 12 17:40:44.399077 kubelet[2985]: E1212 17:40:44.399040 2985 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-9f5170e2ca\" not found" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.402925 systemd[1]: Created slice kubepods-burstable-pod51167a2a389bcebab9dd3134edbdb5e2.slice - libcontainer container kubepods-burstable-pod51167a2a389bcebab9dd3134edbdb5e2.slice. 
Dec 12 17:40:44.404914 kubelet[2985]: E1212 17:40:44.404887 2985 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-9f5170e2ca\" not found" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.415037 systemd[1]: Created slice kubepods-burstable-pod1b60b9cdb3818de3265058ced8148ea1.slice - libcontainer container kubepods-burstable-pod1b60b9cdb3818de3265058ced8148ea1.slice. Dec 12 17:40:44.416641 kubelet[2985]: E1212 17:40:44.416616 2985 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-9f5170e2ca\" not found" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.422548 kubelet[2985]: I1212 17:40:44.422282 2985 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.422750 kubelet[2985]: E1212 17:40:44.422730 2985 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.11:6443/api/v1/nodes\": dial tcp 10.200.20.11:6443: connect: connection refused" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.450099 kubelet[2985]: I1212 17:40:44.450068 2985 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e101b43f11f6291e032d2e63399f3a96-ca-certs\") pod \"kube-apiserver-ci-4459.2.2-a-9f5170e2ca\" (UID: \"e101b43f11f6291e032d2e63399f3a96\") " pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.450234 kubelet[2985]: I1212 17:40:44.450222 2985 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e101b43f11f6291e032d2e63399f3a96-k8s-certs\") pod \"kube-apiserver-ci-4459.2.2-a-9f5170e2ca\" (UID: \"e101b43f11f6291e032d2e63399f3a96\") " pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.450473 kubelet[2985]: I1212 17:40:44.450338 
2985 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e101b43f11f6291e032d2e63399f3a96-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.2-a-9f5170e2ca\" (UID: \"e101b43f11f6291e032d2e63399f3a96\") " pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.450473 kubelet[2985]: I1212 17:40:44.450362 2985 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/51167a2a389bcebab9dd3134edbdb5e2-k8s-certs\") pod \"kube-controller-manager-ci-4459.2.2-a-9f5170e2ca\" (UID: \"51167a2a389bcebab9dd3134edbdb5e2\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.450473 kubelet[2985]: I1212 17:40:44.450373 2985 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/51167a2a389bcebab9dd3134edbdb5e2-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.2.2-a-9f5170e2ca\" (UID: \"51167a2a389bcebab9dd3134edbdb5e2\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.450473 kubelet[2985]: I1212 17:40:44.450383 2985 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/51167a2a389bcebab9dd3134edbdb5e2-ca-certs\") pod \"kube-controller-manager-ci-4459.2.2-a-9f5170e2ca\" (UID: \"51167a2a389bcebab9dd3134edbdb5e2\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.450473 kubelet[2985]: I1212 17:40:44.450394 2985 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/51167a2a389bcebab9dd3134edbdb5e2-flexvolume-dir\") pod 
\"kube-controller-manager-ci-4459.2.2-a-9f5170e2ca\" (UID: \"51167a2a389bcebab9dd3134edbdb5e2\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.450640 kubelet[2985]: I1212 17:40:44.450403 2985 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/51167a2a389bcebab9dd3134edbdb5e2-kubeconfig\") pod \"kube-controller-manager-ci-4459.2.2-a-9f5170e2ca\" (UID: \"51167a2a389bcebab9dd3134edbdb5e2\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.450640 kubelet[2985]: I1212 17:40:44.450445 2985 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1b60b9cdb3818de3265058ced8148ea1-kubeconfig\") pod \"kube-scheduler-ci-4459.2.2-a-9f5170e2ca\" (UID: \"1b60b9cdb3818de3265058ced8148ea1\") " pod="kube-system/kube-scheduler-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.452475 kubelet[2985]: E1212 17:40:44.452442 2985 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.11:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.2-a-9f5170e2ca?timeout=10s\": dial tcp 10.200.20.11:6443: connect: connection refused" interval="400ms" Dec 12 17:40:44.624588 kubelet[2985]: I1212 17:40:44.624478 2985 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.624931 kubelet[2985]: E1212 17:40:44.624880 2985 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.11:6443/api/v1/nodes\": dial tcp 10.200.20.11:6443: connect: connection refused" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:44.707524 containerd[1916]: time="2025-12-12T17:40:44.707006065Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.2-a-9f5170e2ca,Uid:e101b43f11f6291e032d2e63399f3a96,Namespace:kube-system,Attempt:0,}" Dec 12 17:40:44.711606 containerd[1916]: time="2025-12-12T17:40:44.711575471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.2-a-9f5170e2ca,Uid:51167a2a389bcebab9dd3134edbdb5e2,Namespace:kube-system,Attempt:0,}" Dec 12 17:40:44.722256 containerd[1916]: time="2025-12-12T17:40:44.722224813Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.2-a-9f5170e2ca,Uid:1b60b9cdb3818de3265058ced8148ea1,Namespace:kube-system,Attempt:0,}" Dec 12 17:40:44.853757 kubelet[2985]: E1212 17:40:44.853712 2985 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.11:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.2-a-9f5170e2ca?timeout=10s\": dial tcp 10.200.20.11:6443: connect: connection refused" interval="800ms" Dec 12 17:40:45.026791 kubelet[2985]: I1212 17:40:45.026759 2985 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:45.027080 kubelet[2985]: E1212 17:40:45.027057 2985 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.11:6443/api/v1/nodes\": dial tcp 10.200.20.11:6443: connect: connection refused" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:45.230722 kubelet[2985]: E1212 17:40:45.230685 2985 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.11:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 12 17:40:45.412024 kubelet[2985]: E1212 17:40:45.411904 2985 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get 
\"https://10.200.20.11:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 12 17:40:45.460248 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2417312391.mount: Deactivated successfully. Dec 12 17:40:45.654638 kubelet[2985]: E1212 17:40:45.654599 2985 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.11:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.2-a-9f5170e2ca?timeout=10s\": dial tcp 10.200.20.11:6443: connect: connection refused" interval="1.6s" Dec 12 17:40:45.771964 kubelet[2985]: E1212 17:40:45.771924 2985 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.11:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 12 17:40:45.829334 kubelet[2985]: I1212 17:40:45.829297 2985 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:45.829623 kubelet[2985]: E1212 17:40:45.829599 2985 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.11:6443/api/v1/nodes\": dial tcp 10.200.20.11:6443: connect: connection refused" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:45.834169 kubelet[2985]: E1212 17:40:45.834133 2985 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.11:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.2.2-a-9f5170e2ca&limit=500&resourceVersion=0\": dial tcp 10.200.20.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 12 17:40:46.326222 
kubelet[2985]: E1212 17:40:46.326184 2985 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.11:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.11:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Dec 12 17:40:46.943035 containerd[1916]: time="2025-12-12T17:40:46.942533399Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:40:46.953630 containerd[1916]: time="2025-12-12T17:40:46.953585316Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" Dec 12 17:40:46.956967 containerd[1916]: time="2025-12-12T17:40:46.956568546Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:40:46.959544 containerd[1916]: time="2025-12-12T17:40:46.959491142Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:40:46.965470 containerd[1916]: time="2025-12-12T17:40:46.965339383Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 12 17:40:46.968536 containerd[1916]: time="2025-12-12T17:40:46.968190289Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:40:46.972532 containerd[1916]: time="2025-12-12T17:40:46.971755514Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 17:40:46.972532 containerd[1916]: time="2025-12-12T17:40:46.972197680Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 2.261108929s" Dec 12 17:40:46.974795 containerd[1916]: time="2025-12-12T17:40:46.974765321Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 12 17:40:46.979390 containerd[1916]: time="2025-12-12T17:40:46.979352466Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 2.262497366s" Dec 12 17:40:47.011047 containerd[1916]: time="2025-12-12T17:40:47.010992313Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 2.285833692s" Dec 12 17:40:47.026580 containerd[1916]: time="2025-12-12T17:40:47.026025116Z" level=info msg="connecting to shim 6245fd0c36b2a2c6408ae234c41fe0266015ede325a06539ac64c9dbace897fa" address="unix:///run/containerd/s/8b54c8cd3f8e57e3a3a05c32438793ebe077afb273a7e5486d49a3164525c4be" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:47.046897 
containerd[1916]: time="2025-12-12T17:40:47.046855358Z" level=info msg="connecting to shim 075554fff1a81e3c179ae822e17407a91ea77d77aaa45ca3d2901cdf495dadd8" address="unix:///run/containerd/s/8ea70cd40d1b58cc508295b6a6348392cd49e4c66dfe77de47fead3016094f80" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:47.047654 systemd[1]: Started cri-containerd-6245fd0c36b2a2c6408ae234c41fe0266015ede325a06539ac64c9dbace897fa.scope - libcontainer container 6245fd0c36b2a2c6408ae234c41fe0266015ede325a06539ac64c9dbace897fa. Dec 12 17:40:47.073105 systemd[1]: Started cri-containerd-075554fff1a81e3c179ae822e17407a91ea77d77aaa45ca3d2901cdf495dadd8.scope - libcontainer container 075554fff1a81e3c179ae822e17407a91ea77d77aaa45ca3d2901cdf495dadd8. Dec 12 17:40:47.076959 containerd[1916]: time="2025-12-12T17:40:47.076856378Z" level=info msg="connecting to shim f6ac430c723782f1a8136399fe5b4104a976635615320d4ae9b85d9a3106a787" address="unix:///run/containerd/s/f09df21b98b5b999a9d2080e59b9cc646672ecf5a5f1a543d0bfaaa40567fcb0" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:47.099790 systemd[1]: Started cri-containerd-f6ac430c723782f1a8136399fe5b4104a976635615320d4ae9b85d9a3106a787.scope - libcontainer container f6ac430c723782f1a8136399fe5b4104a976635615320d4ae9b85d9a3106a787. 
Dec 12 17:40:47.130424 containerd[1916]: time="2025-12-12T17:40:47.130385005Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.2-a-9f5170e2ca,Uid:51167a2a389bcebab9dd3134edbdb5e2,Namespace:kube-system,Attempt:0,} returns sandbox id \"075554fff1a81e3c179ae822e17407a91ea77d77aaa45ca3d2901cdf495dadd8\"" Dec 12 17:40:47.134061 containerd[1916]: time="2025-12-12T17:40:47.134020576Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.2-a-9f5170e2ca,Uid:e101b43f11f6291e032d2e63399f3a96,Namespace:kube-system,Attempt:0,} returns sandbox id \"6245fd0c36b2a2c6408ae234c41fe0266015ede325a06539ac64c9dbace897fa\"" Dec 12 17:40:47.138847 containerd[1916]: time="2025-12-12T17:40:47.138809767Z" level=info msg="CreateContainer within sandbox \"075554fff1a81e3c179ae822e17407a91ea77d77aaa45ca3d2901cdf495dadd8\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 12 17:40:47.143601 containerd[1916]: time="2025-12-12T17:40:47.143550013Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.2-a-9f5170e2ca,Uid:1b60b9cdb3818de3265058ced8148ea1,Namespace:kube-system,Attempt:0,} returns sandbox id \"f6ac430c723782f1a8136399fe5b4104a976635615320d4ae9b85d9a3106a787\"" Dec 12 17:40:47.144003 containerd[1916]: time="2025-12-12T17:40:47.143802733Z" level=info msg="CreateContainer within sandbox \"6245fd0c36b2a2c6408ae234c41fe0266015ede325a06539ac64c9dbace897fa\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 12 17:40:47.150863 containerd[1916]: time="2025-12-12T17:40:47.150832867Z" level=info msg="CreateContainer within sandbox \"f6ac430c723782f1a8136399fe5b4104a976635615320d4ae9b85d9a3106a787\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 12 17:40:47.171041 containerd[1916]: time="2025-12-12T17:40:47.170991656Z" level=info msg="Container f8859b5fc30d08e34f0176a235a246126f19e1844299c51f090ba10d3e3545f0: CDI devices 
from CRI Config.CDIDevices: []" Dec 12 17:40:47.177210 containerd[1916]: time="2025-12-12T17:40:47.177171475Z" level=info msg="Container 77947d4e3a2852ab7a42c35446df173d661909dc2414bb06ab0fbe1e67810238: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:40:47.182710 containerd[1916]: time="2025-12-12T17:40:47.182677353Z" level=info msg="Container 3371947a5c96e5b7fb163329def38452ea9f29cb39fd0c695a628860191adb6c: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:40:47.223202 containerd[1916]: time="2025-12-12T17:40:47.222476683Z" level=info msg="CreateContainer within sandbox \"075554fff1a81e3c179ae822e17407a91ea77d77aaa45ca3d2901cdf495dadd8\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"f8859b5fc30d08e34f0176a235a246126f19e1844299c51f090ba10d3e3545f0\"" Dec 12 17:40:47.223915 containerd[1916]: time="2025-12-12T17:40:47.223824405Z" level=info msg="StartContainer for \"f8859b5fc30d08e34f0176a235a246126f19e1844299c51f090ba10d3e3545f0\"" Dec 12 17:40:47.224698 containerd[1916]: time="2025-12-12T17:40:47.224675880Z" level=info msg="CreateContainer within sandbox \"6245fd0c36b2a2c6408ae234c41fe0266015ede325a06539ac64c9dbace897fa\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"77947d4e3a2852ab7a42c35446df173d661909dc2414bb06ab0fbe1e67810238\"" Dec 12 17:40:47.225178 containerd[1916]: time="2025-12-12T17:40:47.225154735Z" level=info msg="connecting to shim f8859b5fc30d08e34f0176a235a246126f19e1844299c51f090ba10d3e3545f0" address="unix:///run/containerd/s/8ea70cd40d1b58cc508295b6a6348392cd49e4c66dfe77de47fead3016094f80" protocol=ttrpc version=3 Dec 12 17:40:47.225574 containerd[1916]: time="2025-12-12T17:40:47.225541428Z" level=info msg="StartContainer for \"77947d4e3a2852ab7a42c35446df173d661909dc2414bb06ab0fbe1e67810238\"" Dec 12 17:40:47.228070 containerd[1916]: time="2025-12-12T17:40:47.227675711Z" level=info msg="connecting to shim 
77947d4e3a2852ab7a42c35446df173d661909dc2414bb06ab0fbe1e67810238" address="unix:///run/containerd/s/8b54c8cd3f8e57e3a3a05c32438793ebe077afb273a7e5486d49a3164525c4be" protocol=ttrpc version=3 Dec 12 17:40:47.230290 containerd[1916]: time="2025-12-12T17:40:47.230261865Z" level=info msg="CreateContainer within sandbox \"f6ac430c723782f1a8136399fe5b4104a976635615320d4ae9b85d9a3106a787\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3371947a5c96e5b7fb163329def38452ea9f29cb39fd0c695a628860191adb6c\"" Dec 12 17:40:47.232944 containerd[1916]: time="2025-12-12T17:40:47.232923317Z" level=info msg="StartContainer for \"3371947a5c96e5b7fb163329def38452ea9f29cb39fd0c695a628860191adb6c\"" Dec 12 17:40:47.236777 containerd[1916]: time="2025-12-12T17:40:47.236725597Z" level=info msg="connecting to shim 3371947a5c96e5b7fb163329def38452ea9f29cb39fd0c695a628860191adb6c" address="unix:///run/containerd/s/f09df21b98b5b999a9d2080e59b9cc646672ecf5a5f1a543d0bfaaa40567fcb0" protocol=ttrpc version=3 Dec 12 17:40:47.251645 systemd[1]: Started cri-containerd-77947d4e3a2852ab7a42c35446df173d661909dc2414bb06ab0fbe1e67810238.scope - libcontainer container 77947d4e3a2852ab7a42c35446df173d661909dc2414bb06ab0fbe1e67810238. Dec 12 17:40:47.252427 systemd[1]: Started cri-containerd-f8859b5fc30d08e34f0176a235a246126f19e1844299c51f090ba10d3e3545f0.scope - libcontainer container f8859b5fc30d08e34f0176a235a246126f19e1844299c51f090ba10d3e3545f0. Dec 12 17:40:47.256752 systemd[1]: Started cri-containerd-3371947a5c96e5b7fb163329def38452ea9f29cb39fd0c695a628860191adb6c.scope - libcontainer container 3371947a5c96e5b7fb163329def38452ea9f29cb39fd0c695a628860191adb6c. 
Dec 12 17:40:47.258046 kubelet[2985]: E1212 17:40:47.258004 2985 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.11:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.2-a-9f5170e2ca?timeout=10s\": dial tcp 10.200.20.11:6443: connect: connection refused" interval="3.2s" Dec 12 17:40:47.330457 containerd[1916]: time="2025-12-12T17:40:47.329981560Z" level=info msg="StartContainer for \"f8859b5fc30d08e34f0176a235a246126f19e1844299c51f090ba10d3e3545f0\" returns successfully" Dec 12 17:40:47.335718 containerd[1916]: time="2025-12-12T17:40:47.335678772Z" level=info msg="StartContainer for \"3371947a5c96e5b7fb163329def38452ea9f29cb39fd0c695a628860191adb6c\" returns successfully" Dec 12 17:40:47.337036 containerd[1916]: time="2025-12-12T17:40:47.337009310Z" level=info msg="StartContainer for \"77947d4e3a2852ab7a42c35446df173d661909dc2414bb06ab0fbe1e67810238\" returns successfully" Dec 12 17:40:47.431869 kubelet[2985]: I1212 17:40:47.431840 2985 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:48.323674 kubelet[2985]: E1212 17:40:48.323645 2985 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-9f5170e2ca\" not found" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:48.326036 kubelet[2985]: E1212 17:40:48.325995 2985 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-9f5170e2ca\" not found" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:48.327253 kubelet[2985]: E1212 17:40:48.327227 2985 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-9f5170e2ca\" not found" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:48.646769 kubelet[2985]: I1212 17:40:48.646432 2985 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.2.2-a-9f5170e2ca" 
Dec 12 17:40:48.646769 kubelet[2985]: E1212 17:40:48.646473 2985 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4459.2.2-a-9f5170e2ca\": node \"ci-4459.2.2-a-9f5170e2ca\" not found" Dec 12 17:40:48.665579 kubelet[2985]: E1212 17:40:48.665546 2985 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.2-a-9f5170e2ca\" not found" Dec 12 17:40:48.766619 kubelet[2985]: E1212 17:40:48.766576 2985 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.2-a-9f5170e2ca\" not found" Dec 12 17:40:48.867344 kubelet[2985]: E1212 17:40:48.867297 2985 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459.2.2-a-9f5170e2ca\" not found" Dec 12 17:40:48.952028 kubelet[2985]: I1212 17:40:48.951989 2985 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:48.956418 kubelet[2985]: E1212 17:40:48.956391 2985 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.2.2-a-9f5170e2ca\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:48.956455 kubelet[2985]: I1212 17:40:48.956420 2985 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:48.957657 kubelet[2985]: E1212 17:40:48.957635 2985 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.2-a-9f5170e2ca\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:48.957696 kubelet[2985]: I1212 17:40:48.957658 2985 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:48.959070 kubelet[2985]: 
E1212 17:40:48.959050 2985 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.2-a-9f5170e2ca\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:49.239861 kubelet[2985]: I1212 17:40:49.239740 2985 apiserver.go:52] "Watching apiserver" Dec 12 17:40:49.250163 kubelet[2985]: I1212 17:40:49.250102 2985 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 12 17:40:49.328583 kubelet[2985]: I1212 17:40:49.328132 2985 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:49.329436 kubelet[2985]: I1212 17:40:49.328356 2985 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:49.329620 kubelet[2985]: I1212 17:40:49.328483 2985 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:49.333337 kubelet[2985]: E1212 17:40:49.333297 2985 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.2-a-9f5170e2ca\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:49.333693 kubelet[2985]: E1212 17:40:49.333421 2985 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.2.2-a-9f5170e2ca\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:49.333693 kubelet[2985]: E1212 17:40:49.333606 2985 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.2-a-9f5170e2ca\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459.2.2-a-9f5170e2ca" Dec 12 
17:40:50.331253 kubelet[2985]: I1212 17:40:50.331018 2985 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:50.331827 kubelet[2985]: I1212 17:40:50.331697 2985 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:50.336901 kubelet[2985]: I1212 17:40:50.336875 2985 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Dec 12 17:40:50.340404 kubelet[2985]: I1212 17:40:50.339966 2985 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Dec 12 17:40:50.712261 systemd[1]: Reload requested from client PID 3416 ('systemctl') (unit session-9.scope)... Dec 12 17:40:50.712274 systemd[1]: Reloading... Dec 12 17:40:50.797601 zram_generator::config[3463]: No configuration found. Dec 12 17:40:50.964273 systemd[1]: Reloading finished in 251 ms. Dec 12 17:40:50.989125 kubelet[2985]: I1212 17:40:50.989014 2985 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 17:40:50.989510 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:40:51.002377 systemd[1]: kubelet.service: Deactivated successfully. Dec 12 17:40:51.002825 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:40:51.002941 systemd[1]: kubelet.service: Consumed 588ms CPU time, 121M memory peak. Dec 12 17:40:51.005830 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:40:51.247606 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:40:51.258162 (kubelet)[3527]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 17:40:51.295949 kubelet[3527]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 17:40:51.296287 kubelet[3527]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:40:51.296418 kubelet[3527]: I1212 17:40:51.296391 3527 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 17:40:51.302028 kubelet[3527]: I1212 17:40:51.301990 3527 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 12 17:40:51.302028 kubelet[3527]: I1212 17:40:51.302015 3527 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 17:40:51.302143 kubelet[3527]: I1212 17:40:51.302043 3527 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 12 17:40:51.302143 kubelet[3527]: I1212 17:40:51.302048 3527 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Dec 12 17:40:51.302250 kubelet[3527]: I1212 17:40:51.302231 3527 server.go:956] "Client rotation is on, will bootstrap in background" Dec 12 17:40:51.303284 kubelet[3527]: I1212 17:40:51.303263 3527 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Dec 12 17:40:51.306741 kubelet[3527]: I1212 17:40:51.306616 3527 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 17:40:51.316925 kubelet[3527]: I1212 17:40:51.316896 3527 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 17:40:51.320047 kubelet[3527]: I1212 17:40:51.320025 3527 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Dec 12 17:40:51.320226 kubelet[3527]: I1212 17:40:51.320203 3527 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 17:40:51.320343 kubelet[3527]: I1212 17:40:51.320226 3527 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ci-4459.2.2-a-9f5170e2ca","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 17:40:51.320416 kubelet[3527]: I1212 17:40:51.320343 3527 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 17:40:51.320416 kubelet[3527]: I1212 17:40:51.320350 3527 container_manager_linux.go:306] "Creating device plugin manager" Dec 12 17:40:51.320416 kubelet[3527]: I1212 17:40:51.320370 3527 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 12 17:40:51.322204 kubelet[3527]: I1212 17:40:51.322181 3527 
state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:40:51.322484 kubelet[3527]: I1212 17:40:51.322467 3527 kubelet.go:475] "Attempting to sync node with API server" Dec 12 17:40:51.322484 kubelet[3527]: I1212 17:40:51.322484 3527 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 17:40:51.322977 kubelet[3527]: I1212 17:40:51.322957 3527 kubelet.go:387] "Adding apiserver pod source" Dec 12 17:40:51.323008 kubelet[3527]: I1212 17:40:51.322985 3527 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 17:40:51.330386 kubelet[3527]: I1212 17:40:51.330302 3527 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 17:40:51.331516 kubelet[3527]: I1212 17:40:51.331454 3527 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 12 17:40:51.331516 kubelet[3527]: I1212 17:40:51.331483 3527 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 12 17:40:51.336521 kubelet[3527]: I1212 17:40:51.335765 3527 server.go:1262] "Started kubelet" Dec 12 17:40:51.336521 kubelet[3527]: I1212 17:40:51.335957 3527 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 12 17:40:51.336521 kubelet[3527]: I1212 17:40:51.336001 3527 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 17:40:51.336521 kubelet[3527]: I1212 17:40:51.336048 3527 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 12 17:40:51.336521 kubelet[3527]: I1212 17:40:51.336230 3527 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 17:40:51.336721 kubelet[3527]: I1212 17:40:51.336707 3527 server.go:310] "Adding debug 
handlers to kubelet server" Dec 12 17:40:51.337764 kubelet[3527]: I1212 17:40:51.337751 3527 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 17:40:51.339712 kubelet[3527]: I1212 17:40:51.339694 3527 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 17:40:51.344052 kubelet[3527]: I1212 17:40:51.344025 3527 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 12 17:40:51.344130 kubelet[3527]: I1212 17:40:51.344115 3527 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 12 17:40:51.344224 kubelet[3527]: I1212 17:40:51.344208 3527 reconciler.go:29] "Reconciler: start to sync state" Dec 12 17:40:51.345814 kubelet[3527]: E1212 17:40:51.345789 3527 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 17:40:51.346366 kubelet[3527]: I1212 17:40:51.346333 3527 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 17:40:51.348625 kubelet[3527]: I1212 17:40:51.348588 3527 factory.go:223] Registration of the containerd container factory successfully Dec 12 17:40:51.348625 kubelet[3527]: I1212 17:40:51.348607 3527 factory.go:223] Registration of the systemd container factory successfully Dec 12 17:40:51.350759 kubelet[3527]: I1212 17:40:51.350729 3527 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 12 17:40:51.351701 kubelet[3527]: I1212 17:40:51.351685 3527 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Dec 12 17:40:51.351785 kubelet[3527]: I1212 17:40:51.351776 3527 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 12 17:40:51.351850 kubelet[3527]: I1212 17:40:51.351843 3527 kubelet.go:2427] "Starting kubelet main sync loop" Dec 12 17:40:51.351944 kubelet[3527]: E1212 17:40:51.351923 3527 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 17:40:51.395079 kubelet[3527]: I1212 17:40:51.395048 3527 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 17:40:51.395079 kubelet[3527]: I1212 17:40:51.395070 3527 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 17:40:51.395079 kubelet[3527]: I1212 17:40:51.395090 3527 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:40:51.395255 kubelet[3527]: I1212 17:40:51.395240 3527 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 12 17:40:51.395274 kubelet[3527]: I1212 17:40:51.395253 3527 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 12 17:40:51.395274 kubelet[3527]: I1212 17:40:51.395268 3527 policy_none.go:49] "None policy: Start" Dec 12 17:40:51.395274 kubelet[3527]: I1212 17:40:51.395275 3527 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 12 17:40:51.395324 kubelet[3527]: I1212 17:40:51.395283 3527 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 12 17:40:51.395376 kubelet[3527]: I1212 17:40:51.395363 3527 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Dec 12 17:40:51.395376 kubelet[3527]: I1212 17:40:51.395375 3527 policy_none.go:47] "Start" Dec 12 17:40:51.399412 kubelet[3527]: E1212 17:40:51.399385 3527 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 12 17:40:51.400718 kubelet[3527]: I1212 17:40:51.400697 
3527 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 17:40:51.400758 kubelet[3527]: I1212 17:40:51.400719 3527 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 17:40:51.402538 kubelet[3527]: I1212 17:40:51.402264 3527 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 17:40:51.403254 kubelet[3527]: E1212 17:40:51.403233 3527 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 12 17:40:51.453280 kubelet[3527]: I1212 17:40:51.453238 3527 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.453649 kubelet[3527]: I1212 17:40:51.453620 3527 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.453949 kubelet[3527]: I1212 17:40:51.453886 3527 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.460550 kubelet[3527]: I1212 17:40:51.460531 3527 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Dec 12 17:40:51.465087 kubelet[3527]: I1212 17:40:51.465062 3527 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Dec 12 17:40:51.465176 kubelet[3527]: E1212 17:40:51.465151 3527 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.2-a-9f5170e2ca\" already exists" pod="kube-system/kube-scheduler-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.465432 kubelet[3527]: I1212 17:40:51.465217 3527 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which 
can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Dec 12 17:40:51.465432 kubelet[3527]: E1212 17:40:51.465237 3527 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.2-a-9f5170e2ca\" already exists" pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.507359 kubelet[3527]: I1212 17:40:51.506471 3527 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.518994 kubelet[3527]: I1212 17:40:51.518932 3527 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.519980 kubelet[3527]: I1212 17:40:51.519269 3527 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.645127 kubelet[3527]: I1212 17:40:51.645079 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1b60b9cdb3818de3265058ced8148ea1-kubeconfig\") pod \"kube-scheduler-ci-4459.2.2-a-9f5170e2ca\" (UID: \"1b60b9cdb3818de3265058ced8148ea1\") " pod="kube-system/kube-scheduler-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.645459 kubelet[3527]: I1212 17:40:51.645108 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/51167a2a389bcebab9dd3134edbdb5e2-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.2.2-a-9f5170e2ca\" (UID: \"51167a2a389bcebab9dd3134edbdb5e2\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.645459 kubelet[3527]: I1212 17:40:51.645247 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/51167a2a389bcebab9dd3134edbdb5e2-k8s-certs\") pod \"kube-controller-manager-ci-4459.2.2-a-9f5170e2ca\" (UID: 
\"51167a2a389bcebab9dd3134edbdb5e2\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.645459 kubelet[3527]: I1212 17:40:51.645258 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/51167a2a389bcebab9dd3134edbdb5e2-kubeconfig\") pod \"kube-controller-manager-ci-4459.2.2-a-9f5170e2ca\" (UID: \"51167a2a389bcebab9dd3134edbdb5e2\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.645459 kubelet[3527]: I1212 17:40:51.645267 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e101b43f11f6291e032d2e63399f3a96-ca-certs\") pod \"kube-apiserver-ci-4459.2.2-a-9f5170e2ca\" (UID: \"e101b43f11f6291e032d2e63399f3a96\") " pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.645826 kubelet[3527]: I1212 17:40:51.645542 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e101b43f11f6291e032d2e63399f3a96-k8s-certs\") pod \"kube-apiserver-ci-4459.2.2-a-9f5170e2ca\" (UID: \"e101b43f11f6291e032d2e63399f3a96\") " pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.645826 kubelet[3527]: I1212 17:40:51.645562 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e101b43f11f6291e032d2e63399f3a96-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.2-a-9f5170e2ca\" (UID: \"e101b43f11f6291e032d2e63399f3a96\") " pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.645826 kubelet[3527]: I1212 17:40:51.645575 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/51167a2a389bcebab9dd3134edbdb5e2-ca-certs\") pod \"kube-controller-manager-ci-4459.2.2-a-9f5170e2ca\" (UID: \"51167a2a389bcebab9dd3134edbdb5e2\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:51.645826 kubelet[3527]: I1212 17:40:51.645587 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/51167a2a389bcebab9dd3134edbdb5e2-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.2.2-a-9f5170e2ca\" (UID: \"51167a2a389bcebab9dd3134edbdb5e2\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" Dec 12 17:40:52.324477 kubelet[3527]: I1212 17:40:52.324201 3527 apiserver.go:52] "Watching apiserver" Dec 12 17:40:52.345010 kubelet[3527]: I1212 17:40:52.344970 3527 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 12 17:40:52.455513 kubelet[3527]: I1212 17:40:52.455433 3527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-9f5170e2ca" podStartSLOduration=1.455420946 podStartE2EDuration="1.455420946s" podCreationTimestamp="2025-12-12 17:40:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:40:52.455242164 +0000 UTC m=+1.193473129" watchObservedRunningTime="2025-12-12 17:40:52.455420946 +0000 UTC m=+1.193651919" Dec 12 17:40:52.466228 kubelet[3527]: I1212 17:40:52.466150 3527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459.2.2-a-9f5170e2ca" podStartSLOduration=2.466102044 podStartE2EDuration="2.466102044s" podCreationTimestamp="2025-12-12 17:40:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 
17:40:52.466036105 +0000 UTC m=+1.204267070" watchObservedRunningTime="2025-12-12 17:40:52.466102044 +0000 UTC m=+1.204333009" Dec 12 17:40:52.475864 kubelet[3527]: I1212 17:40:52.475811 3527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459.2.2-a-9f5170e2ca" podStartSLOduration=2.475795846 podStartE2EDuration="2.475795846s" podCreationTimestamp="2025-12-12 17:40:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:40:52.475702691 +0000 UTC m=+1.213933688" watchObservedRunningTime="2025-12-12 17:40:52.475795846 +0000 UTC m=+1.214026819" Dec 12 17:40:56.565597 kubelet[3527]: I1212 17:40:56.565101 3527 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 12 17:40:56.565597 kubelet[3527]: I1212 17:40:56.565554 3527 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 12 17:40:56.566035 containerd[1916]: time="2025-12-12T17:40:56.565380312Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 12 17:40:57.639302 systemd[1]: Created slice kubepods-besteffort-podf34beab7_0530_4611_b7c3_1670d777662a.slice - libcontainer container kubepods-besteffort-podf34beab7_0530_4611_b7c3_1670d777662a.slice. 
Dec 12 17:40:57.682070 kubelet[3527]: I1212 17:40:57.681950 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdkml\" (UniqueName: \"kubernetes.io/projected/f34beab7-0530-4611-b7c3-1670d777662a-kube-api-access-xdkml\") pod \"kube-proxy-sfzz6\" (UID: \"f34beab7-0530-4611-b7c3-1670d777662a\") " pod="kube-system/kube-proxy-sfzz6" Dec 12 17:40:57.682070 kubelet[3527]: I1212 17:40:57.681992 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f34beab7-0530-4611-b7c3-1670d777662a-kube-proxy\") pod \"kube-proxy-sfzz6\" (UID: \"f34beab7-0530-4611-b7c3-1670d777662a\") " pod="kube-system/kube-proxy-sfzz6" Dec 12 17:40:57.682070 kubelet[3527]: I1212 17:40:57.682006 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f34beab7-0530-4611-b7c3-1670d777662a-xtables-lock\") pod \"kube-proxy-sfzz6\" (UID: \"f34beab7-0530-4611-b7c3-1670d777662a\") " pod="kube-system/kube-proxy-sfzz6" Dec 12 17:40:57.682070 kubelet[3527]: I1212 17:40:57.682017 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f34beab7-0530-4611-b7c3-1670d777662a-lib-modules\") pod \"kube-proxy-sfzz6\" (UID: \"f34beab7-0530-4611-b7c3-1670d777662a\") " pod="kube-system/kube-proxy-sfzz6" Dec 12 17:40:57.749030 systemd[1]: Created slice kubepods-besteffort-pod5cc02624_e606_4acf_a582_57c05c2e5bde.slice - libcontainer container kubepods-besteffort-pod5cc02624_e606_4acf_a582_57c05c2e5bde.slice. 
Dec 12 17:40:57.784548 kubelet[3527]: I1212 17:40:57.782848 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4kns\" (UniqueName: \"kubernetes.io/projected/5cc02624-e606-4acf-a582-57c05c2e5bde-kube-api-access-b4kns\") pod \"tigera-operator-65cdcdfd6d-49pdk\" (UID: \"5cc02624-e606-4acf-a582-57c05c2e5bde\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-49pdk" Dec 12 17:40:57.784548 kubelet[3527]: I1212 17:40:57.782902 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5cc02624-e606-4acf-a582-57c05c2e5bde-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-49pdk\" (UID: \"5cc02624-e606-4acf-a582-57c05c2e5bde\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-49pdk" Dec 12 17:40:57.957662 containerd[1916]: time="2025-12-12T17:40:57.957614474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-sfzz6,Uid:f34beab7-0530-4611-b7c3-1670d777662a,Namespace:kube-system,Attempt:0,}" Dec 12 17:40:57.992024 containerd[1916]: time="2025-12-12T17:40:57.991975404Z" level=info msg="connecting to shim 0fdf3164c4964409d9dcb2d9b04aa26381d86843f16c974de215734f22004236" address="unix:///run/containerd/s/a4a3a194bb61968b081b969840e0be24764286189597ad7428cc51bca0eb8fff" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:58.013717 systemd[1]: Started cri-containerd-0fdf3164c4964409d9dcb2d9b04aa26381d86843f16c974de215734f22004236.scope - libcontainer container 0fdf3164c4964409d9dcb2d9b04aa26381d86843f16c974de215734f22004236. 
Dec 12 17:40:58.036343 containerd[1916]: time="2025-12-12T17:40:58.036241444Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-sfzz6,Uid:f34beab7-0530-4611-b7c3-1670d777662a,Namespace:kube-system,Attempt:0,} returns sandbox id \"0fdf3164c4964409d9dcb2d9b04aa26381d86843f16c974de215734f22004236\"" Dec 12 17:40:58.047969 containerd[1916]: time="2025-12-12T17:40:58.047592103Z" level=info msg="CreateContainer within sandbox \"0fdf3164c4964409d9dcb2d9b04aa26381d86843f16c974de215734f22004236\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 12 17:40:58.062784 containerd[1916]: time="2025-12-12T17:40:58.062745799Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-49pdk,Uid:5cc02624-e606-4acf-a582-57c05c2e5bde,Namespace:tigera-operator,Attempt:0,}" Dec 12 17:40:58.072621 containerd[1916]: time="2025-12-12T17:40:58.072581907Z" level=info msg="Container 263c02b2abf565f4a723b5316849a9ed51ce23481799fa082b9268869c703a91: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:40:58.105101 containerd[1916]: time="2025-12-12T17:40:58.105051323Z" level=info msg="CreateContainer within sandbox \"0fdf3164c4964409d9dcb2d9b04aa26381d86843f16c974de215734f22004236\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"263c02b2abf565f4a723b5316849a9ed51ce23481799fa082b9268869c703a91\"" Dec 12 17:40:58.106042 containerd[1916]: time="2025-12-12T17:40:58.105939643Z" level=info msg="StartContainer for \"263c02b2abf565f4a723b5316849a9ed51ce23481799fa082b9268869c703a91\"" Dec 12 17:40:58.107587 containerd[1916]: time="2025-12-12T17:40:58.107535765Z" level=info msg="connecting to shim 263c02b2abf565f4a723b5316849a9ed51ce23481799fa082b9268869c703a91" address="unix:///run/containerd/s/a4a3a194bb61968b081b969840e0be24764286189597ad7428cc51bca0eb8fff" protocol=ttrpc version=3 Dec 12 17:40:58.118521 containerd[1916]: time="2025-12-12T17:40:58.118455981Z" level=info msg="connecting to shim 
c608c2410a99fddab63ca84cb556280d62ddaf2f7a71901148f8d022155dcd0a" address="unix:///run/containerd/s/c3cb5485817b5d3499ab336af08006f2d4d101381654dc5e5ac22d9c236146ee" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:58.131738 systemd[1]: Started cri-containerd-263c02b2abf565f4a723b5316849a9ed51ce23481799fa082b9268869c703a91.scope - libcontainer container 263c02b2abf565f4a723b5316849a9ed51ce23481799fa082b9268869c703a91. Dec 12 17:40:58.141809 systemd[1]: Started cri-containerd-c608c2410a99fddab63ca84cb556280d62ddaf2f7a71901148f8d022155dcd0a.scope - libcontainer container c608c2410a99fddab63ca84cb556280d62ddaf2f7a71901148f8d022155dcd0a. Dec 12 17:40:58.191644 containerd[1916]: time="2025-12-12T17:40:58.191585446Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-49pdk,Uid:5cc02624-e606-4acf-a582-57c05c2e5bde,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c608c2410a99fddab63ca84cb556280d62ddaf2f7a71901148f8d022155dcd0a\"" Dec 12 17:40:58.195979 containerd[1916]: time="2025-12-12T17:40:58.195933185Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 12 17:40:58.196545 containerd[1916]: time="2025-12-12T17:40:58.196517192Z" level=info msg="StartContainer for \"263c02b2abf565f4a723b5316849a9ed51ce23481799fa082b9268869c703a91\" returns successfully" Dec 12 17:40:58.792748 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1670413274.mount: Deactivated successfully. 
Dec 12 17:40:59.155831 kubelet[3527]: I1212 17:40:59.155462 3527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-sfzz6" podStartSLOduration=2.155445161 podStartE2EDuration="2.155445161s" podCreationTimestamp="2025-12-12 17:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:40:58.405941989 +0000 UTC m=+7.144172954" watchObservedRunningTime="2025-12-12 17:40:59.155445161 +0000 UTC m=+7.893676126" Dec 12 17:41:00.080795 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2690206502.mount: Deactivated successfully. Dec 12 17:41:00.583193 containerd[1916]: time="2025-12-12T17:41:00.583143234Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:00.586019 containerd[1916]: time="2025-12-12T17:41:00.585829790Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=22152004" Dec 12 17:41:00.588752 containerd[1916]: time="2025-12-12T17:41:00.588710680Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:00.592604 containerd[1916]: time="2025-12-12T17:41:00.592420435Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:00.592999 containerd[1916]: time="2025-12-12T17:41:00.592971253Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest 
\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.39700053s" Dec 12 17:41:00.593137 containerd[1916]: time="2025-12-12T17:41:00.593069624Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Dec 12 17:41:00.599534 containerd[1916]: time="2025-12-12T17:41:00.599442334Z" level=info msg="CreateContainer within sandbox \"c608c2410a99fddab63ca84cb556280d62ddaf2f7a71901148f8d022155dcd0a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 12 17:41:00.614002 containerd[1916]: time="2025-12-12T17:41:00.613962323Z" level=info msg="Container 762474ef3d5d8d984d958713a625cf0fd68fcaa8f72b6e3860e650c9e2e97ba9: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:41:00.618918 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1843409972.mount: Deactivated successfully. Dec 12 17:41:00.628773 containerd[1916]: time="2025-12-12T17:41:00.628708919Z" level=info msg="CreateContainer within sandbox \"c608c2410a99fddab63ca84cb556280d62ddaf2f7a71901148f8d022155dcd0a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"762474ef3d5d8d984d958713a625cf0fd68fcaa8f72b6e3860e650c9e2e97ba9\"" Dec 12 17:41:00.629439 containerd[1916]: time="2025-12-12T17:41:00.629415589Z" level=info msg="StartContainer for \"762474ef3d5d8d984d958713a625cf0fd68fcaa8f72b6e3860e650c9e2e97ba9\"" Dec 12 17:41:00.631369 containerd[1916]: time="2025-12-12T17:41:00.631301815Z" level=info msg="connecting to shim 762474ef3d5d8d984d958713a625cf0fd68fcaa8f72b6e3860e650c9e2e97ba9" address="unix:///run/containerd/s/c3cb5485817b5d3499ab336af08006f2d4d101381654dc5e5ac22d9c236146ee" protocol=ttrpc version=3 Dec 12 17:41:00.651661 systemd[1]: Started cri-containerd-762474ef3d5d8d984d958713a625cf0fd68fcaa8f72b6e3860e650c9e2e97ba9.scope - libcontainer container 
762474ef3d5d8d984d958713a625cf0fd68fcaa8f72b6e3860e650c9e2e97ba9. Dec 12 17:41:00.678181 containerd[1916]: time="2025-12-12T17:41:00.678140259Z" level=info msg="StartContainer for \"762474ef3d5d8d984d958713a625cf0fd68fcaa8f72b6e3860e650c9e2e97ba9\" returns successfully" Dec 12 17:41:02.609908 kubelet[3527]: I1212 17:41:02.609842 3527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-49pdk" podStartSLOduration=3.210226959 podStartE2EDuration="5.60982401s" podCreationTimestamp="2025-12-12 17:40:57 +0000 UTC" firstStartedPulling="2025-12-12 17:40:58.194192675 +0000 UTC m=+6.932423640" lastFinishedPulling="2025-12-12 17:41:00.593789726 +0000 UTC m=+9.332020691" observedRunningTime="2025-12-12 17:41:01.413944569 +0000 UTC m=+10.152175534" watchObservedRunningTime="2025-12-12 17:41:02.60982401 +0000 UTC m=+11.348054975" Dec 12 17:41:05.722597 sudo[2375]: pam_unix(sudo:session): session closed for user root Dec 12 17:41:05.803885 sshd[2374]: Connection closed by 10.200.16.10 port 40526 Dec 12 17:41:05.802979 sshd-session[2371]: pam_unix(sshd:session): session closed for user core Dec 12 17:41:05.807367 systemd[1]: session-9.scope: Deactivated successfully. Dec 12 17:41:05.807560 systemd[1]: session-9.scope: Consumed 3.473s CPU time, 223.1M memory peak. Dec 12 17:41:05.810724 systemd[1]: sshd@6-10.200.20.11:22-10.200.16.10:40526.service: Deactivated successfully. Dec 12 17:41:05.817286 systemd-logind[1877]: Session 9 logged out. Waiting for processes to exit. Dec 12 17:41:05.819156 systemd-logind[1877]: Removed session 9. Dec 12 17:41:13.123236 systemd[1]: Created slice kubepods-besteffort-pod1681c0a7_4647_45c3_8da3_c329373abc11.slice - libcontainer container kubepods-besteffort-pod1681c0a7_4647_45c3_8da3_c329373abc11.slice. 
Dec 12 17:41:13.175749 kubelet[3527]: I1212 17:41:13.175701 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/1681c0a7-4647-45c3-8da3-c329373abc11-typha-certs\") pod \"calico-typha-679b86445-ckqs8\" (UID: \"1681c0a7-4647-45c3-8da3-c329373abc11\") " pod="calico-system/calico-typha-679b86445-ckqs8" Dec 12 17:41:13.175749 kubelet[3527]: I1212 17:41:13.175745 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1681c0a7-4647-45c3-8da3-c329373abc11-tigera-ca-bundle\") pod \"calico-typha-679b86445-ckqs8\" (UID: \"1681c0a7-4647-45c3-8da3-c329373abc11\") " pod="calico-system/calico-typha-679b86445-ckqs8" Dec 12 17:41:13.175749 kubelet[3527]: I1212 17:41:13.175763 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjw67\" (UniqueName: \"kubernetes.io/projected/1681c0a7-4647-45c3-8da3-c329373abc11-kube-api-access-jjw67\") pod \"calico-typha-679b86445-ckqs8\" (UID: \"1681c0a7-4647-45c3-8da3-c329373abc11\") " pod="calico-system/calico-typha-679b86445-ckqs8" Dec 12 17:41:13.359645 systemd[1]: Created slice kubepods-besteffort-poda788050a_0a03_442e_9ef2_b058610fda91.slice - libcontainer container kubepods-besteffort-poda788050a_0a03_442e_9ef2_b058610fda91.slice. 
Dec 12 17:41:13.377940 kubelet[3527]: I1212 17:41:13.377237 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a788050a-0a03-442e-9ef2-b058610fda91-cni-net-dir\") pod \"calico-node-9vh7g\" (UID: \"a788050a-0a03-442e-9ef2-b058610fda91\") " pod="calico-system/calico-node-9vh7g" Dec 12 17:41:13.377940 kubelet[3527]: I1212 17:41:13.377876 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a788050a-0a03-442e-9ef2-b058610fda91-cni-log-dir\") pod \"calico-node-9vh7g\" (UID: \"a788050a-0a03-442e-9ef2-b058610fda91\") " pod="calico-system/calico-node-9vh7g" Dec 12 17:41:13.377940 kubelet[3527]: I1212 17:41:13.377892 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a788050a-0a03-442e-9ef2-b058610fda91-var-lib-calico\") pod \"calico-node-9vh7g\" (UID: \"a788050a-0a03-442e-9ef2-b058610fda91\") " pod="calico-system/calico-node-9vh7g" Dec 12 17:41:13.377940 kubelet[3527]: I1212 17:41:13.377901 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a788050a-0a03-442e-9ef2-b058610fda91-var-run-calico\") pod \"calico-node-9vh7g\" (UID: \"a788050a-0a03-442e-9ef2-b058610fda91\") " pod="calico-system/calico-node-9vh7g" Dec 12 17:41:13.377940 kubelet[3527]: I1212 17:41:13.377912 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtwvb\" (UniqueName: \"kubernetes.io/projected/a788050a-0a03-442e-9ef2-b058610fda91-kube-api-access-wtwvb\") pod \"calico-node-9vh7g\" (UID: \"a788050a-0a03-442e-9ef2-b058610fda91\") " pod="calico-system/calico-node-9vh7g" Dec 12 17:41:13.378292 kubelet[3527]: I1212 17:41:13.377924 
3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a788050a-0a03-442e-9ef2-b058610fda91-cni-bin-dir\") pod \"calico-node-9vh7g\" (UID: \"a788050a-0a03-442e-9ef2-b058610fda91\") " pod="calico-system/calico-node-9vh7g" Dec 12 17:41:13.378292 kubelet[3527]: I1212 17:41:13.378217 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a788050a-0a03-442e-9ef2-b058610fda91-node-certs\") pod \"calico-node-9vh7g\" (UID: \"a788050a-0a03-442e-9ef2-b058610fda91\") " pod="calico-system/calico-node-9vh7g" Dec 12 17:41:13.378292 kubelet[3527]: I1212 17:41:13.378231 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a788050a-0a03-442e-9ef2-b058610fda91-tigera-ca-bundle\") pod \"calico-node-9vh7g\" (UID: \"a788050a-0a03-442e-9ef2-b058610fda91\") " pod="calico-system/calico-node-9vh7g" Dec 12 17:41:13.378292 kubelet[3527]: I1212 17:41:13.378242 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a788050a-0a03-442e-9ef2-b058610fda91-flexvol-driver-host\") pod \"calico-node-9vh7g\" (UID: \"a788050a-0a03-442e-9ef2-b058610fda91\") " pod="calico-system/calico-node-9vh7g" Dec 12 17:41:13.378292 kubelet[3527]: I1212 17:41:13.378251 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a788050a-0a03-442e-9ef2-b058610fda91-lib-modules\") pod \"calico-node-9vh7g\" (UID: \"a788050a-0a03-442e-9ef2-b058610fda91\") " pod="calico-system/calico-node-9vh7g" Dec 12 17:41:13.378402 kubelet[3527]: I1212 17:41:13.378260 3527 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a788050a-0a03-442e-9ef2-b058610fda91-xtables-lock\") pod \"calico-node-9vh7g\" (UID: \"a788050a-0a03-442e-9ef2-b058610fda91\") " pod="calico-system/calico-node-9vh7g" Dec 12 17:41:13.378402 kubelet[3527]: I1212 17:41:13.378276 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a788050a-0a03-442e-9ef2-b058610fda91-policysync\") pod \"calico-node-9vh7g\" (UID: \"a788050a-0a03-442e-9ef2-b058610fda91\") " pod="calico-system/calico-node-9vh7g" Dec 12 17:41:13.433696 containerd[1916]: time="2025-12-12T17:41:13.433590211Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-679b86445-ckqs8,Uid:1681c0a7-4647-45c3-8da3-c329373abc11,Namespace:calico-system,Attempt:0,}" Dec 12 17:41:13.473701 containerd[1916]: time="2025-12-12T17:41:13.473650662Z" level=info msg="connecting to shim ca1eec65796ca3179ee05661a4075c6d313c9c5b36fad1454c847d36848ed574" address="unix:///run/containerd/s/94715fe5a3b6c8a23a336edf0e939a6b78e56ce4db3cc9615b28f931ddab5e42" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:41:13.488906 kubelet[3527]: E1212 17:41:13.488834 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.488906 kubelet[3527]: W1212 17:41:13.488854 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.488906 kubelet[3527]: E1212 17:41:13.488874 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.494233 kubelet[3527]: E1212 17:41:13.494212 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.494407 kubelet[3527]: W1212 17:41:13.494360 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.494407 kubelet[3527]: E1212 17:41:13.494383 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.502676 systemd[1]: Started cri-containerd-ca1eec65796ca3179ee05661a4075c6d313c9c5b36fad1454c847d36848ed574.scope - libcontainer container ca1eec65796ca3179ee05661a4075c6d313c9c5b36fad1454c847d36848ed574. Dec 12 17:41:13.554239 kubelet[3527]: E1212 17:41:13.553531 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476" Dec 12 17:41:13.563615 kubelet[3527]: E1212 17:41:13.563582 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.563615 kubelet[3527]: W1212 17:41:13.563604 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.563787 kubelet[3527]: E1212 17:41:13.563637 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.563787 kubelet[3527]: E1212 17:41:13.563766 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.564247 kubelet[3527]: W1212 17:41:13.563773 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.564247 kubelet[3527]: E1212 17:41:13.563820 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.564247 kubelet[3527]: E1212 17:41:13.563921 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.564247 kubelet[3527]: W1212 17:41:13.563925 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.564247 kubelet[3527]: E1212 17:41:13.563946 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.564247 kubelet[3527]: E1212 17:41:13.564047 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.564247 kubelet[3527]: W1212 17:41:13.564052 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.564247 kubelet[3527]: E1212 17:41:13.564058 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.564247 kubelet[3527]: E1212 17:41:13.564162 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.564247 kubelet[3527]: W1212 17:41:13.564180 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.565956 kubelet[3527]: E1212 17:41:13.564187 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.565956 kubelet[3527]: E1212 17:41:13.564275 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.565956 kubelet[3527]: W1212 17:41:13.564280 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.565956 kubelet[3527]: E1212 17:41:13.564285 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.565956 kubelet[3527]: E1212 17:41:13.564383 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.565956 kubelet[3527]: W1212 17:41:13.564388 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.565956 kubelet[3527]: E1212 17:41:13.564394 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.565956 kubelet[3527]: E1212 17:41:13.564709 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.565956 kubelet[3527]: W1212 17:41:13.564719 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.565956 kubelet[3527]: E1212 17:41:13.564728 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.566112 kubelet[3527]: E1212 17:41:13.564900 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.566112 kubelet[3527]: W1212 17:41:13.564907 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.566112 kubelet[3527]: E1212 17:41:13.564914 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.566112 kubelet[3527]: E1212 17:41:13.565025 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.566112 kubelet[3527]: W1212 17:41:13.565031 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.566112 kubelet[3527]: E1212 17:41:13.565037 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.566112 kubelet[3527]: E1212 17:41:13.565121 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.566112 kubelet[3527]: W1212 17:41:13.565125 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.566112 kubelet[3527]: E1212 17:41:13.565130 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.566112 kubelet[3527]: E1212 17:41:13.565219 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.566290 kubelet[3527]: W1212 17:41:13.565223 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.566290 kubelet[3527]: E1212 17:41:13.565228 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.566290 kubelet[3527]: E1212 17:41:13.565357 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.566290 kubelet[3527]: W1212 17:41:13.565364 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.566290 kubelet[3527]: E1212 17:41:13.565372 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.566290 kubelet[3527]: E1212 17:41:13.565486 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.566290 kubelet[3527]: W1212 17:41:13.565491 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.566290 kubelet[3527]: E1212 17:41:13.565508 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.566290 kubelet[3527]: E1212 17:41:13.565737 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.566290 kubelet[3527]: W1212 17:41:13.565745 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.566424 kubelet[3527]: E1212 17:41:13.565753 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.566424 kubelet[3527]: E1212 17:41:13.565970 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.566424 kubelet[3527]: W1212 17:41:13.565978 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.566424 kubelet[3527]: E1212 17:41:13.565987 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.566424 kubelet[3527]: E1212 17:41:13.566109 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.566424 kubelet[3527]: W1212 17:41:13.566115 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.566424 kubelet[3527]: E1212 17:41:13.566122 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.566424 kubelet[3527]: E1212 17:41:13.566202 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.566424 kubelet[3527]: W1212 17:41:13.566207 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.566424 kubelet[3527]: E1212 17:41:13.566213 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.567218 kubelet[3527]: E1212 17:41:13.566284 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.567218 kubelet[3527]: W1212 17:41:13.566289 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.567218 kubelet[3527]: E1212 17:41:13.566293 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.567218 kubelet[3527]: E1212 17:41:13.566368 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.567218 kubelet[3527]: W1212 17:41:13.566372 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.567218 kubelet[3527]: E1212 17:41:13.566376 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.567731 containerd[1916]: time="2025-12-12T17:41:13.567476170Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-679b86445-ckqs8,Uid:1681c0a7-4647-45c3-8da3-c329373abc11,Namespace:calico-system,Attempt:0,} returns sandbox id \"ca1eec65796ca3179ee05661a4075c6d313c9c5b36fad1454c847d36848ed574\"" Dec 12 17:41:13.571564 containerd[1916]: time="2025-12-12T17:41:13.571493563Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 12 17:41:13.580394 kubelet[3527]: E1212 17:41:13.580368 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.580394 kubelet[3527]: W1212 17:41:13.580389 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.580612 kubelet[3527]: E1212 17:41:13.580407 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.580612 kubelet[3527]: I1212 17:41:13.580428 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aae497cc-3748-48de-b6f7-6585350a2476-kubelet-dir\") pod \"csi-node-driver-nhpd8\" (UID: \"aae497cc-3748-48de-b6f7-6585350a2476\") " pod="calico-system/csi-node-driver-nhpd8" Dec 12 17:41:13.580649 kubelet[3527]: E1212 17:41:13.580633 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.580649 kubelet[3527]: W1212 17:41:13.580642 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.580680 kubelet[3527]: E1212 17:41:13.580649 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.580680 kubelet[3527]: I1212 17:41:13.580671 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kdvj\" (UniqueName: \"kubernetes.io/projected/aae497cc-3748-48de-b6f7-6585350a2476-kube-api-access-5kdvj\") pod \"csi-node-driver-nhpd8\" (UID: \"aae497cc-3748-48de-b6f7-6585350a2476\") " pod="calico-system/csi-node-driver-nhpd8" Dec 12 17:41:13.580970 kubelet[3527]: E1212 17:41:13.580814 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.580970 kubelet[3527]: W1212 17:41:13.580825 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.580970 kubelet[3527]: E1212 17:41:13.580832 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.580970 kubelet[3527]: I1212 17:41:13.580848 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aae497cc-3748-48de-b6f7-6585350a2476-registration-dir\") pod \"csi-node-driver-nhpd8\" (UID: \"aae497cc-3748-48de-b6f7-6585350a2476\") " pod="calico-system/csi-node-driver-nhpd8" Dec 12 17:41:13.580970 kubelet[3527]: E1212 17:41:13.580978 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.581203 kubelet[3527]: W1212 17:41:13.580985 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.581203 kubelet[3527]: E1212 17:41:13.580993 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.581203 kubelet[3527]: I1212 17:41:13.581012 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/aae497cc-3748-48de-b6f7-6585350a2476-varrun\") pod \"csi-node-driver-nhpd8\" (UID: \"aae497cc-3748-48de-b6f7-6585350a2476\") " pod="calico-system/csi-node-driver-nhpd8" Dec 12 17:41:13.581203 kubelet[3527]: E1212 17:41:13.581116 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.581203 kubelet[3527]: W1212 17:41:13.581133 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.581203 kubelet[3527]: E1212 17:41:13.581139 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.581203 kubelet[3527]: I1212 17:41:13.581151 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aae497cc-3748-48de-b6f7-6585350a2476-socket-dir\") pod \"csi-node-driver-nhpd8\" (UID: \"aae497cc-3748-48de-b6f7-6585350a2476\") " pod="calico-system/csi-node-driver-nhpd8" Dec 12 17:41:13.581796 kubelet[3527]: E1212 17:41:13.581281 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.581796 kubelet[3527]: W1212 17:41:13.581287 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.581796 kubelet[3527]: E1212 17:41:13.581295 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.581796 kubelet[3527]: E1212 17:41:13.581390 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.581796 kubelet[3527]: W1212 17:41:13.581395 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.581796 kubelet[3527]: E1212 17:41:13.581400 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.581796 kubelet[3527]: E1212 17:41:13.581526 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.581796 kubelet[3527]: W1212 17:41:13.581531 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.581796 kubelet[3527]: E1212 17:41:13.581537 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.583710 kubelet[3527]: E1212 17:41:13.583681 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.583710 kubelet[3527]: W1212 17:41:13.583706 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.583797 kubelet[3527]: E1212 17:41:13.583717 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:13.670406 containerd[1916]: time="2025-12-12T17:41:13.670353983Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9vh7g,Uid:a788050a-0a03-442e-9ef2-b058610fda91,Namespace:calico-system,Attempt:0,}" Dec 12 17:41:13.682165 kubelet[3527]: E1212 17:41:13.682134 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:13.682165 kubelet[3527]: W1212 17:41:13.682161 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:13.682296 kubelet[3527]: E1212 17:41:13.682181 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:13.715900 containerd[1916]: time="2025-12-12T17:41:13.715850670Z" level=info msg="connecting to shim bea02ce28146269d9a594a53e7b5ed032a010077140e6e783a5b7287b441771c" address="unix:///run/containerd/s/ffa60f5488f50beb794af8d3aa738da9c5fd9bf52faa1fe4188f831e9a4e5082" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:41:13.734668 systemd[1]: Started cri-containerd-bea02ce28146269d9a594a53e7b5ed032a010077140e6e783a5b7287b441771c.scope - libcontainer container bea02ce28146269d9a594a53e7b5ed032a010077140e6e783a5b7287b441771c. 
Dec 12 17:41:13.759594 containerd[1916]: time="2025-12-12T17:41:13.759552751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9vh7g,Uid:a788050a-0a03-442e-9ef2-b058610fda91,Namespace:calico-system,Attempt:0,} returns sandbox id \"bea02ce28146269d9a594a53e7b5ed032a010077140e6e783a5b7287b441771c\"" Dec 12 17:41:14.881459 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1795395832.mount: Deactivated successfully. Dec 12 17:41:15.267515 containerd[1916]: time="2025-12-12T17:41:15.267437063Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:15.270589 containerd[1916]: time="2025-12-12T17:41:15.270554565Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33090687" Dec 12 17:41:15.273966 containerd[1916]: time="2025-12-12T17:41:15.273918763Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:15.277714 containerd[1916]: time="2025-12-12T17:41:15.277662916Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:15.278240 containerd[1916]: time="2025-12-12T17:41:15.277950845Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 1.706391664s" Dec 12 17:41:15.278240 containerd[1916]: time="2025-12-12T17:41:15.277978390Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns 
image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Dec 12 17:41:15.280468 containerd[1916]: time="2025-12-12T17:41:15.280442552Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 12 17:41:15.296228 containerd[1916]: time="2025-12-12T17:41:15.296138859Z" level=info msg="CreateContainer within sandbox \"ca1eec65796ca3179ee05661a4075c6d313c9c5b36fad1454c847d36848ed574\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 12 17:41:15.318519 containerd[1916]: time="2025-12-12T17:41:15.316883694Z" level=info msg="Container f58bf678c4a03c1bcbc84dec5755e34e4745db111ca4ff45b2daf120ada2d376: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:41:15.321346 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount787263746.mount: Deactivated successfully. Dec 12 17:41:15.338364 containerd[1916]: time="2025-12-12T17:41:15.338318517Z" level=info msg="CreateContainer within sandbox \"ca1eec65796ca3179ee05661a4075c6d313c9c5b36fad1454c847d36848ed574\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f58bf678c4a03c1bcbc84dec5755e34e4745db111ca4ff45b2daf120ada2d376\"" Dec 12 17:41:15.342598 containerd[1916]: time="2025-12-12T17:41:15.342563110Z" level=info msg="StartContainer for \"f58bf678c4a03c1bcbc84dec5755e34e4745db111ca4ff45b2daf120ada2d376\"" Dec 12 17:41:15.343556 containerd[1916]: time="2025-12-12T17:41:15.343529515Z" level=info msg="connecting to shim f58bf678c4a03c1bcbc84dec5755e34e4745db111ca4ff45b2daf120ada2d376" address="unix:///run/containerd/s/94715fe5a3b6c8a23a336edf0e939a6b78e56ce4db3cc9615b28f931ddab5e42" protocol=ttrpc version=3 Dec 12 17:41:15.353633 kubelet[3527]: E1212 17:41:15.353178 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476" Dec 12 17:41:15.373660 systemd[1]: Started cri-containerd-f58bf678c4a03c1bcbc84dec5755e34e4745db111ca4ff45b2daf120ada2d376.scope - libcontainer container f58bf678c4a03c1bcbc84dec5755e34e4745db111ca4ff45b2daf120ada2d376. Dec 12 17:41:15.411328 containerd[1916]: time="2025-12-12T17:41:15.411270498Z" level=info msg="StartContainer for \"f58bf678c4a03c1bcbc84dec5755e34e4745db111ca4ff45b2daf120ada2d376\" returns successfully" Dec 12 17:41:15.449309 kubelet[3527]: I1212 17:41:15.449223 3527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-679b86445-ckqs8" podStartSLOduration=0.74111173 podStartE2EDuration="2.449201397s" podCreationTimestamp="2025-12-12 17:41:13 +0000 UTC" firstStartedPulling="2025-12-12 17:41:13.570867104 +0000 UTC m=+22.309098069" lastFinishedPulling="2025-12-12 17:41:15.278956755 +0000 UTC m=+24.017187736" observedRunningTime="2025-12-12 17:41:15.447450256 +0000 UTC m=+24.185681237" watchObservedRunningTime="2025-12-12 17:41:15.449201397 +0000 UTC m=+24.187432362" Dec 12 17:41:15.477367 kubelet[3527]: E1212 17:41:15.477330 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.477798 kubelet[3527]: W1212 17:41:15.477521 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.477798 kubelet[3527]: E1212 17:41:15.477553 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:15.485802 kubelet[3527]: E1212 17:41:15.485791 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.485953 kubelet[3527]: W1212 17:41:15.485869 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.485953 kubelet[3527]: E1212 17:41:15.485883 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:15.486129 kubelet[3527]: E1212 17:41:15.486118 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.486279 kubelet[3527]: W1212 17:41:15.486183 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.486279 kubelet[3527]: E1212 17:41:15.486197 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:15.486454 kubelet[3527]: E1212 17:41:15.486388 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.486454 kubelet[3527]: W1212 17:41:15.486397 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.486454 kubelet[3527]: E1212 17:41:15.486405 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:15.493688 kubelet[3527]: E1212 17:41:15.493671 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.493688 kubelet[3527]: W1212 17:41:15.493683 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.493688 kubelet[3527]: E1212 17:41:15.493693 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:15.493940 kubelet[3527]: E1212 17:41:15.493923 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.493940 kubelet[3527]: W1212 17:41:15.493936 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.494003 kubelet[3527]: E1212 17:41:15.493946 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:15.494642 kubelet[3527]: E1212 17:41:15.494622 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.494642 kubelet[3527]: W1212 17:41:15.494637 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.494642 kubelet[3527]: E1212 17:41:15.494647 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:15.494915 kubelet[3527]: E1212 17:41:15.494798 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.494915 kubelet[3527]: W1212 17:41:15.494804 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.494915 kubelet[3527]: E1212 17:41:15.494810 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:15.495181 kubelet[3527]: E1212 17:41:15.495165 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.495181 kubelet[3527]: W1212 17:41:15.495177 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.495291 kubelet[3527]: E1212 17:41:15.495187 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:15.496686 kubelet[3527]: E1212 17:41:15.496662 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.496686 kubelet[3527]: W1212 17:41:15.496677 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.496686 kubelet[3527]: E1212 17:41:15.496686 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:15.497371 kubelet[3527]: E1212 17:41:15.496820 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.497371 kubelet[3527]: W1212 17:41:15.496830 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.497371 kubelet[3527]: E1212 17:41:15.496837 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:15.497371 kubelet[3527]: E1212 17:41:15.496936 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.497371 kubelet[3527]: W1212 17:41:15.496940 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.497371 kubelet[3527]: E1212 17:41:15.496945 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:15.497371 kubelet[3527]: E1212 17:41:15.497025 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.497371 kubelet[3527]: W1212 17:41:15.497031 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.497371 kubelet[3527]: E1212 17:41:15.497035 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:15.497371 kubelet[3527]: E1212 17:41:15.497111 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.497576 kubelet[3527]: W1212 17:41:15.497115 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.497576 kubelet[3527]: E1212 17:41:15.497119 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:15.497576 kubelet[3527]: E1212 17:41:15.497182 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.497576 kubelet[3527]: W1212 17:41:15.497186 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.497576 kubelet[3527]: E1212 17:41:15.497190 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:15.497576 kubelet[3527]: E1212 17:41:15.497269 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.497576 kubelet[3527]: W1212 17:41:15.497273 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.497576 kubelet[3527]: E1212 17:41:15.497277 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:15.497576 kubelet[3527]: E1212 17:41:15.497486 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.497576 kubelet[3527]: W1212 17:41:15.497492 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.497744 kubelet[3527]: E1212 17:41:15.497498 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:15.497744 kubelet[3527]: E1212 17:41:15.497735 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.497744 kubelet[3527]: W1212 17:41:15.497741 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.497789 kubelet[3527]: E1212 17:41:15.497747 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:15.497921 kubelet[3527]: E1212 17:41:15.497908 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.497921 kubelet[3527]: W1212 17:41:15.497917 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.498043 kubelet[3527]: E1212 17:41:15.497925 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:15.498157 kubelet[3527]: E1212 17:41:15.498144 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.498157 kubelet[3527]: W1212 17:41:15.498154 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.498223 kubelet[3527]: E1212 17:41:15.498162 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:15.498631 kubelet[3527]: E1212 17:41:15.498616 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.498631 kubelet[3527]: W1212 17:41:15.498627 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.498631 kubelet[3527]: E1212 17:41:15.498636 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:15.499873 kubelet[3527]: E1212 17:41:15.499856 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:15.499873 kubelet[3527]: W1212 17:41:15.499869 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:15.499990 kubelet[3527]: E1212 17:41:15.499879 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:16.437060 kubelet[3527]: I1212 17:41:16.437030 3527 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 17:41:16.457051 containerd[1916]: time="2025-12-12T17:41:16.456962432Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:16.462374 containerd[1916]: time="2025-12-12T17:41:16.462329498Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4266741" Dec 12 17:41:16.465249 containerd[1916]: time="2025-12-12T17:41:16.465198449Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:16.469407 containerd[1916]: time="2025-12-12T17:41:16.469370975Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:16.470169 containerd[1916]: time="2025-12-12T17:41:16.470140478Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.189669725s" Dec 12 17:41:16.470217 containerd[1916]: time="2025-12-12T17:41:16.470174615Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Dec 12 17:41:16.478646 containerd[1916]: time="2025-12-12T17:41:16.478616511Z" level=info 
msg="CreateContainer within sandbox \"bea02ce28146269d9a594a53e7b5ed032a010077140e6e783a5b7287b441771c\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 12 17:41:16.490062 kubelet[3527]: E1212 17:41:16.490034 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.490261 kubelet[3527]: W1212 17:41:16.490190 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.490261 kubelet[3527]: E1212 17:41:16.490216 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:16.490513 kubelet[3527]: E1212 17:41:16.490489 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.490625 kubelet[3527]: W1212 17:41:16.490575 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.490625 kubelet[3527]: E1212 17:41:16.490591 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:16.490807 kubelet[3527]: E1212 17:41:16.490797 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.490960 kubelet[3527]: W1212 17:41:16.490860 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.490960 kubelet[3527]: E1212 17:41:16.490879 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:16.491194 kubelet[3527]: E1212 17:41:16.491183 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.491246 kubelet[3527]: W1212 17:41:16.491238 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.491282 kubelet[3527]: E1212 17:41:16.491275 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:16.491921 kubelet[3527]: E1212 17:41:16.491846 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.491921 kubelet[3527]: W1212 17:41:16.491857 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.491921 kubelet[3527]: E1212 17:41:16.491866 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:16.492182 kubelet[3527]: E1212 17:41:16.492094 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.492182 kubelet[3527]: W1212 17:41:16.492104 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.492182 kubelet[3527]: E1212 17:41:16.492113 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:16.492669 kubelet[3527]: E1212 17:41:16.492576 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.492669 kubelet[3527]: W1212 17:41:16.492588 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.492669 kubelet[3527]: E1212 17:41:16.492598 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:16.493205 kubelet[3527]: E1212 17:41:16.493194 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.493365 kubelet[3527]: W1212 17:41:16.493260 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.493365 kubelet[3527]: E1212 17:41:16.493274 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:16.493462 kubelet[3527]: E1212 17:41:16.493453 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.493518 kubelet[3527]: W1212 17:41:16.493494 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.493650 kubelet[3527]: E1212 17:41:16.493572 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:16.493845 kubelet[3527]: E1212 17:41:16.493835 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.493985 kubelet[3527]: W1212 17:41:16.493905 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.493985 kubelet[3527]: E1212 17:41:16.493919 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:16.494250 kubelet[3527]: E1212 17:41:16.494166 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.494250 kubelet[3527]: W1212 17:41:16.494174 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.494250 kubelet[3527]: E1212 17:41:16.494183 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:16.494442 kubelet[3527]: E1212 17:41:16.494380 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.494442 kubelet[3527]: W1212 17:41:16.494389 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.494442 kubelet[3527]: E1212 17:41:16.494397 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:16.495735 kubelet[3527]: E1212 17:41:16.494660 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.495735 kubelet[3527]: W1212 17:41:16.494671 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.495735 kubelet[3527]: E1212 17:41:16.494683 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:16.495735 kubelet[3527]: E1212 17:41:16.494906 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.495735 kubelet[3527]: W1212 17:41:16.494915 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.495735 kubelet[3527]: E1212 17:41:16.494926 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:16.495735 kubelet[3527]: E1212 17:41:16.495055 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.495735 kubelet[3527]: W1212 17:41:16.495062 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.495735 kubelet[3527]: E1212 17:41:16.495071 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:16.501732 containerd[1916]: time="2025-12-12T17:41:16.499707308Z" level=info msg="Container 0d520b56917a85060f613a0d89f54472f33be0c6e7eb147fc19e3a472fea647c: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:41:16.504351 kubelet[3527]: E1212 17:41:16.504321 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.504351 kubelet[3527]: W1212 17:41:16.504340 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.504351 kubelet[3527]: E1212 17:41:16.504355 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:16.506532 kubelet[3527]: E1212 17:41:16.506494 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.506532 kubelet[3527]: W1212 17:41:16.506523 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.506676 kubelet[3527]: E1212 17:41:16.506537 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:16.506699 kubelet[3527]: E1212 17:41:16.506694 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.506724 kubelet[3527]: W1212 17:41:16.506700 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.506724 kubelet[3527]: E1212 17:41:16.506708 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:41:16.509446 kubelet[3527]: E1212 17:41:16.509436 3527 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:41:16.509446 kubelet[3527]: W1212 17:41:16.509443 3527 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:41:16.509488 kubelet[3527]: E1212 17:41:16.509450 3527 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:41:16.522208 containerd[1916]: time="2025-12-12T17:41:16.522134962Z" level=info msg="CreateContainer within sandbox \"bea02ce28146269d9a594a53e7b5ed032a010077140e6e783a5b7287b441771c\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"0d520b56917a85060f613a0d89f54472f33be0c6e7eb147fc19e3a472fea647c\"" Dec 12 17:41:16.522990 containerd[1916]: time="2025-12-12T17:41:16.522943930Z" level=info msg="StartContainer for \"0d520b56917a85060f613a0d89f54472f33be0c6e7eb147fc19e3a472fea647c\"" Dec 12 17:41:16.524832 containerd[1916]: time="2025-12-12T17:41:16.524797514Z" level=info msg="connecting to shim 0d520b56917a85060f613a0d89f54472f33be0c6e7eb147fc19e3a472fea647c" address="unix:///run/containerd/s/ffa60f5488f50beb794af8d3aa738da9c5fd9bf52faa1fe4188f831e9a4e5082" protocol=ttrpc version=3 Dec 12 17:41:16.543656 systemd[1]: Started cri-containerd-0d520b56917a85060f613a0d89f54472f33be0c6e7eb147fc19e3a472fea647c.scope - libcontainer container 0d520b56917a85060f613a0d89f54472f33be0c6e7eb147fc19e3a472fea647c. 
Dec 12 17:41:16.604410 containerd[1916]: time="2025-12-12T17:41:16.604372128Z" level=info msg="StartContainer for \"0d520b56917a85060f613a0d89f54472f33be0c6e7eb147fc19e3a472fea647c\" returns successfully" Dec 12 17:41:16.616180 systemd[1]: cri-containerd-0d520b56917a85060f613a0d89f54472f33be0c6e7eb147fc19e3a472fea647c.scope: Deactivated successfully. Dec 12 17:41:16.619352 containerd[1916]: time="2025-12-12T17:41:16.619317643Z" level=info msg="received container exit event container_id:\"0d520b56917a85060f613a0d89f54472f33be0c6e7eb147fc19e3a472fea647c\" id:\"0d520b56917a85060f613a0d89f54472f33be0c6e7eb147fc19e3a472fea647c\" pid:4233 exited_at:{seconds:1765561276 nanos:618903911}" Dec 12 17:41:16.636117 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0d520b56917a85060f613a0d89f54472f33be0c6e7eb147fc19e3a472fea647c-rootfs.mount: Deactivated successfully. Dec 12 17:41:17.353944 kubelet[3527]: E1212 17:41:17.352783 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476" Dec 12 17:41:18.446862 containerd[1916]: time="2025-12-12T17:41:18.446821549Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 12 17:41:19.353066 kubelet[3527]: E1212 17:41:19.352644 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476" Dec 12 17:41:20.606183 containerd[1916]: time="2025-12-12T17:41:20.605691339Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Dec 12 17:41:20.608779 containerd[1916]: time="2025-12-12T17:41:20.608757654Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65925816" Dec 12 17:41:20.611544 containerd[1916]: time="2025-12-12T17:41:20.611515480Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:20.616207 containerd[1916]: time="2025-12-12T17:41:20.615985573Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:20.616389 containerd[1916]: time="2025-12-12T17:41:20.616367640Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.169510346s" Dec 12 17:41:20.616482 containerd[1916]: time="2025-12-12T17:41:20.616467547Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Dec 12 17:41:20.623956 containerd[1916]: time="2025-12-12T17:41:20.623925313Z" level=info msg="CreateContainer within sandbox \"bea02ce28146269d9a594a53e7b5ed032a010077140e6e783a5b7287b441771c\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 12 17:41:20.642842 containerd[1916]: time="2025-12-12T17:41:20.641264139Z" level=info msg="Container 4d5b9c3c65686c9c7c7bc8ee1fd46438ef15e10085bb3b5fd449865a9027e722: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:41:20.659309 containerd[1916]: time="2025-12-12T17:41:20.659192744Z" level=info 
msg="CreateContainer within sandbox \"bea02ce28146269d9a594a53e7b5ed032a010077140e6e783a5b7287b441771c\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"4d5b9c3c65686c9c7c7bc8ee1fd46438ef15e10085bb3b5fd449865a9027e722\"" Dec 12 17:41:20.659956 containerd[1916]: time="2025-12-12T17:41:20.659884148Z" level=info msg="StartContainer for \"4d5b9c3c65686c9c7c7bc8ee1fd46438ef15e10085bb3b5fd449865a9027e722\"" Dec 12 17:41:20.661078 containerd[1916]: time="2025-12-12T17:41:20.661048471Z" level=info msg="connecting to shim 4d5b9c3c65686c9c7c7bc8ee1fd46438ef15e10085bb3b5fd449865a9027e722" address="unix:///run/containerd/s/ffa60f5488f50beb794af8d3aa738da9c5fd9bf52faa1fe4188f831e9a4e5082" protocol=ttrpc version=3 Dec 12 17:41:20.683653 systemd[1]: Started cri-containerd-4d5b9c3c65686c9c7c7bc8ee1fd46438ef15e10085bb3b5fd449865a9027e722.scope - libcontainer container 4d5b9c3c65686c9c7c7bc8ee1fd46438ef15e10085bb3b5fd449865a9027e722. Dec 12 17:41:20.747667 containerd[1916]: time="2025-12-12T17:41:20.747421932Z" level=info msg="StartContainer for \"4d5b9c3c65686c9c7c7bc8ee1fd46438ef15e10085bb3b5fd449865a9027e722\" returns successfully" Dec 12 17:41:21.352843 kubelet[3527]: E1212 17:41:21.352521 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476" Dec 12 17:41:21.880943 containerd[1916]: time="2025-12-12T17:41:21.880485451Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 12 17:41:21.884383 systemd[1]: 
cri-containerd-4d5b9c3c65686c9c7c7bc8ee1fd46438ef15e10085bb3b5fd449865a9027e722.scope: Deactivated successfully. Dec 12 17:41:21.884647 systemd[1]: cri-containerd-4d5b9c3c65686c9c7c7bc8ee1fd46438ef15e10085bb3b5fd449865a9027e722.scope: Consumed 321ms CPU time, 185.6M memory peak, 165.9M written to disk. Dec 12 17:41:21.888526 containerd[1916]: time="2025-12-12T17:41:21.888110550Z" level=info msg="received container exit event container_id:\"4d5b9c3c65686c9c7c7bc8ee1fd46438ef15e10085bb3b5fd449865a9027e722\" id:\"4d5b9c3c65686c9c7c7bc8ee1fd46438ef15e10085bb3b5fd449865a9027e722\" pid:4291 exited_at:{seconds:1765561281 nanos:886540599}" Dec 12 17:41:21.904439 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4d5b9c3c65686c9c7c7bc8ee1fd46438ef15e10085bb3b5fd449865a9027e722-rootfs.mount: Deactivated successfully. Dec 12 17:41:21.945655 kubelet[3527]: I1212 17:41:21.945410 3527 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Dec 12 17:41:22.685129 systemd[1]: Created slice kubepods-burstable-podbb311974_0acf_4583_8d90_53c2a33e8927.slice - libcontainer container kubepods-burstable-podbb311974_0acf_4583_8d90_53c2a33e8927.slice. Dec 12 17:41:22.692036 systemd[1]: Created slice kubepods-besteffort-podaae497cc_3748_48de_b6f7_6585350a2476.slice - libcontainer container kubepods-besteffort-podaae497cc_3748_48de_b6f7_6585350a2476.slice. 
Dec 12 17:41:22.741242 containerd[1916]: time="2025-12-12T17:41:22.741196787Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nhpd8,Uid:aae497cc-3748-48de-b6f7-6585350a2476,Namespace:calico-system,Attempt:0,}" Dec 12 17:41:22.743903 kubelet[3527]: I1212 17:41:22.743870 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb311974-0acf-4583-8d90-53c2a33e8927-config-volume\") pod \"coredns-66bc5c9577-vn5cd\" (UID: \"bb311974-0acf-4583-8d90-53c2a33e8927\") " pod="kube-system/coredns-66bc5c9577-vn5cd" Dec 12 17:41:22.744672 kubelet[3527]: I1212 17:41:22.744010 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6cb9\" (UniqueName: \"kubernetes.io/projected/bb311974-0acf-4583-8d90-53c2a33e8927-kube-api-access-c6cb9\") pod \"coredns-66bc5c9577-vn5cd\" (UID: \"bb311974-0acf-4583-8d90-53c2a33e8927\") " pod="kube-system/coredns-66bc5c9577-vn5cd" Dec 12 17:41:22.758497 systemd[1]: Created slice kubepods-burstable-pod08030ccf_dd27_4a9f_ad22_e1aee80e0ddf.slice - libcontainer container kubepods-burstable-pod08030ccf_dd27_4a9f_ad22_e1aee80e0ddf.slice. Dec 12 17:41:22.774282 systemd[1]: Created slice kubepods-besteffort-pod9d6c6448_7253_4405_b42f_a3327529c933.slice - libcontainer container kubepods-besteffort-pod9d6c6448_7253_4405_b42f_a3327529c933.slice. Dec 12 17:41:22.784665 systemd[1]: Created slice kubepods-besteffort-pod6c556bc6_ca1e_488b_87dc_eb6c6785bf8c.slice - libcontainer container kubepods-besteffort-pod6c556bc6_ca1e_488b_87dc_eb6c6785bf8c.slice. Dec 12 17:41:22.789261 systemd[1]: Created slice kubepods-besteffort-pod65781289_4c97_4182_9b09_b4c93c5b6dd1.slice - libcontainer container kubepods-besteffort-pod65781289_4c97_4182_9b09_b4c93c5b6dd1.slice. 
Dec 12 17:41:22.801726 systemd[1]: Created slice kubepods-besteffort-podcc75188a_1547_4b56_bf96_4cc7e7818e13.slice - libcontainer container kubepods-besteffort-podcc75188a_1547_4b56_bf96_4cc7e7818e13.slice. Dec 12 17:41:22.812935 systemd[1]: Created slice kubepods-besteffort-pode00a2951_72e5_4c7e_b07e_7aa028d3e079.slice - libcontainer container kubepods-besteffort-pode00a2951_72e5_4c7e_b07e_7aa028d3e079.slice. Dec 12 17:41:22.822763 containerd[1916]: time="2025-12-12T17:41:22.822721728Z" level=error msg="Failed to destroy network for sandbox \"7995592dae481eb161c05c5072d786858852060c003f318ba0eb3a27c54c2ab9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:22.825902 systemd[1]: run-netns-cni\x2d93068e4a\x2dba5f\x2d904d\x2d5b41\x2d05344a2edaad.mount: Deactivated successfully. Dec 12 17:41:22.827229 containerd[1916]: time="2025-12-12T17:41:22.827149420Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nhpd8,Uid:aae497cc-3748-48de-b6f7-6585350a2476,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7995592dae481eb161c05c5072d786858852060c003f318ba0eb3a27c54c2ab9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:22.827818 kubelet[3527]: E1212 17:41:22.827523 3527 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7995592dae481eb161c05c5072d786858852060c003f318ba0eb3a27c54c2ab9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:22.827818 kubelet[3527]: E1212 
17:41:22.827583 3527 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7995592dae481eb161c05c5072d786858852060c003f318ba0eb3a27c54c2ab9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-nhpd8" Dec 12 17:41:22.827818 kubelet[3527]: E1212 17:41:22.827598 3527 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7995592dae481eb161c05c5072d786858852060c003f318ba0eb3a27c54c2ab9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-nhpd8" Dec 12 17:41:22.827936 kubelet[3527]: E1212 17:41:22.827642 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-nhpd8_calico-system(aae497cc-3748-48de-b6f7-6585350a2476)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-nhpd8_calico-system(aae497cc-3748-48de-b6f7-6585350a2476)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7995592dae481eb161c05c5072d786858852060c003f318ba0eb3a27c54c2ab9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476" Dec 12 17:41:22.845017 kubelet[3527]: I1212 17:41:22.844815 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7zfw\" (UniqueName: 
\"kubernetes.io/projected/e00a2951-72e5-4c7e-b07e-7aa028d3e079-kube-api-access-n7zfw\") pod \"whisker-7b8694d555-nrd6j\" (UID: \"e00a2951-72e5-4c7e-b07e-7aa028d3e079\") " pod="calico-system/whisker-7b8694d555-nrd6j" Dec 12 17:41:22.845355 kubelet[3527]: I1212 17:41:22.845298 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/65781289-4c97-4182-9b09-b4c93c5b6dd1-calico-apiserver-certs\") pod \"calico-apiserver-5cf495b795-pnxm5\" (UID: \"65781289-4c97-4182-9b09-b4c93c5b6dd1\") " pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" Dec 12 17:41:22.845518 kubelet[3527]: I1212 17:41:22.845485 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6c556bc6-ca1e-488b-87dc-eb6c6785bf8c-calico-apiserver-certs\") pod \"calico-apiserver-5cf495b795-djp4n\" (UID: \"6c556bc6-ca1e-488b-87dc-eb6c6785bf8c\") " pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" Dec 12 17:41:22.845748 kubelet[3527]: I1212 17:41:22.845660 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5nsd\" (UniqueName: \"kubernetes.io/projected/6c556bc6-ca1e-488b-87dc-eb6c6785bf8c-kube-api-access-m5nsd\") pod \"calico-apiserver-5cf495b795-djp4n\" (UID: \"6c556bc6-ca1e-488b-87dc-eb6c6785bf8c\") " pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" Dec 12 17:41:22.845844 kubelet[3527]: I1212 17:41:22.845829 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6c6448-7253-4405-b42f-a3327529c933-config\") pod \"goldmane-7c778bb748-ftsnw\" (UID: \"9d6c6448-7253-4405-b42f-a3327529c933\") " pod="calico-system/goldmane-7c778bb748-ftsnw" Dec 12 17:41:22.845999 kubelet[3527]: I1212 17:41:22.845987 3527 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/9d6c6448-7253-4405-b42f-a3327529c933-goldmane-key-pair\") pod \"goldmane-7c778bb748-ftsnw\" (UID: \"9d6c6448-7253-4405-b42f-a3327529c933\") " pod="calico-system/goldmane-7c778bb748-ftsnw" Dec 12 17:41:22.846090 kubelet[3527]: I1212 17:41:22.846071 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8tpz\" (UniqueName: \"kubernetes.io/projected/cc75188a-1547-4b56-bf96-4cc7e7818e13-kube-api-access-j8tpz\") pod \"calico-kube-controllers-8645b67b4b-hg8jx\" (UID: \"cc75188a-1547-4b56-bf96-4cc7e7818e13\") " pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" Dec 12 17:41:22.846274 kubelet[3527]: I1212 17:41:22.846260 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e00a2951-72e5-4c7e-b07e-7aa028d3e079-whisker-backend-key-pair\") pod \"whisker-7b8694d555-nrd6j\" (UID: \"e00a2951-72e5-4c7e-b07e-7aa028d3e079\") " pod="calico-system/whisker-7b8694d555-nrd6j" Dec 12 17:41:22.846365 kubelet[3527]: I1212 17:41:22.846354 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2bnt\" (UniqueName: \"kubernetes.io/projected/9d6c6448-7253-4405-b42f-a3327529c933-kube-api-access-t2bnt\") pod \"goldmane-7c778bb748-ftsnw\" (UID: \"9d6c6448-7253-4405-b42f-a3327529c933\") " pod="calico-system/goldmane-7c778bb748-ftsnw" Dec 12 17:41:22.846433 kubelet[3527]: I1212 17:41:22.846414 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc75188a-1547-4b56-bf96-4cc7e7818e13-tigera-ca-bundle\") pod \"calico-kube-controllers-8645b67b4b-hg8jx\" (UID: \"cc75188a-1547-4b56-bf96-4cc7e7818e13\") " 
pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" Dec 12 17:41:22.846600 kubelet[3527]: I1212 17:41:22.846547 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d6c6448-7253-4405-b42f-a3327529c933-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-ftsnw\" (UID: \"9d6c6448-7253-4405-b42f-a3327529c933\") " pod="calico-system/goldmane-7c778bb748-ftsnw" Dec 12 17:41:22.847243 kubelet[3527]: I1212 17:41:22.847214 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74j4p\" (UniqueName: \"kubernetes.io/projected/65781289-4c97-4182-9b09-b4c93c5b6dd1-kube-api-access-74j4p\") pod \"calico-apiserver-5cf495b795-pnxm5\" (UID: \"65781289-4c97-4182-9b09-b4c93c5b6dd1\") " pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" Dec 12 17:41:22.847826 kubelet[3527]: I1212 17:41:22.847424 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzpkn\" (UniqueName: \"kubernetes.io/projected/08030ccf-dd27-4a9f-ad22-e1aee80e0ddf-kube-api-access-tzpkn\") pod \"coredns-66bc5c9577-hc27m\" (UID: \"08030ccf-dd27-4a9f-ad22-e1aee80e0ddf\") " pod="kube-system/coredns-66bc5c9577-hc27m" Dec 12 17:41:22.847826 kubelet[3527]: I1212 17:41:22.847453 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e00a2951-72e5-4c7e-b07e-7aa028d3e079-whisker-ca-bundle\") pod \"whisker-7b8694d555-nrd6j\" (UID: \"e00a2951-72e5-4c7e-b07e-7aa028d3e079\") " pod="calico-system/whisker-7b8694d555-nrd6j" Dec 12 17:41:22.847826 kubelet[3527]: I1212 17:41:22.847469 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08030ccf-dd27-4a9f-ad22-e1aee80e0ddf-config-volume\") 
pod \"coredns-66bc5c9577-hc27m\" (UID: \"08030ccf-dd27-4a9f-ad22-e1aee80e0ddf\") " pod="kube-system/coredns-66bc5c9577-hc27m" Dec 12 17:41:22.994952 containerd[1916]: time="2025-12-12T17:41:22.994834751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-vn5cd,Uid:bb311974-0acf-4583-8d90-53c2a33e8927,Namespace:kube-system,Attempt:0,}" Dec 12 17:41:23.033756 containerd[1916]: time="2025-12-12T17:41:23.033700546Z" level=error msg="Failed to destroy network for sandbox \"2416d9301fb0f85d9be3a5f8db84b7c9745d5ec9398e0daf95a235c465534353\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.037137 containerd[1916]: time="2025-12-12T17:41:23.037093446Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-vn5cd,Uid:bb311974-0acf-4583-8d90-53c2a33e8927,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2416d9301fb0f85d9be3a5f8db84b7c9745d5ec9398e0daf95a235c465534353\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.037499 kubelet[3527]: E1212 17:41:23.037428 3527 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2416d9301fb0f85d9be3a5f8db84b7c9745d5ec9398e0daf95a235c465534353\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.037499 kubelet[3527]: E1212 17:41:23.037481 3527 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"2416d9301fb0f85d9be3a5f8db84b7c9745d5ec9398e0daf95a235c465534353\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-vn5cd" Dec 12 17:41:23.037688 kubelet[3527]: E1212 17:41:23.037623 3527 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2416d9301fb0f85d9be3a5f8db84b7c9745d5ec9398e0daf95a235c465534353\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-vn5cd" Dec 12 17:41:23.037767 kubelet[3527]: E1212 17:41:23.037746 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-vn5cd_kube-system(bb311974-0acf-4583-8d90-53c2a33e8927)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-vn5cd_kube-system(bb311974-0acf-4583-8d90-53c2a33e8927)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2416d9301fb0f85d9be3a5f8db84b7c9745d5ec9398e0daf95a235c465534353\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-vn5cd" podUID="bb311974-0acf-4583-8d90-53c2a33e8927" Dec 12 17:41:23.081678 containerd[1916]: time="2025-12-12T17:41:23.081556055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-hc27m,Uid:08030ccf-dd27-4a9f-ad22-e1aee80e0ddf,Namespace:kube-system,Attempt:0,}" Dec 12 17:41:23.092957 containerd[1916]: time="2025-12-12T17:41:23.092656704Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-7c778bb748-ftsnw,Uid:9d6c6448-7253-4405-b42f-a3327529c933,Namespace:calico-system,Attempt:0,}" Dec 12 17:41:23.099566 containerd[1916]: time="2025-12-12T17:41:23.099529301Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cf495b795-djp4n,Uid:6c556bc6-ca1e-488b-87dc-eb6c6785bf8c,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:41:23.106793 containerd[1916]: time="2025-12-12T17:41:23.106756467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cf495b795-pnxm5,Uid:65781289-4c97-4182-9b09-b4c93c5b6dd1,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:41:23.122961 containerd[1916]: time="2025-12-12T17:41:23.122917731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8645b67b4b-hg8jx,Uid:cc75188a-1547-4b56-bf96-4cc7e7818e13,Namespace:calico-system,Attempt:0,}" Dec 12 17:41:23.132540 containerd[1916]: time="2025-12-12T17:41:23.132492191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b8694d555-nrd6j,Uid:e00a2951-72e5-4c7e-b07e-7aa028d3e079,Namespace:calico-system,Attempt:0,}" Dec 12 17:41:23.159151 containerd[1916]: time="2025-12-12T17:41:23.159077845Z" level=error msg="Failed to destroy network for sandbox \"e2a8c7e897eae7a88f15fc65f653a4981658306c58d60cb71e13633569090797\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.160366 containerd[1916]: time="2025-12-12T17:41:23.160340858Z" level=error msg="Failed to destroy network for sandbox \"9960b63382de319cbe607f3115ae256d4c9a79cfd9f65e57b0f557ae70a65950\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.171687 containerd[1916]: time="2025-12-12T17:41:23.171596993Z" 
level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-ftsnw,Uid:9d6c6448-7253-4405-b42f-a3327529c933,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2a8c7e897eae7a88f15fc65f653a4981658306c58d60cb71e13633569090797\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.172478 kubelet[3527]: E1212 17:41:23.172362 3527 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2a8c7e897eae7a88f15fc65f653a4981658306c58d60cb71e13633569090797\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.172478 kubelet[3527]: E1212 17:41:23.172426 3527 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2a8c7e897eae7a88f15fc65f653a4981658306c58d60cb71e13633569090797\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-ftsnw" Dec 12 17:41:23.172478 kubelet[3527]: E1212 17:41:23.172442 3527 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2a8c7e897eae7a88f15fc65f653a4981658306c58d60cb71e13633569090797\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-ftsnw" Dec 12 17:41:23.172778 kubelet[3527]: E1212 17:41:23.172653 3527 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-ftsnw_calico-system(9d6c6448-7253-4405-b42f-a3327529c933)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-ftsnw_calico-system(9d6c6448-7253-4405-b42f-a3327529c933)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e2a8c7e897eae7a88f15fc65f653a4981658306c58d60cb71e13633569090797\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-ftsnw" podUID="9d6c6448-7253-4405-b42f-a3327529c933" Dec 12 17:41:23.175215 containerd[1916]: time="2025-12-12T17:41:23.175163051Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-hc27m,Uid:08030ccf-dd27-4a9f-ad22-e1aee80e0ddf,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9960b63382de319cbe607f3115ae256d4c9a79cfd9f65e57b0f557ae70a65950\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.175685 kubelet[3527]: E1212 17:41:23.175550 3527 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9960b63382de319cbe607f3115ae256d4c9a79cfd9f65e57b0f557ae70a65950\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.175685 kubelet[3527]: E1212 17:41:23.175592 3527 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"9960b63382de319cbe607f3115ae256d4c9a79cfd9f65e57b0f557ae70a65950\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-hc27m" Dec 12 17:41:23.175685 kubelet[3527]: E1212 17:41:23.175606 3527 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9960b63382de319cbe607f3115ae256d4c9a79cfd9f65e57b0f557ae70a65950\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-hc27m" Dec 12 17:41:23.175800 kubelet[3527]: E1212 17:41:23.175655 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-hc27m_kube-system(08030ccf-dd27-4a9f-ad22-e1aee80e0ddf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-hc27m_kube-system(08030ccf-dd27-4a9f-ad22-e1aee80e0ddf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9960b63382de319cbe607f3115ae256d4c9a79cfd9f65e57b0f557ae70a65950\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-hc27m" podUID="08030ccf-dd27-4a9f-ad22-e1aee80e0ddf" Dec 12 17:41:23.221535 containerd[1916]: time="2025-12-12T17:41:23.221448969Z" level=error msg="Failed to destroy network for sandbox \"e88cf0e8b816cf1fa446be66c764ffec6e85da43c11e04f84775adf6d08ecea7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.225663 
containerd[1916]: time="2025-12-12T17:41:23.225618517Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cf495b795-djp4n,Uid:6c556bc6-ca1e-488b-87dc-eb6c6785bf8c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e88cf0e8b816cf1fa446be66c764ffec6e85da43c11e04f84775adf6d08ecea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.226216 kubelet[3527]: E1212 17:41:23.226121 3527 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e88cf0e8b816cf1fa446be66c764ffec6e85da43c11e04f84775adf6d08ecea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.226321 kubelet[3527]: E1212 17:41:23.226295 3527 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e88cf0e8b816cf1fa446be66c764ffec6e85da43c11e04f84775adf6d08ecea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" Dec 12 17:41:23.226354 kubelet[3527]: E1212 17:41:23.226325 3527 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e88cf0e8b816cf1fa446be66c764ffec6e85da43c11e04f84775adf6d08ecea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" Dec 12 17:41:23.226406 kubelet[3527]: E1212 17:41:23.226382 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5cf495b795-djp4n_calico-apiserver(6c556bc6-ca1e-488b-87dc-eb6c6785bf8c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5cf495b795-djp4n_calico-apiserver(6c556bc6-ca1e-488b-87dc-eb6c6785bf8c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e88cf0e8b816cf1fa446be66c764ffec6e85da43c11e04f84775adf6d08ecea7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c" Dec 12 17:41:23.235923 containerd[1916]: time="2025-12-12T17:41:23.235884654Z" level=error msg="Failed to destroy network for sandbox \"034e7fd9ee9be473a4d6b98ef66281a093fe625b904e5e550265819202375cd2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.239968 containerd[1916]: time="2025-12-12T17:41:23.239875508Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7b8694d555-nrd6j,Uid:e00a2951-72e5-4c7e-b07e-7aa028d3e079,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"034e7fd9ee9be473a4d6b98ef66281a093fe625b904e5e550265819202375cd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.240195 kubelet[3527]: E1212 17:41:23.240085 3527 log.go:32] "RunPodSandbox from runtime service failed" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"034e7fd9ee9be473a4d6b98ef66281a093fe625b904e5e550265819202375cd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.240195 kubelet[3527]: E1212 17:41:23.240144 3527 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"034e7fd9ee9be473a4d6b98ef66281a093fe625b904e5e550265819202375cd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7b8694d555-nrd6j" Dec 12 17:41:23.240195 kubelet[3527]: E1212 17:41:23.240157 3527 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"034e7fd9ee9be473a4d6b98ef66281a093fe625b904e5e550265819202375cd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7b8694d555-nrd6j" Dec 12 17:41:23.240376 kubelet[3527]: E1212 17:41:23.240198 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7b8694d555-nrd6j_calico-system(e00a2951-72e5-4c7e-b07e-7aa028d3e079)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7b8694d555-nrd6j_calico-system(e00a2951-72e5-4c7e-b07e-7aa028d3e079)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"034e7fd9ee9be473a4d6b98ef66281a093fe625b904e5e550265819202375cd2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/whisker-7b8694d555-nrd6j" podUID="e00a2951-72e5-4c7e-b07e-7aa028d3e079" Dec 12 17:41:23.249996 containerd[1916]: time="2025-12-12T17:41:23.249631326Z" level=error msg="Failed to destroy network for sandbox \"b13c37ff730fec274886aa3a7ca8d1146c4e31507fd2bc0b2dfdb0bd6252a654\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.252079 containerd[1916]: time="2025-12-12T17:41:23.252047446Z" level=error msg="Failed to destroy network for sandbox \"38e59909964566e929e3344c43bf95d9b752a78afffc165f5eac1a61fa1aad8d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.253477 containerd[1916]: time="2025-12-12T17:41:23.253445175Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cf495b795-pnxm5,Uid:65781289-4c97-4182-9b09-b4c93c5b6dd1,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b13c37ff730fec274886aa3a7ca8d1146c4e31507fd2bc0b2dfdb0bd6252a654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.253731 kubelet[3527]: E1212 17:41:23.253652 3527 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b13c37ff730fec274886aa3a7ca8d1146c4e31507fd2bc0b2dfdb0bd6252a654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.253731 kubelet[3527]: E1212 17:41:23.253707 3527 
kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b13c37ff730fec274886aa3a7ca8d1146c4e31507fd2bc0b2dfdb0bd6252a654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" Dec 12 17:41:23.253731 kubelet[3527]: E1212 17:41:23.253723 3527 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b13c37ff730fec274886aa3a7ca8d1146c4e31507fd2bc0b2dfdb0bd6252a654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" Dec 12 17:41:23.253830 kubelet[3527]: E1212 17:41:23.253785 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5cf495b795-pnxm5_calico-apiserver(65781289-4c97-4182-9b09-b4c93c5b6dd1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5cf495b795-pnxm5_calico-apiserver(65781289-4c97-4182-9b09-b4c93c5b6dd1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b13c37ff730fec274886aa3a7ca8d1146c4e31507fd2bc0b2dfdb0bd6252a654\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" podUID="65781289-4c97-4182-9b09-b4c93c5b6dd1" Dec 12 17:41:23.257025 containerd[1916]: time="2025-12-12T17:41:23.256930111Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-8645b67b4b-hg8jx,Uid:cc75188a-1547-4b56-bf96-4cc7e7818e13,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"38e59909964566e929e3344c43bf95d9b752a78afffc165f5eac1a61fa1aad8d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.257235 kubelet[3527]: E1212 17:41:23.257205 3527 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38e59909964566e929e3344c43bf95d9b752a78afffc165f5eac1a61fa1aad8d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:41:23.257374 kubelet[3527]: E1212 17:41:23.257317 3527 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38e59909964566e929e3344c43bf95d9b752a78afffc165f5eac1a61fa1aad8d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" Dec 12 17:41:23.257374 kubelet[3527]: E1212 17:41:23.257334 3527 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38e59909964566e929e3344c43bf95d9b752a78afffc165f5eac1a61fa1aad8d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" Dec 12 17:41:23.257480 kubelet[3527]: E1212 
17:41:23.257460 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8645b67b4b-hg8jx_calico-system(cc75188a-1547-4b56-bf96-4cc7e7818e13)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8645b67b4b-hg8jx_calico-system(cc75188a-1547-4b56-bf96-4cc7e7818e13)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"38e59909964566e929e3344c43bf95d9b752a78afffc165f5eac1a61fa1aad8d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" podUID="cc75188a-1547-4b56-bf96-4cc7e7818e13" Dec 12 17:41:23.462601 containerd[1916]: time="2025-12-12T17:41:23.462548185Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 12 17:41:27.087159 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4179531435.mount: Deactivated successfully. 
Dec 12 17:41:27.986768 containerd[1916]: time="2025-12-12T17:41:27.986634795Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:28.034525 containerd[1916]: time="2025-12-12T17:41:28.034376597Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150934562" Dec 12 17:41:28.039422 containerd[1916]: time="2025-12-12T17:41:28.039280965Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:28.082550 containerd[1916]: time="2025-12-12T17:41:28.081968763Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:41:28.082928 containerd[1916]: time="2025-12-12T17:41:28.082145136Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 4.619376793s" Dec 12 17:41:28.083157 containerd[1916]: time="2025-12-12T17:41:28.083141838Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 12 17:41:28.339296 containerd[1916]: time="2025-12-12T17:41:28.339179514Z" level=info msg="CreateContainer within sandbox \"bea02ce28146269d9a594a53e7b5ed032a010077140e6e783a5b7287b441771c\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 12 17:41:28.359865 containerd[1916]: time="2025-12-12T17:41:28.359803093Z" level=info msg="Container 
a48585459b6e19337bc1941e647fd531e34207deeca65c3fdd5f01cb739e52f1: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:41:28.362030 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1389135547.mount: Deactivated successfully. Dec 12 17:41:28.385027 containerd[1916]: time="2025-12-12T17:41:28.384980132Z" level=info msg="CreateContainer within sandbox \"bea02ce28146269d9a594a53e7b5ed032a010077140e6e783a5b7287b441771c\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"a48585459b6e19337bc1941e647fd531e34207deeca65c3fdd5f01cb739e52f1\"" Dec 12 17:41:28.387668 containerd[1916]: time="2025-12-12T17:41:28.386543898Z" level=info msg="StartContainer for \"a48585459b6e19337bc1941e647fd531e34207deeca65c3fdd5f01cb739e52f1\"" Dec 12 17:41:28.387977 containerd[1916]: time="2025-12-12T17:41:28.387958219Z" level=info msg="connecting to shim a48585459b6e19337bc1941e647fd531e34207deeca65c3fdd5f01cb739e52f1" address="unix:///run/containerd/s/ffa60f5488f50beb794af8d3aa738da9c5fd9bf52faa1fe4188f831e9a4e5082" protocol=ttrpc version=3 Dec 12 17:41:28.412661 systemd[1]: Started cri-containerd-a48585459b6e19337bc1941e647fd531e34207deeca65c3fdd5f01cb739e52f1.scope - libcontainer container a48585459b6e19337bc1941e647fd531e34207deeca65c3fdd5f01cb739e52f1. Dec 12 17:41:28.493819 containerd[1916]: time="2025-12-12T17:41:28.493676794Z" level=info msg="StartContainer for \"a48585459b6e19337bc1941e647fd531e34207deeca65c3fdd5f01cb739e52f1\" returns successfully" Dec 12 17:41:28.641338 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 12 17:41:28.641481 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved.
Dec 12 17:41:28.882461 kubelet[3527]: I1212 17:41:28.882386 3527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00a2951-72e5-4c7e-b07e-7aa028d3e079-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "e00a2951-72e5-4c7e-b07e-7aa028d3e079" (UID: "e00a2951-72e5-4c7e-b07e-7aa028d3e079"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 12 17:41:28.883112 kubelet[3527]: I1212 17:41:28.882876 3527 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e00a2951-72e5-4c7e-b07e-7aa028d3e079-whisker-ca-bundle\") pod \"e00a2951-72e5-4c7e-b07e-7aa028d3e079\" (UID: \"e00a2951-72e5-4c7e-b07e-7aa028d3e079\") " Dec 12 17:41:28.883112 kubelet[3527]: I1212 17:41:28.882945 3527 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e00a2951-72e5-4c7e-b07e-7aa028d3e079-whisker-backend-key-pair\") pod \"e00a2951-72e5-4c7e-b07e-7aa028d3e079\" (UID: \"e00a2951-72e5-4c7e-b07e-7aa028d3e079\") " Dec 12 17:41:28.883112 kubelet[3527]: I1212 17:41:28.882962 3527 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7zfw\" (UniqueName: \"kubernetes.io/projected/e00a2951-72e5-4c7e-b07e-7aa028d3e079-kube-api-access-n7zfw\") pod \"e00a2951-72e5-4c7e-b07e-7aa028d3e079\" (UID: \"e00a2951-72e5-4c7e-b07e-7aa028d3e079\") " Dec 12 17:41:28.883112 kubelet[3527]: I1212 17:41:28.883018 3527 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e00a2951-72e5-4c7e-b07e-7aa028d3e079-whisker-ca-bundle\") on node \"ci-4459.2.2-a-9f5170e2ca\" DevicePath \"\"" Dec 12 17:41:28.886113 kubelet[3527]: I1212 17:41:28.886073 3527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e00a2951-72e5-4c7e-b07e-7aa028d3e079-kube-api-access-n7zfw" (OuterVolumeSpecName: "kube-api-access-n7zfw") pod "e00a2951-72e5-4c7e-b07e-7aa028d3e079" (UID: "e00a2951-72e5-4c7e-b07e-7aa028d3e079"). InnerVolumeSpecName "kube-api-access-n7zfw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 12 17:41:28.886487 kubelet[3527]: I1212 17:41:28.886372 3527 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e00a2951-72e5-4c7e-b07e-7aa028d3e079-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "e00a2951-72e5-4c7e-b07e-7aa028d3e079" (UID: "e00a2951-72e5-4c7e-b07e-7aa028d3e079"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 12 17:41:28.984107 kubelet[3527]: I1212 17:41:28.984019 3527 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e00a2951-72e5-4c7e-b07e-7aa028d3e079-whisker-backend-key-pair\") on node \"ci-4459.2.2-a-9f5170e2ca\" DevicePath \"\"" Dec 12 17:41:28.984107 kubelet[3527]: I1212 17:41:28.984051 3527 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n7zfw\" (UniqueName: \"kubernetes.io/projected/e00a2951-72e5-4c7e-b07e-7aa028d3e079-kube-api-access-n7zfw\") on node \"ci-4459.2.2-a-9f5170e2ca\" DevicePath \"\"" Dec 12 17:41:29.286391 systemd[1]: var-lib-kubelet-pods-e00a2951\x2d72e5\x2d4c7e\x2db07e\x2d7aa028d3e079-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dn7zfw.mount: Deactivated successfully. Dec 12 17:41:29.286492 systemd[1]: var-lib-kubelet-pods-e00a2951\x2d72e5\x2d4c7e\x2db07e\x2d7aa028d3e079-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Dec 12 17:41:29.357846 systemd[1]: Removed slice kubepods-besteffort-pode00a2951_72e5_4c7e_b07e_7aa028d3e079.slice - libcontainer container kubepods-besteffort-pode00a2951_72e5_4c7e_b07e_7aa028d3e079.slice. Dec 12 17:41:29.509427 kubelet[3527]: I1212 17:41:29.508765 3527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-9vh7g" podStartSLOduration=2.185720633 podStartE2EDuration="16.508750437s" podCreationTimestamp="2025-12-12 17:41:13 +0000 UTC" firstStartedPulling="2025-12-12 17:41:13.760846919 +0000 UTC m=+22.499077884" lastFinishedPulling="2025-12-12 17:41:28.083876715 +0000 UTC m=+36.822107688" observedRunningTime="2025-12-12 17:41:29.508494694 +0000 UTC m=+38.246725667" watchObservedRunningTime="2025-12-12 17:41:29.508750437 +0000 UTC m=+38.246981402" Dec 12 17:41:29.583322 systemd[1]: Created slice kubepods-besteffort-podea69765d_2386_4a5a_bcbf_e5190366c632.slice - libcontainer container kubepods-besteffort-podea69765d_2386_4a5a_bcbf_e5190366c632.slice. 
Dec 12 17:41:29.590194 kubelet[3527]: I1212 17:41:29.589756 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22vql\" (UniqueName: \"kubernetes.io/projected/ea69765d-2386-4a5a-bcbf-e5190366c632-kube-api-access-22vql\") pod \"whisker-f998dd77f-h9sx8\" (UID: \"ea69765d-2386-4a5a-bcbf-e5190366c632\") " pod="calico-system/whisker-f998dd77f-h9sx8" Dec 12 17:41:29.590194 kubelet[3527]: I1212 17:41:29.589976 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ea69765d-2386-4a5a-bcbf-e5190366c632-whisker-backend-key-pair\") pod \"whisker-f998dd77f-h9sx8\" (UID: \"ea69765d-2386-4a5a-bcbf-e5190366c632\") " pod="calico-system/whisker-f998dd77f-h9sx8" Dec 12 17:41:29.590194 kubelet[3527]: I1212 17:41:29.589994 3527 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea69765d-2386-4a5a-bcbf-e5190366c632-whisker-ca-bundle\") pod \"whisker-f998dd77f-h9sx8\" (UID: \"ea69765d-2386-4a5a-bcbf-e5190366c632\") " pod="calico-system/whisker-f998dd77f-h9sx8" Dec 12 17:41:29.916930 containerd[1916]: time="2025-12-12T17:41:29.916894956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f998dd77f-h9sx8,Uid:ea69765d-2386-4a5a-bcbf-e5190366c632,Namespace:calico-system,Attempt:0,}" Dec 12 17:41:30.055478 systemd-networkd[1495]: cali02aa591e074: Link UP Dec 12 17:41:30.060835 systemd-networkd[1495]: cali02aa591e074: Gained carrier Dec 12 17:41:30.077736 containerd[1916]: 2025-12-12 17:41:29.948 [INFO][4614] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 17:41:30.077736 containerd[1916]: 2025-12-12 17:41:29.979 [INFO][4614] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-eth0 whisker-f998dd77f- calico-system ea69765d-2386-4a5a-bcbf-e5190366c632 874 0 2025-12-12 17:41:29 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:f998dd77f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459.2.2-a-9f5170e2ca whisker-f998dd77f-h9sx8 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali02aa591e074 [] [] }} ContainerID="eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" Namespace="calico-system" Pod="whisker-f998dd77f-h9sx8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-" Dec 12 17:41:30.077736 containerd[1916]: 2025-12-12 17:41:29.979 [INFO][4614] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" Namespace="calico-system" Pod="whisker-f998dd77f-h9sx8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-eth0" Dec 12 17:41:30.077736 containerd[1916]: 2025-12-12 17:41:30.009 [INFO][4648] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" HandleID="k8s-pod-network.eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-eth0" Dec 12 17:41:30.078425 containerd[1916]: 2025-12-12 17:41:30.009 [INFO][4648] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" HandleID="k8s-pod-network.eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024afa0), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4459.2.2-a-9f5170e2ca", "pod":"whisker-f998dd77f-h9sx8", "timestamp":"2025-12-12 17:41:30.009141386 +0000 UTC"}, Hostname:"ci-4459.2.2-a-9f5170e2ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:41:30.078425 containerd[1916]: 2025-12-12 17:41:30.009 [INFO][4648] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:41:30.078425 containerd[1916]: 2025-12-12 17:41:30.009 [INFO][4648] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:41:30.078425 containerd[1916]: 2025-12-12 17:41:30.009 [INFO][4648] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-9f5170e2ca' Dec 12 17:41:30.078425 containerd[1916]: 2025-12-12 17:41:30.015 [INFO][4648] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:30.078425 containerd[1916]: 2025-12-12 17:41:30.020 [INFO][4648] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:30.078425 containerd[1916]: 2025-12-12 17:41:30.025 [INFO][4648] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:30.078425 containerd[1916]: 2025-12-12 17:41:30.027 [INFO][4648] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:30.078425 containerd[1916]: 2025-12-12 17:41:30.029 [INFO][4648] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:30.079854 containerd[1916]: 2025-12-12 17:41:30.029 [INFO][4648] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 
handle="k8s-pod-network.eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:30.079854 containerd[1916]: 2025-12-12 17:41:30.033 [INFO][4648] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e Dec 12 17:41:30.079854 containerd[1916]: 2025-12-12 17:41:30.039 [INFO][4648] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:30.079854 containerd[1916]: 2025-12-12 17:41:30.044 [INFO][4648] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.100.65/26] block=192.168.100.64/26 handle="k8s-pod-network.eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:30.079854 containerd[1916]: 2025-12-12 17:41:30.044 [INFO][4648] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.65/26] handle="k8s-pod-network.eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:30.079854 containerd[1916]: 2025-12-12 17:41:30.044 [INFO][4648] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 17:41:30.079854 containerd[1916]: 2025-12-12 17:41:30.044 [INFO][4648] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.65/26] IPv6=[] ContainerID="eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" HandleID="k8s-pod-network.eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-eth0" Dec 12 17:41:30.079952 containerd[1916]: 2025-12-12 17:41:30.048 [INFO][4614] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" Namespace="calico-system" Pod="whisker-f998dd77f-h9sx8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-eth0", GenerateName:"whisker-f998dd77f-", Namespace:"calico-system", SelfLink:"", UID:"ea69765d-2386-4a5a-bcbf-e5190366c632", ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 41, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"f998dd77f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"", Pod:"whisker-f998dd77f-h9sx8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.100.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali02aa591e074", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:41:30.079952 containerd[1916]: 2025-12-12 17:41:30.050 [INFO][4614] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.65/32] ContainerID="eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" Namespace="calico-system" Pod="whisker-f998dd77f-h9sx8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-eth0" Dec 12 17:41:30.080011 containerd[1916]: 2025-12-12 17:41:30.050 [INFO][4614] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali02aa591e074 ContainerID="eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" Namespace="calico-system" Pod="whisker-f998dd77f-h9sx8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-eth0" Dec 12 17:41:30.080011 containerd[1916]: 2025-12-12 17:41:30.055 [INFO][4614] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" Namespace="calico-system" Pod="whisker-f998dd77f-h9sx8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-eth0" Dec 12 17:41:30.080038 containerd[1916]: 2025-12-12 17:41:30.058 [INFO][4614] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" Namespace="calico-system" Pod="whisker-f998dd77f-h9sx8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-eth0", GenerateName:"whisker-f998dd77f-", Namespace:"calico-system", SelfLink:"", UID:"ea69765d-2386-4a5a-bcbf-e5190366c632", 
ResourceVersion:"874", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 41, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"f998dd77f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e", Pod:"whisker-f998dd77f-h9sx8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.100.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali02aa591e074", MAC:"aa:78:5c:bd:48:75", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:41:30.080071 containerd[1916]: 2025-12-12 17:41:30.073 [INFO][4614] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" Namespace="calico-system" Pod="whisker-f998dd77f-h9sx8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-whisker--f998dd77f--h9sx8-eth0" Dec 12 17:41:30.129151 containerd[1916]: time="2025-12-12T17:41:30.128871306Z" level=info msg="connecting to shim eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e" address="unix:///run/containerd/s/c424d5c11b58d8e3c0a627264167e27febfa9d0073bc96b8dd9f4d70862bd89c" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:41:30.165670 systemd[1]: Started cri-containerd-eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e.scope - libcontainer 
container eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e. Dec 12 17:41:30.295043 containerd[1916]: time="2025-12-12T17:41:30.294981901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-f998dd77f-h9sx8,Uid:ea69765d-2386-4a5a-bcbf-e5190366c632,Namespace:calico-system,Attempt:0,} returns sandbox id \"eacd5e5cd173654ef6d3850e7921c4007d630f9038ef44693ee69797a64b021e\"" Dec 12 17:41:30.296815 containerd[1916]: time="2025-12-12T17:41:30.296786962Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:41:30.562758 containerd[1916]: time="2025-12-12T17:41:30.562573419Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:30.565988 containerd[1916]: time="2025-12-12T17:41:30.565908084Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:41:30.565988 containerd[1916]: time="2025-12-12T17:41:30.565948702Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:41:30.570346 kubelet[3527]: E1212 17:41:30.570260 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:41:30.570346 kubelet[3527]: E1212 17:41:30.570322 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:41:30.572399 kubelet[3527]: E1212 17:41:30.572355 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-f998dd77f-h9sx8_calico-system(ea69765d-2386-4a5a-bcbf-e5190366c632): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:30.573348 containerd[1916]: time="2025-12-12T17:41:30.573150040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:41:30.822591 containerd[1916]: time="2025-12-12T17:41:30.822456057Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:30.825907 containerd[1916]: time="2025-12-12T17:41:30.825801715Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:41:30.825907 containerd[1916]: time="2025-12-12T17:41:30.825867004Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:41:30.826156 kubelet[3527]: E1212 17:41:30.826111 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:41:30.826261 
kubelet[3527]: E1212 17:41:30.826162 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:41:30.826261 kubelet[3527]: E1212 17:41:30.826242 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-f998dd77f-h9sx8_calico-system(ea69765d-2386-4a5a-bcbf-e5190366c632): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:30.826328 kubelet[3527]: E1212 17:41:30.826285 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f998dd77f-h9sx8" podUID="ea69765d-2386-4a5a-bcbf-e5190366c632" Dec 12 17:41:31.141672 systemd-networkd[1495]: cali02aa591e074: Gained IPv6LL Dec 12 17:41:31.356920 kubelet[3527]: I1212 17:41:31.356767 3527 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="e00a2951-72e5-4c7e-b07e-7aa028d3e079" path="/var/lib/kubelet/pods/e00a2951-72e5-4c7e-b07e-7aa028d3e079/volumes" Dec 12 17:41:31.494729 kubelet[3527]: E1212 17:41:31.494622 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f998dd77f-h9sx8" podUID="ea69765d-2386-4a5a-bcbf-e5190366c632" Dec 12 17:41:31.498195 kubelet[3527]: I1212 17:41:31.497882 3527 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 17:41:32.684673 systemd-networkd[1495]: vxlan.calico: Link UP Dec 12 17:41:32.684680 systemd-networkd[1495]: vxlan.calico: Gained carrier Dec 12 17:41:33.765678 systemd-networkd[1495]: vxlan.calico: Gained IPv6LL Dec 12 17:41:34.356781 kubelet[3527]: I1212 17:41:34.356738 3527 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 17:41:34.358955 containerd[1916]: time="2025-12-12T17:41:34.358705988Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8645b67b4b-hg8jx,Uid:cc75188a-1547-4b56-bf96-4cc7e7818e13,Namespace:calico-system,Attempt:0,}" Dec 12 17:41:34.363676 containerd[1916]: 
time="2025-12-12T17:41:34.363624235Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-hc27m,Uid:08030ccf-dd27-4a9f-ad22-e1aee80e0ddf,Namespace:kube-system,Attempt:0,}" Dec 12 17:41:34.529274 systemd-networkd[1495]: calie067502be96: Link UP Dec 12 17:41:34.529430 systemd-networkd[1495]: calie067502be96: Gained carrier Dec 12 17:41:34.548468 containerd[1916]: 2025-12-12 17:41:34.431 [INFO][4949] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-eth0 calico-kube-controllers-8645b67b4b- calico-system cc75188a-1547-4b56-bf96-4cc7e7818e13 810 0 2025-12-12 17:41:13 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8645b67b4b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459.2.2-a-9f5170e2ca calico-kube-controllers-8645b67b4b-hg8jx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calie067502be96 [] [] }} ContainerID="7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" Namespace="calico-system" Pod="calico-kube-controllers-8645b67b4b-hg8jx" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-" Dec 12 17:41:34.548468 containerd[1916]: 2025-12-12 17:41:34.431 [INFO][4949] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" Namespace="calico-system" Pod="calico-kube-controllers-8645b67b4b-hg8jx" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-eth0" Dec 12 17:41:34.548468 containerd[1916]: 2025-12-12 17:41:34.468 [INFO][4990] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" HandleID="k8s-pod-network.7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-eth0" Dec 12 17:41:34.548677 containerd[1916]: 2025-12-12 17:41:34.468 [INFO][4990] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" HandleID="k8s-pod-network.7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b060), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.2-a-9f5170e2ca", "pod":"calico-kube-controllers-8645b67b4b-hg8jx", "timestamp":"2025-12-12 17:41:34.468056973 +0000 UTC"}, Hostname:"ci-4459.2.2-a-9f5170e2ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:41:34.548677 containerd[1916]: 2025-12-12 17:41:34.468 [INFO][4990] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:41:34.548677 containerd[1916]: 2025-12-12 17:41:34.468 [INFO][4990] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:41:34.548677 containerd[1916]: 2025-12-12 17:41:34.468 [INFO][4990] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-9f5170e2ca' Dec 12 17:41:34.548677 containerd[1916]: 2025-12-12 17:41:34.477 [INFO][4990] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:34.548677 containerd[1916]: 2025-12-12 17:41:34.484 [INFO][4990] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:34.548677 containerd[1916]: 2025-12-12 17:41:34.501 [INFO][4990] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:34.548677 containerd[1916]: 2025-12-12 17:41:34.503 [INFO][4990] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:34.548677 containerd[1916]: 2025-12-12 17:41:34.505 [INFO][4990] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:34.548815 containerd[1916]: 2025-12-12 17:41:34.506 [INFO][4990] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:34.548815 containerd[1916]: 2025-12-12 17:41:34.507 [INFO][4990] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905 Dec 12 17:41:34.548815 containerd[1916]: 2025-12-12 17:41:34.516 [INFO][4990] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:34.548815 containerd[1916]: 2025-12-12 17:41:34.522 [INFO][4990] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.100.66/26] block=192.168.100.64/26 handle="k8s-pod-network.7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:34.548815 containerd[1916]: 2025-12-12 17:41:34.522 [INFO][4990] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.66/26] handle="k8s-pod-network.7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:34.548815 containerd[1916]: 2025-12-12 17:41:34.522 [INFO][4990] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:41:34.548815 containerd[1916]: 2025-12-12 17:41:34.522 [INFO][4990] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.66/26] IPv6=[] ContainerID="7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" HandleID="k8s-pod-network.7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-eth0" Dec 12 17:41:34.548906 containerd[1916]: 2025-12-12 17:41:34.524 [INFO][4949] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" Namespace="calico-system" Pod="calico-kube-controllers-8645b67b4b-hg8jx" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-eth0", GenerateName:"calico-kube-controllers-8645b67b4b-", Namespace:"calico-system", SelfLink:"", UID:"cc75188a-1547-4b56-bf96-4cc7e7818e13", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 41, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8645b67b4b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"", Pod:"calico-kube-controllers-8645b67b4b-hg8jx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.100.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie067502be96", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:41:34.548939 containerd[1916]: 2025-12-12 17:41:34.525 [INFO][4949] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.66/32] ContainerID="7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" Namespace="calico-system" Pod="calico-kube-controllers-8645b67b4b-hg8jx" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-eth0" Dec 12 17:41:34.548939 containerd[1916]: 2025-12-12 17:41:34.525 [INFO][4949] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie067502be96 ContainerID="7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" Namespace="calico-system" Pod="calico-kube-controllers-8645b67b4b-hg8jx" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-eth0" Dec 12 17:41:34.548939 containerd[1916]: 2025-12-12 17:41:34.529 [INFO][4949] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" Namespace="calico-system" Pod="calico-kube-controllers-8645b67b4b-hg8jx" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-eth0" Dec 12 17:41:34.548982 containerd[1916]: 2025-12-12 17:41:34.530 [INFO][4949] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" Namespace="calico-system" Pod="calico-kube-controllers-8645b67b4b-hg8jx" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-eth0", GenerateName:"calico-kube-controllers-8645b67b4b-", Namespace:"calico-system", SelfLink:"", UID:"cc75188a-1547-4b56-bf96-4cc7e7818e13", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 41, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8645b67b4b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905", Pod:"calico-kube-controllers-8645b67b4b-hg8jx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.100.66/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie067502be96", MAC:"d2:20:fe:38:7d:3a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:41:34.549018 containerd[1916]: 2025-12-12 17:41:34.543 [INFO][4949] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" Namespace="calico-system" Pod="calico-kube-controllers-8645b67b4b-hg8jx" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--kube--controllers--8645b67b4b--hg8jx-eth0" Dec 12 17:41:34.598748 containerd[1916]: time="2025-12-12T17:41:34.598624106Z" level=info msg="connecting to shim 7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905" address="unix:///run/containerd/s/3c267c3ebcae2a158c6305f32d3c1af472457fcb12c94fb78c8b08c882616962" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:41:34.627635 systemd[1]: Started cri-containerd-7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905.scope - libcontainer container 7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905. 
Dec 12 17:41:34.635250 systemd-networkd[1495]: calib3580464743: Link UP Dec 12 17:41:34.637491 systemd-networkd[1495]: calib3580464743: Gained carrier Dec 12 17:41:34.654484 containerd[1916]: 2025-12-12 17:41:34.419 [INFO][4958] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-eth0 coredns-66bc5c9577- kube-system 08030ccf-dd27-4a9f-ad22-e1aee80e0ddf 805 0 2025-12-12 17:40:57 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.2-a-9f5170e2ca coredns-66bc5c9577-hc27m eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib3580464743 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" Namespace="kube-system" Pod="coredns-66bc5c9577-hc27m" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-" Dec 12 17:41:34.654484 containerd[1916]: 2025-12-12 17:41:34.419 [INFO][4958] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" Namespace="kube-system" Pod="coredns-66bc5c9577-hc27m" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-eth0" Dec 12 17:41:34.654484 containerd[1916]: 2025-12-12 17:41:34.468 [INFO][4983] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" HandleID="k8s-pod-network.65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-eth0" Dec 12 17:41:34.654674 containerd[1916]: 2025-12-12 17:41:34.468 [INFO][4983] ipam/ipam_plugin.go 
275: Auto assigning IP ContainerID="65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" HandleID="k8s-pod-network.65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c95b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.2-a-9f5170e2ca", "pod":"coredns-66bc5c9577-hc27m", "timestamp":"2025-12-12 17:41:34.46816784 +0000 UTC"}, Hostname:"ci-4459.2.2-a-9f5170e2ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:41:34.654674 containerd[1916]: 2025-12-12 17:41:34.469 [INFO][4983] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:41:34.654674 containerd[1916]: 2025-12-12 17:41:34.522 [INFO][4983] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:41:34.654674 containerd[1916]: 2025-12-12 17:41:34.523 [INFO][4983] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-9f5170e2ca'
Dec 12 17:41:34.654674 containerd[1916]: 2025-12-12 17:41:34.577 [INFO][4983] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:34.654674 containerd[1916]: 2025-12-12 17:41:34.584 [INFO][4983] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:34.654674 containerd[1916]: 2025-12-12 17:41:34.598 [INFO][4983] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:34.654674 containerd[1916]: 2025-12-12 17:41:34.601 [INFO][4983] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:34.654674 containerd[1916]: 2025-12-12 17:41:34.603 [INFO][4983] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:34.654823 containerd[1916]: 2025-12-12 17:41:34.603 [INFO][4983] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:34.654823 containerd[1916]: 2025-12-12 17:41:34.604 [INFO][4983] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4
Dec 12 17:41:34.654823 containerd[1916]: 2025-12-12 17:41:34.612 [INFO][4983] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:34.654823 containerd[1916]: 2025-12-12 17:41:34.620 [INFO][4983] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.100.67/26] block=192.168.100.64/26 handle="k8s-pod-network.65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:34.654823 containerd[1916]: 2025-12-12 17:41:34.621 [INFO][4983] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.67/26] handle="k8s-pod-network.65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:34.654823 containerd[1916]: 2025-12-12 17:41:34.621 [INFO][4983] ipam/ipam_plugin.go 398: Released host-wide IPAM lock.
Dec 12 17:41:34.654823 containerd[1916]: 2025-12-12 17:41:34.621 [INFO][4983] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.67/26] IPv6=[] ContainerID="65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" HandleID="k8s-pod-network.65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-eth0"
Dec 12 17:41:34.654916 containerd[1916]: 2025-12-12 17:41:34.626 [INFO][4958] cni-plugin/k8s.go 418: Populated endpoint ContainerID="65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" Namespace="kube-system" Pod="coredns-66bc5c9577-hc27m" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"08030ccf-dd27-4a9f-ad22-e1aee80e0ddf", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"", Pod:"coredns-66bc5c9577-hc27m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib3580464743", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Dec 12 17:41:34.654916 containerd[1916]: 2025-12-12 17:41:34.626 [INFO][4958] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.67/32] ContainerID="65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" Namespace="kube-system" Pod="coredns-66bc5c9577-hc27m" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-eth0"
Dec 12 17:41:34.654916 containerd[1916]: 2025-12-12 17:41:34.626 [INFO][4958] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib3580464743 ContainerID="65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" Namespace="kube-system" Pod="coredns-66bc5c9577-hc27m" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-eth0"
Dec 12 17:41:34.654916 containerd[1916]: 2025-12-12 17:41:34.639 [INFO][4958] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" Namespace="kube-system" Pod="coredns-66bc5c9577-hc27m" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-eth0"
Dec 12 17:41:34.654916 containerd[1916]: 2025-12-12 17:41:34.640 [INFO][4958] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" Namespace="kube-system" Pod="coredns-66bc5c9577-hc27m" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"08030ccf-dd27-4a9f-ad22-e1aee80e0ddf", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4", Pod:"coredns-66bc5c9577-hc27m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib3580464743", MAC:"fe:40:c5:87:b9:47", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Dec 12 17:41:34.655036 containerd[1916]: 2025-12-12 17:41:34.650 [INFO][4958] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" Namespace="kube-system" Pod="coredns-66bc5c9577-hc27m" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--hc27m-eth0"
Dec 12 17:41:34.686904 containerd[1916]: time="2025-12-12T17:41:34.686802336Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8645b67b4b-hg8jx,Uid:cc75188a-1547-4b56-bf96-4cc7e7818e13,Namespace:calico-system,Attempt:0,} returns sandbox id \"7221fe22b8017ea8bc924a070bcb84400d7913202ff476519a059d78542ad905\""
Dec 12 17:41:34.689566 containerd[1916]: time="2025-12-12T17:41:34.688885029Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\""
Dec 12 17:41:34.701950 containerd[1916]: time="2025-12-12T17:41:34.701880993Z" level=info msg="connecting to shim 65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4" address="unix:///run/containerd/s/d09500a2b3d49925c3f5856787b7040f6f4a07b3eca0161a2a26b8a2153d44fc" namespace=k8s.io protocol=ttrpc version=3
Dec 12 17:41:34.722690 systemd[1]: Started cri-containerd-65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4.scope - libcontainer container 65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4.
Dec 12 17:41:34.762102 containerd[1916]: time="2025-12-12T17:41:34.762028333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-hc27m,Uid:08030ccf-dd27-4a9f-ad22-e1aee80e0ddf,Namespace:kube-system,Attempt:0,} returns sandbox id \"65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4\""
Dec 12 17:41:34.774765 containerd[1916]: time="2025-12-12T17:41:34.774410055Z" level=info msg="CreateContainer within sandbox \"65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Dec 12 17:41:34.796009 containerd[1916]: time="2025-12-12T17:41:34.795973420Z" level=info msg="Container 93aab942a50f31c92bf5427f42043ba931bfae4b75e764a863fd9db6d56af8bd: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:41:34.808605 containerd[1916]: time="2025-12-12T17:41:34.808566652Z" level=info msg="CreateContainer within sandbox \"65ce551886243513f1a51209ed399e2c0b5a025271f721c129481498257137a4\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"93aab942a50f31c92bf5427f42043ba931bfae4b75e764a863fd9db6d56af8bd\""
Dec 12 17:41:34.809479 containerd[1916]: time="2025-12-12T17:41:34.809457646Z" level=info msg="StartContainer for \"93aab942a50f31c92bf5427f42043ba931bfae4b75e764a863fd9db6d56af8bd\""
Dec 12 17:41:34.810211 containerd[1916]: time="2025-12-12T17:41:34.810181995Z" level=info msg="connecting to shim 93aab942a50f31c92bf5427f42043ba931bfae4b75e764a863fd9db6d56af8bd" address="unix:///run/containerd/s/d09500a2b3d49925c3f5856787b7040f6f4a07b3eca0161a2a26b8a2153d44fc" protocol=ttrpc version=3
Dec 12 17:41:34.826640 systemd[1]: Started cri-containerd-93aab942a50f31c92bf5427f42043ba931bfae4b75e764a863fd9db6d56af8bd.scope - libcontainer container 93aab942a50f31c92bf5427f42043ba931bfae4b75e764a863fd9db6d56af8bd.
Dec 12 17:41:34.858792 containerd[1916]: time="2025-12-12T17:41:34.858753102Z" level=info msg="StartContainer for \"93aab942a50f31c92bf5427f42043ba931bfae4b75e764a863fd9db6d56af8bd\" returns successfully"
Dec 12 17:41:34.980837 containerd[1916]: time="2025-12-12T17:41:34.980791433Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:41:34.984248 containerd[1916]: time="2025-12-12T17:41:34.984203733Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found"
Dec 12 17:41:34.984456 containerd[1916]: time="2025-12-12T17:41:34.984287943Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85"
Dec 12 17:41:34.984513 kubelet[3527]: E1212 17:41:34.984457 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 17:41:34.984569 kubelet[3527]: E1212 17:41:34.984548 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 17:41:34.984908 kubelet[3527]: E1212 17:41:34.984643 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-8645b67b4b-hg8jx_calico-system(cc75188a-1547-4b56-bf96-4cc7e7818e13): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:41:34.984908 kubelet[3527]: E1212 17:41:34.984709 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" podUID="cc75188a-1547-4b56-bf96-4cc7e7818e13"
Dec 12 17:41:35.359463 containerd[1916]: time="2025-12-12T17:41:35.359365968Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-ftsnw,Uid:9d6c6448-7253-4405-b42f-a3327529c933,Namespace:calico-system,Attempt:0,}"
Dec 12 17:41:35.364600 containerd[1916]: time="2025-12-12T17:41:35.364564904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cf495b795-djp4n,Uid:6c556bc6-ca1e-488b-87dc-eb6c6785bf8c,Namespace:calico-apiserver,Attempt:0,}"
Dec 12 17:41:35.475946 systemd-networkd[1495]: cali853e1159944: Link UP
Dec 12 17:41:35.476845 systemd-networkd[1495]: cali853e1159944: Gained carrier
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.410 [INFO][5182] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-eth0 calico-apiserver-5cf495b795- calico-apiserver 6c556bc6-ca1e-488b-87dc-eb6c6785bf8c 807 0 2025-12-12 17:41:07 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5cf495b795 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.2-a-9f5170e2ca calico-apiserver-5cf495b795-djp4n eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali853e1159944 [] [] }} ContainerID="40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-djp4n" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-"
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.410 [INFO][5182] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-djp4n" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-eth0"
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.433 [INFO][5192] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" HandleID="k8s-pod-network.40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-eth0"
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.433 [INFO][5192] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" HandleID="k8s-pod-network.40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b0b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459.2.2-a-9f5170e2ca", "pod":"calico-apiserver-5cf495b795-djp4n", "timestamp":"2025-12-12 17:41:35.433129106 +0000 UTC"}, Hostname:"ci-4459.2.2-a-9f5170e2ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.433 [INFO][5192] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock.
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.433 [INFO][5192] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock.
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.433 [INFO][5192] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-9f5170e2ca'
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.439 [INFO][5192] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.443 [INFO][5192] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.447 [INFO][5192] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.449 [INFO][5192] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.450 [INFO][5192] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.451 [INFO][5192] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.452 [INFO][5192] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.456 [INFO][5192] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.463 [INFO][5192] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.100.68/26] block=192.168.100.64/26 handle="k8s-pod-network.40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.463 [INFO][5192] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.68/26] handle="k8s-pod-network.40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" host="ci-4459.2.2-a-9f5170e2ca"
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.463 [INFO][5192] ipam/ipam_plugin.go 398: Released host-wide IPAM lock.
Dec 12 17:41:35.490690 containerd[1916]: 2025-12-12 17:41:35.463 [INFO][5192] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.68/26] IPv6=[] ContainerID="40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" HandleID="k8s-pod-network.40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-eth0"
Dec 12 17:41:35.491091 containerd[1916]: 2025-12-12 17:41:35.467 [INFO][5182] cni-plugin/k8s.go 418: Populated endpoint ContainerID="40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-djp4n" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-eth0", GenerateName:"calico-apiserver-5cf495b795-", Namespace:"calico-apiserver", SelfLink:"", UID:"6c556bc6-ca1e-488b-87dc-eb6c6785bf8c", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 41, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5cf495b795", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"", Pod:"calico-apiserver-5cf495b795-djp4n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.100.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali853e1159944", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Dec 12 17:41:35.491091 containerd[1916]: 2025-12-12 17:41:35.468 [INFO][5182] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.68/32] ContainerID="40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-djp4n" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-eth0"
Dec 12 17:41:35.491091 containerd[1916]: 2025-12-12 17:41:35.468 [INFO][5182] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali853e1159944 ContainerID="40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-djp4n" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-eth0"
Dec 12 17:41:35.491091 containerd[1916]: 2025-12-12 17:41:35.477 [INFO][5182] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-djp4n" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-eth0"
Dec 12 17:41:35.491091 containerd[1916]: 2025-12-12 17:41:35.477 [INFO][5182] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-djp4n" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-eth0", GenerateName:"calico-apiserver-5cf495b795-", Namespace:"calico-apiserver", SelfLink:"", UID:"6c556bc6-ca1e-488b-87dc-eb6c6785bf8c", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 41, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5cf495b795", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d", Pod:"calico-apiserver-5cf495b795-djp4n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.100.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali853e1159944", MAC:"da:48:98:f5:8e:e1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Dec 12 17:41:35.491091 containerd[1916]: 2025-12-12 17:41:35.488 [INFO][5182] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-djp4n" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--djp4n-eth0"
Dec 12 17:41:35.506536 kubelet[3527]: E1212 17:41:35.506396 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" podUID="cc75188a-1547-4b56-bf96-4cc7e7818e13"
Dec 12 17:41:35.518604 kubelet[3527]: I1212 17:41:35.518143 3527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-hc27m" podStartSLOduration=38.51813034 podStartE2EDuration="38.51813034s" podCreationTimestamp="2025-12-12 17:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:41:35.517781402 +0000 UTC m=+44.256012375" watchObservedRunningTime="2025-12-12 17:41:35.51813034 +0000 UTC m=+44.256361385"
Dec 12 17:41:35.543432 containerd[1916]: time="2025-12-12T17:41:35.543047948Z" level=info msg="connecting to shim 40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d" address="unix:///run/containerd/s/db42d6ab87d78b341de9ca098376e4cbe78ba2cc70f7e3498caab47bf8d71446" namespace=k8s.io protocol=ttrpc version=3
Dec 12 17:41:35.581861 systemd[1]: Started cri-containerd-40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d.scope - libcontainer container 40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d.
Dec 12 17:41:35.610036 systemd-networkd[1495]: caliae1177b8439: Link UP
Dec 12 17:41:35.611994 systemd-networkd[1495]: caliae1177b8439: Gained carrier
Dec 12 17:41:35.622572 systemd-networkd[1495]: calie067502be96: Gained IPv6LL
Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.413 [INFO][5169] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-eth0 goldmane-7c778bb748- calico-system 9d6c6448-7253-4405-b42f-a3327529c933 806 0 2025-12-12 17:41:11 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459.2.2-a-9f5170e2ca goldmane-7c778bb748-ftsnw eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] caliae1177b8439 [] [] }} ContainerID="ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" Namespace="calico-system" Pod="goldmane-7c778bb748-ftsnw" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-"
Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.413 [INFO][5169] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" Namespace="calico-system" Pod="goldmane-7c778bb748-ftsnw" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-eth0"
Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.434 [INFO][5198] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" HandleID="k8s-pod-network.ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-eth0"
Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.435 [INFO][5198] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" HandleID="k8s-pod-network.ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ab3a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.2-a-9f5170e2ca", "pod":"goldmane-7c778bb748-ftsnw", "timestamp":"2025-12-12 17:41:35.43490691 +0000 UTC"}, Hostname:"ci-4459.2.2-a-9f5170e2ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.435 [INFO][5198] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock.
Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.463 [INFO][5198] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock.
Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.464 [INFO][5198] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-9f5170e2ca' Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.541 [INFO][5198] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.555 [INFO][5198] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.578 [INFO][5198] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.581 [INFO][5198] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.585 [INFO][5198] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.585 [INFO][5198] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.587 [INFO][5198] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.591 [INFO][5198] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.600 [INFO][5198] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.100.69/26] block=192.168.100.64/26 handle="k8s-pod-network.ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.600 [INFO][5198] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.69/26] handle="k8s-pod-network.ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.600 [INFO][5198] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:41:35.628390 containerd[1916]: 2025-12-12 17:41:35.600 [INFO][5198] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.69/26] IPv6=[] ContainerID="ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" HandleID="k8s-pod-network.ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-eth0" Dec 12 17:41:35.629196 containerd[1916]: 2025-12-12 17:41:35.602 [INFO][5169] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" Namespace="calico-system" Pod="goldmane-7c778bb748-ftsnw" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"9d6c6448-7253-4405-b42f-a3327529c933", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 41, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"", Pod:"goldmane-7c778bb748-ftsnw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.100.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"caliae1177b8439", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:41:35.629196 containerd[1916]: 2025-12-12 17:41:35.602 [INFO][5169] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.69/32] ContainerID="ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" Namespace="calico-system" Pod="goldmane-7c778bb748-ftsnw" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-eth0" Dec 12 17:41:35.629196 containerd[1916]: 2025-12-12 17:41:35.602 [INFO][5169] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliae1177b8439 ContainerID="ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" Namespace="calico-system" Pod="goldmane-7c778bb748-ftsnw" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-eth0" Dec 12 17:41:35.629196 containerd[1916]: 2025-12-12 17:41:35.612 [INFO][5169] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" Namespace="calico-system" Pod="goldmane-7c778bb748-ftsnw" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-eth0" Dec 12 17:41:35.629196 containerd[1916]: 2025-12-12 17:41:35.613 [INFO][5169] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" Namespace="calico-system" Pod="goldmane-7c778bb748-ftsnw" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"9d6c6448-7253-4405-b42f-a3327529c933", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 41, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b", Pod:"goldmane-7c778bb748-ftsnw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.100.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"caliae1177b8439", MAC:"46:6e:94:ea:5a:64", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:41:35.629196 containerd[1916]: 2025-12-12 17:41:35.624 [INFO][5169] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" Namespace="calico-system" Pod="goldmane-7c778bb748-ftsnw" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-goldmane--7c778bb748--ftsnw-eth0" Dec 12 17:41:35.691139 containerd[1916]: time="2025-12-12T17:41:35.691011036Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cf495b795-djp4n,Uid:6c556bc6-ca1e-488b-87dc-eb6c6785bf8c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"40ba3b6f7c38ca0ece1ede96e6a4959fa676c16a66e9f79549ecd3ea9376553d\"" Dec 12 17:41:35.693833 containerd[1916]: time="2025-12-12T17:41:35.693771797Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:41:35.732008 containerd[1916]: time="2025-12-12T17:41:35.731943992Z" level=info msg="connecting to shim ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b" address="unix:///run/containerd/s/e2f3bee135ecbeaba990fca29d5419b7d0ee84f867a9ed125dbfa553e87fb1c7" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:41:35.759651 systemd[1]: Started cri-containerd-ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b.scope - libcontainer container ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b. 
Dec 12 17:41:35.789253 containerd[1916]: time="2025-12-12T17:41:35.789160367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-ftsnw,Uid:9d6c6448-7253-4405-b42f-a3327529c933,Namespace:calico-system,Attempt:0,} returns sandbox id \"ed6cba8e7b57483f88f6c9934b1923f22d11c01da88336c293d67d5bc9bd457b\"" Dec 12 17:41:35.941666 systemd-networkd[1495]: calib3580464743: Gained IPv6LL Dec 12 17:41:35.975281 containerd[1916]: time="2025-12-12T17:41:35.975230632Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:35.978577 containerd[1916]: time="2025-12-12T17:41:35.978485599Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:41:35.978577 containerd[1916]: time="2025-12-12T17:41:35.978544097Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:41:35.978893 kubelet[3527]: E1212 17:41:35.978834 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:41:35.978952 kubelet[3527]: E1212 17:41:35.978896 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:41:35.979075 
kubelet[3527]: E1212 17:41:35.979043 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5cf495b795-djp4n_calico-apiserver(6c556bc6-ca1e-488b-87dc-eb6c6785bf8c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:35.979109 kubelet[3527]: E1212 17:41:35.979088 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c" Dec 12 17:41:35.979409 containerd[1916]: time="2025-12-12T17:41:35.979385057Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:41:36.233190 containerd[1916]: time="2025-12-12T17:41:36.233059473Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:36.236932 containerd[1916]: time="2025-12-12T17:41:36.236882456Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:41:36.237043 containerd[1916]: time="2025-12-12T17:41:36.236969491Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:41:36.237412 kubelet[3527]: E1212 
17:41:36.237214 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:41:36.237412 kubelet[3527]: E1212 17:41:36.237270 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:41:36.237412 kubelet[3527]: E1212 17:41:36.237346 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-ftsnw_calico-system(9d6c6448-7253-4405-b42f-a3327529c933): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:36.237412 kubelet[3527]: E1212 17:41:36.237373 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ftsnw" podUID="9d6c6448-7253-4405-b42f-a3327529c933" Dec 12 17:41:36.359167 containerd[1916]: time="2025-12-12T17:41:36.359110905Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-66bc5c9577-vn5cd,Uid:bb311974-0acf-4583-8d90-53c2a33e8927,Namespace:kube-system,Attempt:0,}" Dec 12 17:41:36.454221 systemd-networkd[1495]: cali9ff7027a1e4: Link UP Dec 12 17:41:36.456417 systemd-networkd[1495]: cali9ff7027a1e4: Gained carrier Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.392 [INFO][5319] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-eth0 coredns-66bc5c9577- kube-system bb311974-0acf-4583-8d90-53c2a33e8927 804 0 2025-12-12 17:40:57 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.2-a-9f5170e2ca coredns-66bc5c9577-vn5cd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9ff7027a1e4 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" Namespace="kube-system" Pod="coredns-66bc5c9577-vn5cd" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-" Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.392 [INFO][5319] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" Namespace="kube-system" Pod="coredns-66bc5c9577-vn5cd" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-eth0" Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.412 [INFO][5329] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" HandleID="k8s-pod-network.d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" 
Workload="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-eth0" Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.413 [INFO][5329] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" HandleID="k8s-pod-network.d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b740), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.2-a-9f5170e2ca", "pod":"coredns-66bc5c9577-vn5cd", "timestamp":"2025-12-12 17:41:36.412986079 +0000 UTC"}, Hostname:"ci-4459.2.2-a-9f5170e2ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.413 [INFO][5329] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.413 [INFO][5329] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.413 [INFO][5329] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-9f5170e2ca' Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.420 [INFO][5329] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.423 [INFO][5329] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.427 [INFO][5329] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.431 [INFO][5329] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.434 [INFO][5329] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.434 [INFO][5329] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.436 [INFO][5329] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17 Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.441 [INFO][5329] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.448 [INFO][5329] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.100.70/26] block=192.168.100.64/26 handle="k8s-pod-network.d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.448 [INFO][5329] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.70/26] handle="k8s-pod-network.d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.448 [INFO][5329] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:41:36.467834 containerd[1916]: 2025-12-12 17:41:36.448 [INFO][5329] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.70/26] IPv6=[] ContainerID="d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" HandleID="k8s-pod-network.d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-eth0" Dec 12 17:41:36.468861 containerd[1916]: 2025-12-12 17:41:36.450 [INFO][5319] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" Namespace="kube-system" Pod="coredns-66bc5c9577-vn5cd" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"bb311974-0acf-4583-8d90-53c2a33e8927", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"", Pod:"coredns-66bc5c9577-vn5cd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9ff7027a1e4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:41:36.468861 containerd[1916]: 2025-12-12 17:41:36.450 [INFO][5319] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.70/32] ContainerID="d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" Namespace="kube-system" Pod="coredns-66bc5c9577-vn5cd" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-eth0" Dec 12 17:41:36.468861 containerd[1916]: 2025-12-12 17:41:36.450 [INFO][5319] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9ff7027a1e4 
ContainerID="d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" Namespace="kube-system" Pod="coredns-66bc5c9577-vn5cd" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-eth0" Dec 12 17:41:36.468861 containerd[1916]: 2025-12-12 17:41:36.455 [INFO][5319] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" Namespace="kube-system" Pod="coredns-66bc5c9577-vn5cd" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-eth0" Dec 12 17:41:36.468861 containerd[1916]: 2025-12-12 17:41:36.455 [INFO][5319] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" Namespace="kube-system" Pod="coredns-66bc5c9577-vn5cd" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"bb311974-0acf-4583-8d90-53c2a33e8927", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17", 
Pod:"coredns-66bc5c9577-vn5cd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.100.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9ff7027a1e4", MAC:"3e:5e:66:01:b8:26", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:41:36.468989 containerd[1916]: 2025-12-12 17:41:36.465 [INFO][5319] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" Namespace="kube-system" Pod="coredns-66bc5c9577-vn5cd" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-coredns--66bc5c9577--vn5cd-eth0" Dec 12 17:41:36.512867 kubelet[3527]: E1212 17:41:36.511404 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ftsnw" podUID="9d6c6448-7253-4405-b42f-a3327529c933" Dec 12 17:41:36.518701 kubelet[3527]: E1212 17:41:36.518671 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" podUID="cc75188a-1547-4b56-bf96-4cc7e7818e13" Dec 12 17:41:36.519002 kubelet[3527]: E1212 17:41:36.518748 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c" Dec 12 17:41:36.520490 containerd[1916]: time="2025-12-12T17:41:36.520418456Z" level=info msg="connecting to shim d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17" address="unix:///run/containerd/s/9acd762587c788032a707188cffd3f3e952a71595c9fd5b8f55efb998ed68cfe" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:41:36.553832 systemd[1]: Started cri-containerd-d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17.scope - libcontainer container 
d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17. Dec 12 17:41:36.615368 containerd[1916]: time="2025-12-12T17:41:36.615270426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-vn5cd,Uid:bb311974-0acf-4583-8d90-53c2a33e8927,Namespace:kube-system,Attempt:0,} returns sandbox id \"d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17\"" Dec 12 17:41:36.625336 containerd[1916]: time="2025-12-12T17:41:36.625256237Z" level=info msg="CreateContainer within sandbox \"d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 17:41:36.649104 containerd[1916]: time="2025-12-12T17:41:36.648670145Z" level=info msg="Container 4e48a14d37f5c34216d3b7e1ef48a60e5f03fa9f171780014baf219c045bff69: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:41:36.662096 containerd[1916]: time="2025-12-12T17:41:36.662042407Z" level=info msg="CreateContainer within sandbox \"d0eaba670104fe14a5e8612ceaf830ed99ab090283ed4124bdd6995e5eb4ef17\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4e48a14d37f5c34216d3b7e1ef48a60e5f03fa9f171780014baf219c045bff69\"" Dec 12 17:41:36.662760 containerd[1916]: time="2025-12-12T17:41:36.662727971Z" level=info msg="StartContainer for \"4e48a14d37f5c34216d3b7e1ef48a60e5f03fa9f171780014baf219c045bff69\"" Dec 12 17:41:36.664450 containerd[1916]: time="2025-12-12T17:41:36.664375772Z" level=info msg="connecting to shim 4e48a14d37f5c34216d3b7e1ef48a60e5f03fa9f171780014baf219c045bff69" address="unix:///run/containerd/s/9acd762587c788032a707188cffd3f3e952a71595c9fd5b8f55efb998ed68cfe" protocol=ttrpc version=3 Dec 12 17:41:36.688690 systemd[1]: Started cri-containerd-4e48a14d37f5c34216d3b7e1ef48a60e5f03fa9f171780014baf219c045bff69.scope - libcontainer container 4e48a14d37f5c34216d3b7e1ef48a60e5f03fa9f171780014baf219c045bff69. 
Dec 12 17:41:36.719548 containerd[1916]: time="2025-12-12T17:41:36.719493453Z" level=info msg="StartContainer for \"4e48a14d37f5c34216d3b7e1ef48a60e5f03fa9f171780014baf219c045bff69\" returns successfully" Dec 12 17:41:37.359179 containerd[1916]: time="2025-12-12T17:41:37.359141731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cf495b795-pnxm5,Uid:65781289-4c97-4182-9b09-b4c93c5b6dd1,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:41:37.363477 containerd[1916]: time="2025-12-12T17:41:37.363444209Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nhpd8,Uid:aae497cc-3748-48de-b6f7-6585350a2476,Namespace:calico-system,Attempt:0,}" Dec 12 17:41:37.477658 systemd-networkd[1495]: cali853e1159944: Gained IPv6LL Dec 12 17:41:37.498840 systemd-networkd[1495]: calid6f990e9221: Link UP Dec 12 17:41:37.499891 systemd-networkd[1495]: calid6f990e9221: Gained carrier Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.413 [INFO][5436] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-eth0 csi-node-driver- calico-system aae497cc-3748-48de-b6f7-6585350a2476 702 0 2025-12-12 17:41:13 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459.2.2-a-9f5170e2ca csi-node-driver-nhpd8 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid6f990e9221 [] [] }} ContainerID="0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" Namespace="calico-system" Pod="csi-node-driver-nhpd8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-" Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.413 
[INFO][5436] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" Namespace="calico-system" Pod="csi-node-driver-nhpd8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-eth0" Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.437 [INFO][5449] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" HandleID="k8s-pod-network.0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-eth0" Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.437 [INFO][5449] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" HandleID="k8s-pod-network.0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003309b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.2-a-9f5170e2ca", "pod":"csi-node-driver-nhpd8", "timestamp":"2025-12-12 17:41:37.43732011 +0000 UTC"}, Hostname:"ci-4459.2.2-a-9f5170e2ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.437 [INFO][5449] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.437 [INFO][5449] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.437 [INFO][5449] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-9f5170e2ca' Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.444 [INFO][5449] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.450 [INFO][5449] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.454 [INFO][5449] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.455 [INFO][5449] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.457 [INFO][5449] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.457 [INFO][5449] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.459 [INFO][5449] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.480 [INFO][5449] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.487 [INFO][5449] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.100.71/26] block=192.168.100.64/26 handle="k8s-pod-network.0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.487 [INFO][5449] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.71/26] handle="k8s-pod-network.0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.487 [INFO][5449] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:41:37.512530 containerd[1916]: 2025-12-12 17:41:37.487 [INFO][5449] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.71/26] IPv6=[] ContainerID="0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" HandleID="k8s-pod-network.0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-eth0" Dec 12 17:41:37.513087 containerd[1916]: 2025-12-12 17:41:37.490 [INFO][5436] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" Namespace="calico-system" Pod="csi-node-driver-nhpd8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"aae497cc-3748-48de-b6f7-6585350a2476", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 41, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"", Pod:"csi-node-driver-nhpd8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.100.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid6f990e9221", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:41:37.513087 containerd[1916]: 2025-12-12 17:41:37.490 [INFO][5436] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.71/32] ContainerID="0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" Namespace="calico-system" Pod="csi-node-driver-nhpd8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-eth0" Dec 12 17:41:37.513087 containerd[1916]: 2025-12-12 17:41:37.490 [INFO][5436] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid6f990e9221 ContainerID="0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" Namespace="calico-system" Pod="csi-node-driver-nhpd8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-eth0" Dec 12 17:41:37.513087 containerd[1916]: 2025-12-12 17:41:37.500 [INFO][5436] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" Namespace="calico-system" Pod="csi-node-driver-nhpd8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-eth0" Dec 12 17:41:37.513087 
containerd[1916]: 2025-12-12 17:41:37.500 [INFO][5436] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" Namespace="calico-system" Pod="csi-node-driver-nhpd8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"aae497cc-3748-48de-b6f7-6585350a2476", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 41, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be", Pod:"csi-node-driver-nhpd8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.100.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid6f990e9221", MAC:"36:68:d1:8e:8a:98", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:41:37.513087 containerd[1916]: 
2025-12-12 17:41:37.511 [INFO][5436] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" Namespace="calico-system" Pod="csi-node-driver-nhpd8" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-csi--node--driver--nhpd8-eth0" Dec 12 17:41:37.523067 kubelet[3527]: E1212 17:41:37.523035 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ftsnw" podUID="9d6c6448-7253-4405-b42f-a3327529c933" Dec 12 17:41:37.524645 kubelet[3527]: E1212 17:41:37.524573 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c" Dec 12 17:41:37.562303 containerd[1916]: time="2025-12-12T17:41:37.562261895Z" level=info msg="connecting to shim 0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be" address="unix:///run/containerd/s/adb5250748c72ca121cabf063a835294c2131b23f218ad587385d059a9707104" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:41:37.595657 systemd[1]: Started 
cri-containerd-0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be.scope - libcontainer container 0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be. Dec 12 17:41:37.604520 kubelet[3527]: I1212 17:41:37.604205 3527 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-vn5cd" podStartSLOduration=40.604192071 podStartE2EDuration="40.604192071s" podCreationTimestamp="2025-12-12 17:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:41:37.603621918 +0000 UTC m=+46.341852907" watchObservedRunningTime="2025-12-12 17:41:37.604192071 +0000 UTC m=+46.342423044" Dec 12 17:41:37.605693 systemd-networkd[1495]: caliae1177b8439: Gained IPv6LL Dec 12 17:41:37.640042 systemd-networkd[1495]: calic86aa50d560: Link UP Dec 12 17:41:37.640859 systemd-networkd[1495]: calic86aa50d560: Gained carrier Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.416 [INFO][5425] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-eth0 calico-apiserver-5cf495b795- calico-apiserver 65781289-4c97-4182-9b09-b4c93c5b6dd1 809 0 2025-12-12 17:41:07 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5cf495b795 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.2-a-9f5170e2ca calico-apiserver-5cf495b795-pnxm5 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic86aa50d560 [] [] }} ContainerID="8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-pnxm5" 
WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-" Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.416 [INFO][5425] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-pnxm5" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-eth0" Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.446 [INFO][5454] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" HandleID="k8s-pod-network.8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-eth0" Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.446 [INFO][5454] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" HandleID="k8s-pod-network.8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb8f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459.2.2-a-9f5170e2ca", "pod":"calico-apiserver-5cf495b795-pnxm5", "timestamp":"2025-12-12 17:41:37.446549612 +0000 UTC"}, Hostname:"ci-4459.2.2-a-9f5170e2ca", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.447 [INFO][5454] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.487 [INFO][5454] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.488 [INFO][5454] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-9f5170e2ca' Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.549 [INFO][5454] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.577 [INFO][5454] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.592 [INFO][5454] ipam/ipam.go 511: Trying affinity for 192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.600 [INFO][5454] ipam/ipam.go 158: Attempting to load block cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.605 [INFO][5454] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.100.64/26 host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.605 [INFO][5454] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.100.64/26 handle="k8s-pod-network.8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.615 [INFO][5454] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7 Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.621 [INFO][5454] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.100.64/26 handle="k8s-pod-network.8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" 
host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.630 [INFO][5454] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.100.72/26] block=192.168.100.64/26 handle="k8s-pod-network.8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.630 [INFO][5454] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.100.72/26] handle="k8s-pod-network.8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" host="ci-4459.2.2-a-9f5170e2ca" Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.630 [INFO][5454] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:41:37.666324 containerd[1916]: 2025-12-12 17:41:37.630 [INFO][5454] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.100.72/26] IPv6=[] ContainerID="8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" HandleID="k8s-pod-network.8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" Workload="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-eth0" Dec 12 17:41:37.666878 containerd[1916]: 2025-12-12 17:41:37.633 [INFO][5425] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-pnxm5" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-eth0", GenerateName:"calico-apiserver-5cf495b795-", Namespace:"calico-apiserver", SelfLink:"", UID:"65781289-4c97-4182-9b09-b4c93c5b6dd1", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 41, 7, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5cf495b795", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"", Pod:"calico-apiserver-5cf495b795-pnxm5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.100.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic86aa50d560", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:41:37.666878 containerd[1916]: 2025-12-12 17:41:37.633 [INFO][5425] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.100.72/32] ContainerID="8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-pnxm5" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-eth0" Dec 12 17:41:37.666878 containerd[1916]: 2025-12-12 17:41:37.633 [INFO][5425] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic86aa50d560 ContainerID="8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-pnxm5" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-eth0" Dec 12 17:41:37.666878 containerd[1916]: 2025-12-12 17:41:37.640 [INFO][5425] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-pnxm5" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-eth0" Dec 12 17:41:37.666878 containerd[1916]: 2025-12-12 17:41:37.642 [INFO][5425] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-pnxm5" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-eth0", GenerateName:"calico-apiserver-5cf495b795-", Namespace:"calico-apiserver", SelfLink:"", UID:"65781289-4c97-4182-9b09-b4c93c5b6dd1", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 41, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5cf495b795", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-9f5170e2ca", ContainerID:"8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7", Pod:"calico-apiserver-5cf495b795-pnxm5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.100.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic86aa50d560", MAC:"ee:e2:2a:27:98:b3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:41:37.666878 containerd[1916]: 2025-12-12 17:41:37.663 [INFO][5425] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" Namespace="calico-apiserver" Pod="calico-apiserver-5cf495b795-pnxm5" WorkloadEndpoint="ci--4459.2.2--a--9f5170e2ca-k8s-calico--apiserver--5cf495b795--pnxm5-eth0" Dec 12 17:41:37.713566 containerd[1916]: time="2025-12-12T17:41:37.713445509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nhpd8,Uid:aae497cc-3748-48de-b6f7-6585350a2476,Namespace:calico-system,Attempt:0,} returns sandbox id \"0f7bbbab3602e041b920db2390842296925d6204ab9af1b8555c0bd7125643be\"" Dec 12 17:41:37.715612 containerd[1916]: time="2025-12-12T17:41:37.715586660Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:41:37.716708 containerd[1916]: time="2025-12-12T17:41:37.716349026Z" level=info msg="connecting to shim 8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7" address="unix:///run/containerd/s/f9e6ef783490439c3607555dcd47b45c0701bf3195064ee3c470b1c42885e952" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:41:37.736636 systemd[1]: Started cri-containerd-8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7.scope - libcontainer container 8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7. 
Dec 12 17:41:37.768853 containerd[1916]: time="2025-12-12T17:41:37.768805790Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cf495b795-pnxm5,Uid:65781289-4c97-4182-9b09-b4c93c5b6dd1,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8151f29310d791617fae1313156e8b5126d5ca03f0f5f66d550558a504d204a7\"" Dec 12 17:41:37.797684 systemd-networkd[1495]: cali9ff7027a1e4: Gained IPv6LL Dec 12 17:41:37.968192 containerd[1916]: time="2025-12-12T17:41:37.967773416Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:37.972473 containerd[1916]: time="2025-12-12T17:41:37.972422984Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:41:37.973590 containerd[1916]: time="2025-12-12T17:41:37.972530739Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:41:37.973590 containerd[1916]: time="2025-12-12T17:41:37.973367147Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:41:37.973653 kubelet[3527]: E1212 17:41:37.972675 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:41:37.973653 kubelet[3527]: E1212 17:41:37.972719 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Dec 12 17:41:37.973653 kubelet[3527]: E1212 17:41:37.972860 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-nhpd8_calico-system(aae497cc-3748-48de-b6f7-6585350a2476): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:41:38.250396 containerd[1916]: time="2025-12-12T17:41:38.250015226Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:41:38.253558 containerd[1916]: time="2025-12-12T17:41:38.253465063Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Dec 12 17:41:38.253558 containerd[1916]: time="2025-12-12T17:41:38.253516760Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Dec 12 17:41:38.253767 kubelet[3527]: E1212 17:41:38.253720 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 17:41:38.253833 kubelet[3527]: E1212 17:41:38.253768 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 17:41:38.253994 kubelet[3527]: E1212 17:41:38.253922 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5cf495b795-pnxm5_calico-apiserver(65781289-4c97-4182-9b09-b4c93c5b6dd1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:41:38.254134 containerd[1916]: time="2025-12-12T17:41:38.254059856Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\""
Dec 12 17:41:38.254175 kubelet[3527]: E1212 17:41:38.254089 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" podUID="65781289-4c97-4182-9b09-b4c93c5b6dd1"
Dec 12 17:41:38.524058 containerd[1916]: time="2025-12-12T17:41:38.523891711Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:41:38.528128 kubelet[3527]: E1212 17:41:38.528071 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" podUID="65781289-4c97-4182-9b09-b4c93c5b6dd1"
Dec 12 17:41:38.549799 containerd[1916]: time="2025-12-12T17:41:38.549714041Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found"
Dec 12 17:41:38.549799 containerd[1916]: time="2025-12-12T17:41:38.549765779Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93"
Dec 12 17:41:38.550107 kubelet[3527]: E1212 17:41:38.550066 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 12 17:41:38.550179 kubelet[3527]: E1212 17:41:38.550124 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 12 17:41:38.550204 kubelet[3527]: E1212 17:41:38.550194 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-nhpd8_calico-system(aae497cc-3748-48de-b6f7-6585350a2476): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:41:38.550279 kubelet[3527]: E1212 17:41:38.550228 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476"
Dec 12 17:41:38.758822 systemd-networkd[1495]: calic86aa50d560: Gained IPv6LL
Dec 12 17:41:39.461778 systemd-networkd[1495]: calid6f990e9221: Gained IPv6LL
Dec 12 17:41:39.531387 kubelet[3527]: E1212 17:41:39.531295 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476"
Dec 12 17:41:39.531974 kubelet[3527]: E1212 17:41:39.531881 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" podUID="65781289-4c97-4182-9b09-b4c93c5b6dd1"
Dec 12 17:41:45.357855 containerd[1916]: time="2025-12-12T17:41:45.357612991Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Dec 12 17:41:45.646377 containerd[1916]: time="2025-12-12T17:41:45.646187372Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:41:45.649514 containerd[1916]: time="2025-12-12T17:41:45.649440267Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Dec 12 17:41:45.649514 containerd[1916]: time="2025-12-12T17:41:45.649492516Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73"
Dec 12 17:41:45.649880 kubelet[3527]: E1212 17:41:45.649818 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 17:41:45.649880 kubelet[3527]: E1212 17:41:45.649867 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 17:41:45.650541 kubelet[3527]: E1212 17:41:45.650252 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-f998dd77f-h9sx8_calico-system(ea69765d-2386-4a5a-bcbf-e5190366c632): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:41:45.651744 containerd[1916]: time="2025-12-12T17:41:45.651708671Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\""
Dec 12 17:41:45.940978 containerd[1916]: time="2025-12-12T17:41:45.940929461Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:41:45.944802 containerd[1916]: time="2025-12-12T17:41:45.944739771Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found"
Dec 12 17:41:45.945130 containerd[1916]: time="2025-12-12T17:41:45.944825190Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85"
Dec 12 17:41:45.945172 kubelet[3527]: E1212 17:41:45.944983 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 12 17:41:45.945172 kubelet[3527]: E1212 17:41:45.945029 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 12 17:41:45.945172 kubelet[3527]: E1212 17:41:45.945099 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-f998dd77f-h9sx8_calico-system(ea69765d-2386-4a5a-bcbf-e5190366c632): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:41:45.945233 kubelet[3527]: E1212 17:41:45.945135 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f998dd77f-h9sx8" podUID="ea69765d-2386-4a5a-bcbf-e5190366c632"
Dec 12 17:41:48.354418 containerd[1916]: time="2025-12-12T17:41:48.354374103Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\""
Dec 12 17:41:48.641596 containerd[1916]: time="2025-12-12T17:41:48.641462242Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:41:48.646054 containerd[1916]: time="2025-12-12T17:41:48.646004793Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found"
Dec 12 17:41:48.646172 containerd[1916]: time="2025-12-12T17:41:48.646102228Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77"
Dec 12 17:41:48.646365 kubelet[3527]: E1212 17:41:48.646323 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 12 17:41:48.646721 kubelet[3527]: E1212 17:41:48.646376 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 12 17:41:48.646721 kubelet[3527]: E1212 17:41:48.646447 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-ftsnw_calico-system(9d6c6448-7253-4405-b42f-a3327529c933): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:41:48.646721 kubelet[3527]: E1212 17:41:48.646472 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ftsnw" podUID="9d6c6448-7253-4405-b42f-a3327529c933"
Dec 12 17:41:50.353939 containerd[1916]: time="2025-12-12T17:41:50.353795649Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\""
Dec 12 17:41:50.610092 containerd[1916]: time="2025-12-12T17:41:50.609823744Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:41:50.614292 containerd[1916]: time="2025-12-12T17:41:50.614174074Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found"
Dec 12 17:41:50.614292 containerd[1916]: time="2025-12-12T17:41:50.614204795Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85"
Dec 12 17:41:50.614582 kubelet[3527]: E1212 17:41:50.614541 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 17:41:50.614846 kubelet[3527]: E1212 17:41:50.614591 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 17:41:50.614846 kubelet[3527]: E1212 17:41:50.614661 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-8645b67b4b-hg8jx_calico-system(cc75188a-1547-4b56-bf96-4cc7e7818e13): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:41:50.614846 kubelet[3527]: E1212 17:41:50.614689 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" podUID="cc75188a-1547-4b56-bf96-4cc7e7818e13"
Dec 12 17:41:51.356265 containerd[1916]: time="2025-12-12T17:41:51.356181312Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Dec 12 17:41:51.650791 containerd[1916]: time="2025-12-12T17:41:51.650650768Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:41:51.654591 containerd[1916]: time="2025-12-12T17:41:51.654536156Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Dec 12 17:41:51.654689 containerd[1916]: time="2025-12-12T17:41:51.654652616Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Dec 12 17:41:51.654915 kubelet[3527]: E1212 17:41:51.654867 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 17:41:51.655383 kubelet[3527]: E1212 17:41:51.655213 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 17:41:51.655383 kubelet[3527]: E1212 17:41:51.655318 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5cf495b795-djp4n_calico-apiserver(6c556bc6-ca1e-488b-87dc-eb6c6785bf8c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:41:51.655383 kubelet[3527]: E1212 17:41:51.655345 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c"
Dec 12 17:41:53.356332 containerd[1916]: time="2025-12-12T17:41:53.355941224Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\""
Dec 12 17:41:53.611373 containerd[1916]: time="2025-12-12T17:41:53.608878018Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:41:53.612931 containerd[1916]: time="2025-12-12T17:41:53.612756806Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found"
Dec 12 17:41:53.612931 containerd[1916]: time="2025-12-12T17:41:53.612811839Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69"
Dec 12 17:41:53.613215 kubelet[3527]: E1212 17:41:53.613164 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Dec 12 17:41:53.613884 kubelet[3527]: E1212 17:41:53.613222 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Dec 12 17:41:53.613884 kubelet[3527]: E1212 17:41:53.613293 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-nhpd8_calico-system(aae497cc-3748-48de-b6f7-6585350a2476): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:41:53.614765 containerd[1916]: time="2025-12-12T17:41:53.614740625Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\""
Dec 12 17:41:53.868590 containerd[1916]: time="2025-12-12T17:41:53.868314095Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:41:53.871641 containerd[1916]: time="2025-12-12T17:41:53.871538575Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found"
Dec 12 17:41:53.871641 containerd[1916]: time="2025-12-12T17:41:53.871580696Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93"
Dec 12 17:41:53.871952 kubelet[3527]: E1212 17:41:53.871902 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 12 17:41:53.872011 kubelet[3527]: E1212 17:41:53.871956 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 12 17:41:53.872046 kubelet[3527]: E1212 17:41:53.872025 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-nhpd8_calico-system(aae497cc-3748-48de-b6f7-6585350a2476): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:41:53.872131 kubelet[3527]: E1212 17:41:53.872060 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476"
Dec 12 17:41:55.355174 containerd[1916]: time="2025-12-12T17:41:55.354928248Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Dec 12 17:41:55.630002 containerd[1916]: time="2025-12-12T17:41:55.629843301Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:41:55.650185 containerd[1916]: time="2025-12-12T17:41:55.650124565Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Dec 12 17:41:55.650476 containerd[1916]: time="2025-12-12T17:41:55.650233625Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Dec 12 17:41:55.650559 kubelet[3527]: E1212 17:41:55.650376 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 17:41:55.650559 kubelet[3527]: E1212 17:41:55.650415 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 17:41:55.650559 kubelet[3527]: E1212 17:41:55.650538 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5cf495b795-pnxm5_calico-apiserver(65781289-4c97-4182-9b09-b4c93c5b6dd1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:41:55.651577 kubelet[3527]: E1212 17:41:55.650570 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" podUID="65781289-4c97-4182-9b09-b4c93c5b6dd1"
Dec 12 17:41:57.355067 kubelet[3527]: E1212 17:41:57.355024 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f998dd77f-h9sx8" podUID="ea69765d-2386-4a5a-bcbf-e5190366c632"
Dec 12 17:42:03.355291 kubelet[3527]: E1212 17:42:03.355074 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c"
Dec 12 17:42:03.356944 kubelet[3527]: E1212 17:42:03.355923 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ftsnw" podUID="9d6c6448-7253-4405-b42f-a3327529c933"
Dec 12 17:42:05.355491 kubelet[3527]: E1212 17:42:05.355449 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" podUID="cc75188a-1547-4b56-bf96-4cc7e7818e13"
Dec 12 17:42:08.355791 kubelet[3527]: E1212 17:42:08.355703 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476"
Dec 12 17:42:09.354193 kubelet[3527]: E1212 17:42:09.354027 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" podUID="65781289-4c97-4182-9b09-b4c93c5b6dd1"
Dec 12 17:42:11.356446 containerd[1916]: time="2025-12-12T17:42:11.356403156Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Dec 12 17:42:11.680248 containerd[1916]: time="2025-12-12T17:42:11.680071703Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:42:11.683450 containerd[1916]: time="2025-12-12T17:42:11.683347048Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Dec 12 17:42:11.683450 containerd[1916]: time="2025-12-12T17:42:11.683403202Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73"
Dec 12 17:42:11.683664 kubelet[3527]: E1212 17:42:11.683608 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 17:42:11.683936 kubelet[3527]: E1212 17:42:11.683670 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 17:42:11.683936 kubelet[3527]: E1212 17:42:11.683748 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-f998dd77f-h9sx8_calico-system(ea69765d-2386-4a5a-bcbf-e5190366c632): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:42:11.685088 containerd[1916]: time="2025-12-12T17:42:11.685038642Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\""
Dec 12 17:42:11.952652 containerd[1916]: time="2025-12-12T17:42:11.952412680Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:42:11.955779 containerd[1916]: time="2025-12-12T17:42:11.955684065Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found"
Dec 12 17:42:11.955779 containerd[1916]: time="2025-12-12T17:42:11.955738515Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85"
Dec 12 17:42:11.955922 kubelet[3527]: E1212 17:42:11.955883 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 12 17:42:11.955961 kubelet[3527]: E1212 17:42:11.955930 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 12 17:42:11.956018 kubelet[3527]: E1212 17:42:11.955995
3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-f998dd77f-h9sx8_calico-system(ea69765d-2386-4a5a-bcbf-e5190366c632): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:11.956056 kubelet[3527]: E1212 17:42:11.956033 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f998dd77f-h9sx8" podUID="ea69765d-2386-4a5a-bcbf-e5190366c632" Dec 12 17:42:15.357185 containerd[1916]: time="2025-12-12T17:42:15.357120292Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:42:15.656163 containerd[1916]: time="2025-12-12T17:42:15.655726603Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:15.660016 containerd[1916]: time="2025-12-12T17:42:15.659793820Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:42:15.660016 containerd[1916]: time="2025-12-12T17:42:15.659877447Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:42:15.660558 kubelet[3527]: E1212 17:42:15.660371 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:42:15.660558 kubelet[3527]: E1212 17:42:15.660418 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:42:15.660558 kubelet[3527]: E1212 17:42:15.660485 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-ftsnw_calico-system(9d6c6448-7253-4405-b42f-a3327529c933): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:15.661843 kubelet[3527]: E1212 17:42:15.661807 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-7c778bb748-ftsnw" podUID="9d6c6448-7253-4405-b42f-a3327529c933" Dec 12 17:42:17.358889 containerd[1916]: time="2025-12-12T17:42:17.358149377Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:42:17.641419 containerd[1916]: time="2025-12-12T17:42:17.640952548Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:17.644377 containerd[1916]: time="2025-12-12T17:42:17.644276038Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:42:17.644377 containerd[1916]: time="2025-12-12T17:42:17.644322456Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:42:17.644698 kubelet[3527]: E1212 17:42:17.644651 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:42:17.644966 kubelet[3527]: E1212 17:42:17.644702 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:42:17.644966 kubelet[3527]: E1212 17:42:17.644772 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod 
calico-apiserver-5cf495b795-djp4n_calico-apiserver(6c556bc6-ca1e-488b-87dc-eb6c6785bf8c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:17.644966 kubelet[3527]: E1212 17:42:17.644799 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c" Dec 12 17:42:20.354358 containerd[1916]: time="2025-12-12T17:42:20.354088606Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:42:20.603096 containerd[1916]: time="2025-12-12T17:42:20.602900727Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:20.606479 containerd[1916]: time="2025-12-12T17:42:20.606179504Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:42:20.606479 containerd[1916]: time="2025-12-12T17:42:20.606264795Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:42:20.606587 kubelet[3527]: E1212 17:42:20.606396 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:42:20.606587 kubelet[3527]: E1212 17:42:20.606438 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:42:20.606846 kubelet[3527]: E1212 17:42:20.606604 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-nhpd8_calico-system(aae497cc-3748-48de-b6f7-6585350a2476): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:20.607763 containerd[1916]: time="2025-12-12T17:42:20.607598290Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:42:20.897892 containerd[1916]: time="2025-12-12T17:42:20.897488904Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:20.900441 containerd[1916]: time="2025-12-12T17:42:20.900399990Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:42:20.900659 containerd[1916]: time="2025-12-12T17:42:20.900480721Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:42:20.900698 
kubelet[3527]: E1212 17:42:20.900654 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:42:20.900744 kubelet[3527]: E1212 17:42:20.900707 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:42:20.900904 kubelet[3527]: E1212 17:42:20.900882 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-8645b67b4b-hg8jx_calico-system(cc75188a-1547-4b56-bf96-4cc7e7818e13): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:20.900988 kubelet[3527]: E1212 17:42:20.900917 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" podUID="cc75188a-1547-4b56-bf96-4cc7e7818e13" Dec 12 
17:42:20.901312 containerd[1916]: time="2025-12-12T17:42:20.901271288Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:42:21.193133 containerd[1916]: time="2025-12-12T17:42:21.193088166Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:21.196343 containerd[1916]: time="2025-12-12T17:42:21.196299894Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:42:21.196449 containerd[1916]: time="2025-12-12T17:42:21.196382576Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:42:21.196574 kubelet[3527]: E1212 17:42:21.196535 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:42:21.196650 kubelet[3527]: E1212 17:42:21.196582 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:42:21.197459 kubelet[3527]: E1212 17:42:21.196652 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container 
csi-node-driver-registrar start failed in pod csi-node-driver-nhpd8_calico-system(aae497cc-3748-48de-b6f7-6585350a2476): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:21.197459 kubelet[3527]: E1212 17:42:21.196687 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476" Dec 12 17:42:23.355517 containerd[1916]: time="2025-12-12T17:42:23.355254823Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:42:23.613711 containerd[1916]: time="2025-12-12T17:42:23.613335729Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:23.616640 containerd[1916]: time="2025-12-12T17:42:23.616575991Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:42:23.616640 containerd[1916]: time="2025-12-12T17:42:23.616611104Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:42:23.617197 kubelet[3527]: E1212 17:42:23.616781 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:42:23.617197 kubelet[3527]: E1212 17:42:23.616821 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:42:23.617197 kubelet[3527]: E1212 17:42:23.616893 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5cf495b795-pnxm5_calico-apiserver(65781289-4c97-4182-9b09-b4c93c5b6dd1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:23.617197 kubelet[3527]: E1212 17:42:23.616922 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" podUID="65781289-4c97-4182-9b09-b4c93c5b6dd1" Dec 12 17:42:25.356337 kubelet[3527]: E1212 17:42:25.356286 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f998dd77f-h9sx8" podUID="ea69765d-2386-4a5a-bcbf-e5190366c632" Dec 12 17:42:28.354221 kubelet[3527]: E1212 17:42:28.354134 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ftsnw" podUID="9d6c6448-7253-4405-b42f-a3327529c933" Dec 12 17:42:31.354576 kubelet[3527]: E1212 17:42:31.354537 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c" Dec 12 17:42:33.356890 kubelet[3527]: E1212 17:42:33.356846 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" podUID="cc75188a-1547-4b56-bf96-4cc7e7818e13" Dec 12 17:42:35.341680 systemd[1]: Started sshd@7-10.200.20.11:22-10.200.16.10:60090.service - OpenSSH per-connection server daemon (10.200.16.10:60090). 
Dec 12 17:42:35.357869 kubelet[3527]: E1212 17:42:35.357657 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476" Dec 12 17:42:35.851775 sshd[5679]: Accepted publickey for core from 10.200.16.10 port 60090 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:42:35.854250 sshd-session[5679]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:42:35.859724 systemd-logind[1877]: New session 10 of user core. Dec 12 17:42:35.865646 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 12 17:42:36.256626 sshd[5684]: Connection closed by 10.200.16.10 port 60090 Dec 12 17:42:36.256962 sshd-session[5679]: pam_unix(sshd:session): session closed for user core Dec 12 17:42:36.261166 systemd[1]: session-10.scope: Deactivated successfully. Dec 12 17:42:36.261861 systemd[1]: sshd@7-10.200.20.11:22-10.200.16.10:60090.service: Deactivated successfully. Dec 12 17:42:36.266294 systemd-logind[1877]: Session 10 logged out. Waiting for processes to exit. 
Dec 12 17:42:36.268143 systemd-logind[1877]: Removed session 10. Dec 12 17:42:38.354007 kubelet[3527]: E1212 17:42:38.353750 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" podUID="65781289-4c97-4182-9b09-b4c93c5b6dd1" Dec 12 17:42:40.354797 kubelet[3527]: E1212 17:42:40.354629 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f998dd77f-h9sx8" podUID="ea69765d-2386-4a5a-bcbf-e5190366c632" Dec 12 17:42:41.348905 systemd[1]: Started sshd@8-10.200.20.11:22-10.200.16.10:37606.service - OpenSSH per-connection server daemon (10.200.16.10:37606). 
Dec 12 17:42:41.854533 sshd[5700]: Accepted publickey for core from 10.200.16.10 port 37606 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:42:41.897638 sshd-session[5700]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:42:41.902764 systemd-logind[1877]: New session 11 of user core. Dec 12 17:42:41.905656 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 12 17:42:42.270538 sshd[5703]: Connection closed by 10.200.16.10 port 37606 Dec 12 17:42:42.271159 sshd-session[5700]: pam_unix(sshd:session): session closed for user core Dec 12 17:42:42.275368 systemd-logind[1877]: Session 11 logged out. Waiting for processes to exit. Dec 12 17:42:42.275560 systemd[1]: sshd@8-10.200.20.11:22-10.200.16.10:37606.service: Deactivated successfully. Dec 12 17:42:42.276932 systemd[1]: session-11.scope: Deactivated successfully. Dec 12 17:42:42.279995 systemd-logind[1877]: Removed session 11. Dec 12 17:42:42.356232 kubelet[3527]: E1212 17:42:42.356178 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c" Dec 12 17:42:43.355066 kubelet[3527]: E1212 17:42:43.354065 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ftsnw" podUID="9d6c6448-7253-4405-b42f-a3327529c933" Dec 12 17:42:47.356149 kubelet[3527]: E1212 17:42:47.356031 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" podUID="cc75188a-1547-4b56-bf96-4cc7e7818e13" Dec 12 17:42:47.365753 systemd[1]: Started sshd@9-10.200.20.11:22-10.200.16.10:37616.service - OpenSSH per-connection server daemon (10.200.16.10:37616). Dec 12 17:42:47.868039 sshd[5716]: Accepted publickey for core from 10.200.16.10 port 37616 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:42:47.869736 sshd-session[5716]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:42:47.875377 systemd-logind[1877]: New session 12 of user core. Dec 12 17:42:47.882054 systemd[1]: Started session-12.scope - Session 12 of User core. Dec 12 17:42:48.262606 sshd[5719]: Connection closed by 10.200.16.10 port 37616 Dec 12 17:42:48.262932 sshd-session[5716]: pam_unix(sshd:session): session closed for user core Dec 12 17:42:48.266886 systemd-logind[1877]: Session 12 logged out. Waiting for processes to exit. Dec 12 17:42:48.267179 systemd[1]: sshd@9-10.200.20.11:22-10.200.16.10:37616.service: Deactivated successfully. Dec 12 17:42:48.270203 systemd[1]: session-12.scope: Deactivated successfully. 
Dec 12 17:42:48.271305 systemd-logind[1877]: Removed session 12. Dec 12 17:42:48.351404 systemd[1]: Started sshd@10-10.200.20.11:22-10.200.16.10:37626.service - OpenSSH per-connection server daemon (10.200.16.10:37626). Dec 12 17:42:48.843327 sshd[5732]: Accepted publickey for core from 10.200.16.10 port 37626 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:42:48.845712 sshd-session[5732]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:42:48.851522 systemd-logind[1877]: New session 13 of user core. Dec 12 17:42:48.856032 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 12 17:42:49.288097 sshd[5735]: Connection closed by 10.200.16.10 port 37626 Dec 12 17:42:49.289048 sshd-session[5732]: pam_unix(sshd:session): session closed for user core Dec 12 17:42:49.292886 systemd[1]: sshd@10-10.200.20.11:22-10.200.16.10:37626.service: Deactivated successfully. Dec 12 17:42:49.294270 systemd[1]: session-13.scope: Deactivated successfully. Dec 12 17:42:49.296155 systemd-logind[1877]: Session 13 logged out. Waiting for processes to exit. Dec 12 17:42:49.298702 systemd-logind[1877]: Removed session 13. 
Dec 12 17:42:49.357763 kubelet[3527]: E1212 17:42:49.357706 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476" Dec 12 17:42:49.372046 systemd[1]: Started sshd@11-10.200.20.11:22-10.200.16.10:37636.service - OpenSSH per-connection server daemon (10.200.16.10:37636). Dec 12 17:42:49.829391 sshd[5745]: Accepted publickey for core from 10.200.16.10 port 37636 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:42:49.830613 sshd-session[5745]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:42:49.834564 systemd-logind[1877]: New session 14 of user core. Dec 12 17:42:49.838642 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 12 17:42:50.221907 sshd[5748]: Connection closed by 10.200.16.10 port 37636 Dec 12 17:42:50.222422 sshd-session[5745]: pam_unix(sshd:session): session closed for user core Dec 12 17:42:50.226969 systemd-logind[1877]: Session 14 logged out. Waiting for processes to exit. 
Dec 12 17:42:50.227283 systemd[1]: sshd@11-10.200.20.11:22-10.200.16.10:37636.service: Deactivated successfully. Dec 12 17:42:50.229474 systemd[1]: session-14.scope: Deactivated successfully. Dec 12 17:42:50.231572 systemd-logind[1877]: Removed session 14. Dec 12 17:42:52.354183 kubelet[3527]: E1212 17:42:52.354058 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" podUID="65781289-4c97-4182-9b09-b4c93c5b6dd1" Dec 12 17:42:54.354810 containerd[1916]: time="2025-12-12T17:42:54.354592253Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:42:54.609911 containerd[1916]: time="2025-12-12T17:42:54.609552895Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:54.612873 containerd[1916]: time="2025-12-12T17:42:54.612778950Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:42:54.612873 containerd[1916]: time="2025-12-12T17:42:54.612830336Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:42:54.613068 kubelet[3527]: E1212 17:42:54.613017 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:42:54.613326 kubelet[3527]: E1212 17:42:54.613071 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:42:54.613679 kubelet[3527]: E1212 17:42:54.613151 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-f998dd77f-h9sx8_calico-system(ea69765d-2386-4a5a-bcbf-e5190366c632): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:54.615493 containerd[1916]: time="2025-12-12T17:42:54.615140700Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:42:55.010696 containerd[1916]: time="2025-12-12T17:42:55.010647189Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:55.014639 containerd[1916]: time="2025-12-12T17:42:55.014598537Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:42:55.014725 containerd[1916]: time="2025-12-12T17:42:55.014687332Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:42:55.015524 kubelet[3527]: E1212 17:42:55.014823 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:42:55.015524 kubelet[3527]: E1212 17:42:55.014877 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:42:55.015524 kubelet[3527]: E1212 17:42:55.014943 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-f998dd77f-h9sx8_calico-system(ea69765d-2386-4a5a-bcbf-e5190366c632): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:55.015653 kubelet[3527]: E1212 17:42:55.014976 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code 
= NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f998dd77f-h9sx8" podUID="ea69765d-2386-4a5a-bcbf-e5190366c632" Dec 12 17:42:55.314342 systemd[1]: Started sshd@12-10.200.20.11:22-10.200.16.10:60066.service - OpenSSH per-connection server daemon (10.200.16.10:60066). Dec 12 17:42:55.817574 sshd[5772]: Accepted publickey for core from 10.200.16.10 port 60066 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:42:55.820551 sshd-session[5772]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:42:55.828110 systemd-logind[1877]: New session 15 of user core. Dec 12 17:42:55.834984 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 12 17:42:56.237052 sshd[5775]: Connection closed by 10.200.16.10 port 60066 Dec 12 17:42:56.237418 sshd-session[5772]: pam_unix(sshd:session): session closed for user core Dec 12 17:42:56.243628 systemd[1]: sshd@12-10.200.20.11:22-10.200.16.10:60066.service: Deactivated successfully. Dec 12 17:42:56.245476 systemd[1]: session-15.scope: Deactivated successfully. Dec 12 17:42:56.246125 systemd-logind[1877]: Session 15 logged out. Waiting for processes to exit. Dec 12 17:42:56.247288 systemd-logind[1877]: Removed session 15. 
Dec 12 17:42:57.354067 kubelet[3527]: E1212 17:42:57.354006 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c" Dec 12 17:42:58.355415 containerd[1916]: time="2025-12-12T17:42:58.355155731Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:42:58.356363 kubelet[3527]: E1212 17:42:58.355768 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" podUID="cc75188a-1547-4b56-bf96-4cc7e7818e13" Dec 12 17:42:58.646590 containerd[1916]: time="2025-12-12T17:42:58.646355379Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:58.650266 containerd[1916]: time="2025-12-12T17:42:58.650164915Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:42:58.650266 containerd[1916]: time="2025-12-12T17:42:58.650215653Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:42:58.650401 kubelet[3527]: E1212 17:42:58.650363 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:42:58.650440 kubelet[3527]: E1212 17:42:58.650406 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:42:58.650492 kubelet[3527]: E1212 17:42:58.650472 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-ftsnw_calico-system(9d6c6448-7253-4405-b42f-a3327529c933): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:58.652592 kubelet[3527]: E1212 17:42:58.652541 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-7c778bb748-ftsnw" podUID="9d6c6448-7253-4405-b42f-a3327529c933" Dec 12 17:43:01.326699 systemd[1]: Started sshd@13-10.200.20.11:22-10.200.16.10:59740.service - OpenSSH per-connection server daemon (10.200.16.10:59740). Dec 12 17:43:01.821253 sshd[5791]: Accepted publickey for core from 10.200.16.10 port 59740 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:43:01.822483 sshd-session[5791]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:43:01.826407 systemd-logind[1877]: New session 16 of user core. Dec 12 17:43:01.831633 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 12 17:43:02.225552 sshd[5794]: Connection closed by 10.200.16.10 port 59740 Dec 12 17:43:02.226081 sshd-session[5791]: pam_unix(sshd:session): session closed for user core Dec 12 17:43:02.230423 systemd[1]: sshd@13-10.200.20.11:22-10.200.16.10:59740.service: Deactivated successfully. Dec 12 17:43:02.236014 systemd[1]: session-16.scope: Deactivated successfully. Dec 12 17:43:02.237930 systemd-logind[1877]: Session 16 logged out. Waiting for processes to exit. Dec 12 17:43:02.239438 systemd-logind[1877]: Removed session 16. 
Dec 12 17:43:02.353834 containerd[1916]: time="2025-12-12T17:43:02.353798246Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:43:02.644964 containerd[1916]: time="2025-12-12T17:43:02.644707847Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:43:02.648443 containerd[1916]: time="2025-12-12T17:43:02.648340963Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:43:02.648443 containerd[1916]: time="2025-12-12T17:43:02.648388044Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:43:02.648618 kubelet[3527]: E1212 17:43:02.648584 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:43:02.649129 kubelet[3527]: E1212 17:43:02.648629 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:43:02.649129 kubelet[3527]: E1212 17:43:02.648699 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-nhpd8_calico-system(aae497cc-3748-48de-b6f7-6585350a2476): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve 
reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:43:02.650472 containerd[1916]: time="2025-12-12T17:43:02.649846295Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:43:02.916754 containerd[1916]: time="2025-12-12T17:43:02.916593885Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:43:02.919813 containerd[1916]: time="2025-12-12T17:43:02.919698513Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:43:02.919813 containerd[1916]: time="2025-12-12T17:43:02.919791723Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:43:02.920126 kubelet[3527]: E1212 17:43:02.920077 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:43:02.920185 kubelet[3527]: E1212 17:43:02.920131 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" 
Dec 12 17:43:02.920236 kubelet[3527]: E1212 17:43:02.920216 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-nhpd8_calico-system(aae497cc-3748-48de-b6f7-6585350a2476): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:43:02.920280 kubelet[3527]: E1212 17:43:02.920252 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476" Dec 12 17:43:07.318314 systemd[1]: Started sshd@14-10.200.20.11:22-10.200.16.10:59750.service - OpenSSH per-connection server daemon (10.200.16.10:59750). 
Dec 12 17:43:07.356112 containerd[1916]: time="2025-12-12T17:43:07.356037235Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:43:07.356441 kubelet[3527]: E1212 17:43:07.355483 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f998dd77f-h9sx8" podUID="ea69765d-2386-4a5a-bcbf-e5190366c632" Dec 12 17:43:07.630309 containerd[1916]: time="2025-12-12T17:43:07.629884475Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:43:07.633246 containerd[1916]: time="2025-12-12T17:43:07.633128187Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:43:07.633246 containerd[1916]: time="2025-12-12T17:43:07.633216293Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:43:07.633409 kubelet[3527]: E1212 
17:43:07.633374 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:43:07.633445 kubelet[3527]: E1212 17:43:07.633427 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:43:07.633897 kubelet[3527]: E1212 17:43:07.633570 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5cf495b795-pnxm5_calico-apiserver(65781289-4c97-4182-9b09-b4c93c5b6dd1): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:43:07.633897 kubelet[3527]: E1212 17:43:07.633607 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" podUID="65781289-4c97-4182-9b09-b4c93c5b6dd1" Dec 12 17:43:07.820300 sshd[5850]: Accepted publickey for core from 10.200.16.10 port 59750 ssh2: RSA 
SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:43:07.821980 sshd-session[5850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:43:07.826017 systemd-logind[1877]: New session 17 of user core. Dec 12 17:43:07.829610 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 12 17:43:08.260493 sshd[5853]: Connection closed by 10.200.16.10 port 59750 Dec 12 17:43:08.259861 sshd-session[5850]: pam_unix(sshd:session): session closed for user core Dec 12 17:43:08.265088 systemd-logind[1877]: Session 17 logged out. Waiting for processes to exit. Dec 12 17:43:08.266790 systemd[1]: sshd@14-10.200.20.11:22-10.200.16.10:59750.service: Deactivated successfully. Dec 12 17:43:08.270552 systemd[1]: session-17.scope: Deactivated successfully. Dec 12 17:43:08.272463 systemd-logind[1877]: Removed session 17. Dec 12 17:43:08.348332 systemd[1]: Started sshd@15-10.200.20.11:22-10.200.16.10:59766.service - OpenSSH per-connection server daemon (10.200.16.10:59766). Dec 12 17:43:08.843204 sshd[5864]: Accepted publickey for core from 10.200.16.10 port 59766 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:43:08.844369 sshd-session[5864]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:43:08.848278 systemd-logind[1877]: New session 18 of user core. Dec 12 17:43:08.854640 systemd[1]: Started session-18.scope - Session 18 of User core. Dec 12 17:43:09.322350 sshd[5867]: Connection closed by 10.200.16.10 port 59766 Dec 12 17:43:09.322993 sshd-session[5864]: pam_unix(sshd:session): session closed for user core Dec 12 17:43:09.326371 systemd[1]: sshd@15-10.200.20.11:22-10.200.16.10:59766.service: Deactivated successfully. Dec 12 17:43:09.327852 systemd[1]: session-18.scope: Deactivated successfully. Dec 12 17:43:09.328461 systemd-logind[1877]: Session 18 logged out. Waiting for processes to exit. Dec 12 17:43:09.329972 systemd-logind[1877]: Removed session 18. 
Dec 12 17:43:09.354048 containerd[1916]: time="2025-12-12T17:43:09.353691806Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:43:09.411657 systemd[1]: Started sshd@16-10.200.20.11:22-10.200.16.10:59768.service - OpenSSH per-connection server daemon (10.200.16.10:59768). Dec 12 17:43:09.647782 containerd[1916]: time="2025-12-12T17:43:09.647647449Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:43:09.651052 containerd[1916]: time="2025-12-12T17:43:09.651006996Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:43:09.651143 containerd[1916]: time="2025-12-12T17:43:09.651093423Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:43:09.651396 kubelet[3527]: E1212 17:43:09.651327 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:43:09.651396 kubelet[3527]: E1212 17:43:09.651373 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:43:09.652094 kubelet[3527]: E1212 17:43:09.651951 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container 
calico-apiserver start failed in pod calico-apiserver-5cf495b795-djp4n_calico-apiserver(6c556bc6-ca1e-488b-87dc-eb6c6785bf8c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:43:09.652094 kubelet[3527]: E1212 17:43:09.651988 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c" Dec 12 17:43:09.910107 sshd[5877]: Accepted publickey for core from 10.200.16.10 port 59768 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:43:09.912380 sshd-session[5877]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:43:09.919409 systemd-logind[1877]: New session 19 of user core. Dec 12 17:43:09.922641 systemd[1]: Started session-19.scope - Session 19 of User core. Dec 12 17:43:10.652103 sshd[5880]: Connection closed by 10.200.16.10 port 59768 Dec 12 17:43:10.653690 sshd-session[5877]: pam_unix(sshd:session): session closed for user core Dec 12 17:43:10.657620 systemd-logind[1877]: Session 19 logged out. Waiting for processes to exit. Dec 12 17:43:10.658917 systemd[1]: sshd@16-10.200.20.11:22-10.200.16.10:59768.service: Deactivated successfully. Dec 12 17:43:10.663204 systemd[1]: session-19.scope: Deactivated successfully. Dec 12 17:43:10.667290 systemd-logind[1877]: Removed session 19. 
Dec 12 17:43:10.741973 systemd[1]: Started sshd@17-10.200.20.11:22-10.200.16.10:49206.service - OpenSSH per-connection server daemon (10.200.16.10:49206).
Dec 12 17:43:11.234603 sshd[5903]: Accepted publickey for core from 10.200.16.10 port 49206 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:43:11.235831 sshd-session[5903]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:43:11.242114 systemd-logind[1877]: New session 20 of user core.
Dec 12 17:43:11.246562 systemd[1]: Started session-20.scope - Session 20 of User core.
Dec 12 17:43:11.356009 kubelet[3527]: E1212 17:43:11.355580 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ftsnw" podUID="9d6c6448-7253-4405-b42f-a3327529c933"
Dec 12 17:43:11.728286 sshd[5906]: Connection closed by 10.200.16.10 port 49206
Dec 12 17:43:11.728782 sshd-session[5903]: pam_unix(sshd:session): session closed for user core
Dec 12 17:43:11.733009 systemd-logind[1877]: Session 20 logged out. Waiting for processes to exit.
Dec 12 17:43:11.733130 systemd[1]: sshd@17-10.200.20.11:22-10.200.16.10:49206.service: Deactivated successfully.
Dec 12 17:43:11.738248 systemd[1]: session-20.scope: Deactivated successfully.
Dec 12 17:43:11.741696 systemd-logind[1877]: Removed session 20.
Dec 12 17:43:11.823602 systemd[1]: Started sshd@18-10.200.20.11:22-10.200.16.10:49214.service - OpenSSH per-connection server daemon (10.200.16.10:49214).
Dec 12 17:43:12.318921 sshd[5918]: Accepted publickey for core from 10.200.16.10 port 49214 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:43:12.320045 sshd-session[5918]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:43:12.323968 systemd-logind[1877]: New session 21 of user core.
Dec 12 17:43:12.334662 systemd[1]: Started session-21.scope - Session 21 of User core.
Dec 12 17:43:12.355749 containerd[1916]: time="2025-12-12T17:43:12.355479693Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\""
Dec 12 17:43:12.644439 containerd[1916]: time="2025-12-12T17:43:12.643996369Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 17:43:12.647712 containerd[1916]: time="2025-12-12T17:43:12.647615782Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found"
Dec 12 17:43:12.647712 containerd[1916]: time="2025-12-12T17:43:12.647670592Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85"
Dec 12 17:43:12.647891 kubelet[3527]: E1212 17:43:12.647835 3527 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 17:43:12.648265 kubelet[3527]: E1212 17:43:12.647898 3527 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 17:43:12.648265 kubelet[3527]: E1212 17:43:12.647971 3527 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-8645b67b4b-hg8jx_calico-system(cc75188a-1547-4b56-bf96-4cc7e7818e13): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError"
Dec 12 17:43:12.648265 kubelet[3527]: E1212 17:43:12.647998 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" podUID="cc75188a-1547-4b56-bf96-4cc7e7818e13"
Dec 12 17:43:12.713962 sshd[5921]: Connection closed by 10.200.16.10 port 49214
Dec 12 17:43:12.714324 sshd-session[5918]: pam_unix(sshd:session): session closed for user core
Dec 12 17:43:12.717959 systemd[1]: sshd@18-10.200.20.11:22-10.200.16.10:49214.service: Deactivated successfully.
Dec 12 17:43:12.721314 systemd[1]: session-21.scope: Deactivated successfully.
Dec 12 17:43:12.722614 systemd-logind[1877]: Session 21 logged out. Waiting for processes to exit.
Dec 12 17:43:12.724023 systemd-logind[1877]: Removed session 21.
Dec 12 17:43:14.355098 kubelet[3527]: E1212 17:43:14.355045 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476"
Dec 12 17:43:17.794615 systemd[1]: Started sshd@19-10.200.20.11:22-10.200.16.10:49222.service - OpenSSH per-connection server daemon (10.200.16.10:49222).
Dec 12 17:43:18.251443 sshd[5935]: Accepted publickey for core from 10.200.16.10 port 49222 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:43:18.252793 sshd-session[5935]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:43:18.256654 systemd-logind[1877]: New session 22 of user core.
Dec 12 17:43:18.262652 systemd[1]: Started session-22.scope - Session 22 of User core.
Dec 12 17:43:18.355223 kubelet[3527]: E1212 17:43:18.355173 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f998dd77f-h9sx8" podUID="ea69765d-2386-4a5a-bcbf-e5190366c632"
Dec 12 17:43:18.621845 sshd[5938]: Connection closed by 10.200.16.10 port 49222
Dec 12 17:43:18.622532 sshd-session[5935]: pam_unix(sshd:session): session closed for user core
Dec 12 17:43:18.625728 systemd[1]: sshd@19-10.200.20.11:22-10.200.16.10:49222.service: Deactivated successfully.
Dec 12 17:43:18.627611 systemd[1]: session-22.scope: Deactivated successfully.
Dec 12 17:43:18.628295 systemd-logind[1877]: Session 22 logged out. Waiting for processes to exit.
Dec 12 17:43:18.629996 systemd-logind[1877]: Removed session 22.
Dec 12 17:43:21.354748 kubelet[3527]: E1212 17:43:21.354704 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c"
Dec 12 17:43:22.355123 kubelet[3527]: E1212 17:43:22.354637 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" podUID="65781289-4c97-4182-9b09-b4c93c5b6dd1"
Dec 12 17:43:23.354638 kubelet[3527]: E1212 17:43:23.354343 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" podUID="cc75188a-1547-4b56-bf96-4cc7e7818e13"
Dec 12 17:43:23.714479 systemd[1]: Started sshd@20-10.200.20.11:22-10.200.16.10:52400.service - OpenSSH per-connection server daemon (10.200.16.10:52400).
Dec 12 17:43:24.208259 sshd[5950]: Accepted publickey for core from 10.200.16.10 port 52400 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:43:24.209870 sshd-session[5950]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:43:24.216738 systemd-logind[1877]: New session 23 of user core.
Dec 12 17:43:24.220985 systemd[1]: Started session-23.scope - Session 23 of User core.
Dec 12 17:43:24.606371 sshd[5953]: Connection closed by 10.200.16.10 port 52400
Dec 12 17:43:24.606208 sshd-session[5950]: pam_unix(sshd:session): session closed for user core
Dec 12 17:43:24.609336 systemd-logind[1877]: Session 23 logged out. Waiting for processes to exit.
Dec 12 17:43:24.609466 systemd[1]: sshd@20-10.200.20.11:22-10.200.16.10:52400.service: Deactivated successfully.
Dec 12 17:43:24.612473 systemd[1]: session-23.scope: Deactivated successfully.
Dec 12 17:43:24.614918 systemd-logind[1877]: Removed session 23.
Dec 12 17:43:25.357464 kubelet[3527]: E1212 17:43:25.357397 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476"
Dec 12 17:43:26.354995 kubelet[3527]: E1212 17:43:26.354947 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ftsnw" podUID="9d6c6448-7253-4405-b42f-a3327529c933"
Dec 12 17:43:29.355276 kubelet[3527]: E1212 17:43:29.354911 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f998dd77f-h9sx8" podUID="ea69765d-2386-4a5a-bcbf-e5190366c632"
Dec 12 17:43:29.696917 systemd[1]: Started sshd@21-10.200.20.11:22-10.200.16.10:52408.service - OpenSSH per-connection server daemon (10.200.16.10:52408).
Dec 12 17:43:30.189083 sshd[5967]: Accepted publickey for core from 10.200.16.10 port 52408 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:43:30.190775 sshd-session[5967]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:43:30.196205 systemd-logind[1877]: New session 24 of user core.
Dec 12 17:43:30.204668 systemd[1]: Started session-24.scope - Session 24 of User core.
Dec 12 17:43:30.630756 sshd[5970]: Connection closed by 10.200.16.10 port 52408
Dec 12 17:43:30.630598 sshd-session[5967]: pam_unix(sshd:session): session closed for user core
Dec 12 17:43:30.636546 systemd[1]: sshd@21-10.200.20.11:22-10.200.16.10:52408.service: Deactivated successfully.
Dec 12 17:43:30.639404 systemd[1]: session-24.scope: Deactivated successfully.
Dec 12 17:43:30.640631 systemd-logind[1877]: Session 24 logged out. Waiting for processes to exit.
Dec 12 17:43:30.642386 systemd-logind[1877]: Removed session 24.
Dec 12 17:43:33.354440 kubelet[3527]: E1212 17:43:33.354329 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c"
Dec 12 17:43:35.355616 kubelet[3527]: E1212 17:43:35.355295 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-pnxm5" podUID="65781289-4c97-4182-9b09-b4c93c5b6dd1"
Dec 12 17:43:35.727412 systemd[1]: Started sshd@22-10.200.20.11:22-10.200.16.10:49770.service - OpenSSH per-connection server daemon (10.200.16.10:49770).
Dec 12 17:43:36.227493 sshd[6006]: Accepted publickey for core from 10.200.16.10 port 49770 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:43:36.246543 sshd-session[6006]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:43:36.249990 systemd-logind[1877]: New session 25 of user core.
Dec 12 17:43:36.255636 systemd[1]: Started session-25.scope - Session 25 of User core.
Dec 12 17:43:36.623660 sshd[6010]: Connection closed by 10.200.16.10 port 49770
Dec 12 17:43:36.624304 sshd-session[6006]: pam_unix(sshd:session): session closed for user core
Dec 12 17:43:36.627837 systemd[1]: sshd@22-10.200.20.11:22-10.200.16.10:49770.service: Deactivated successfully.
Dec 12 17:43:36.629819 systemd[1]: session-25.scope: Deactivated successfully.
Dec 12 17:43:36.631265 systemd-logind[1877]: Session 25 logged out. Waiting for processes to exit.
Dec 12 17:43:36.632618 systemd-logind[1877]: Removed session 25.
Dec 12 17:43:37.356798 kubelet[3527]: E1212 17:43:37.356739 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-nhpd8" podUID="aae497cc-3748-48de-b6f7-6585350a2476"
Dec 12 17:43:38.354583 kubelet[3527]: E1212 17:43:38.354536 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-8645b67b4b-hg8jx" podUID="cc75188a-1547-4b56-bf96-4cc7e7818e13"
Dec 12 17:43:39.353700 kubelet[3527]: E1212 17:43:39.353398 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ftsnw" podUID="9d6c6448-7253-4405-b42f-a3327529c933"
Dec 12 17:43:41.356399 kubelet[3527]: E1212 17:43:41.356355 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-f998dd77f-h9sx8" podUID="ea69765d-2386-4a5a-bcbf-e5190366c632"
Dec 12 17:43:41.711704 systemd[1]: Started sshd@23-10.200.20.11:22-10.200.16.10:49536.service - OpenSSH per-connection server daemon (10.200.16.10:49536).
Dec 12 17:43:42.208987 sshd[6022]: Accepted publickey for core from 10.200.16.10 port 49536 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:43:42.210140 sshd-session[6022]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:43:42.213785 systemd-logind[1877]: New session 26 of user core.
Dec 12 17:43:42.221738 systemd[1]: Started session-26.scope - Session 26 of User core.
Dec 12 17:43:42.604493 sshd[6025]: Connection closed by 10.200.16.10 port 49536
Dec 12 17:43:42.605170 sshd-session[6022]: pam_unix(sshd:session): session closed for user core
Dec 12 17:43:42.608837 systemd[1]: sshd@23-10.200.20.11:22-10.200.16.10:49536.service: Deactivated successfully.
Dec 12 17:43:42.610633 systemd[1]: session-26.scope: Deactivated successfully.
Dec 12 17:43:42.611408 systemd-logind[1877]: Session 26 logged out. Waiting for processes to exit.
Dec 12 17:43:42.612794 systemd-logind[1877]: Removed session 26.
Dec 12 17:43:44.353730 kubelet[3527]: E1212 17:43:44.353681 3527 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cf495b795-djp4n" podUID="6c556bc6-ca1e-488b-87dc-eb6c6785bf8c"