Dec 12 17:38:40.075526 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Dec 12 17:38:40.075544 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Dec 12 15:20:48 -00 2025
Dec 12 17:38:40.075550 kernel: KASLR enabled
Dec 12 17:38:40.075554 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Dec 12 17:38:40.075558 kernel: printk: legacy bootconsole [pl11] enabled
Dec 12 17:38:40.075563 kernel: efi: EFI v2.7 by EDK II
Dec 12 17:38:40.075568 kernel: efi: ACPI 2.0=0x3f979018 SMBIOS=0x3f8a0000 SMBIOS 3.0=0x3f880000 MEMATTR=0x3e89d018 RNG=0x3f979998 MEMRESERVE=0x3db7d598
Dec 12 17:38:40.075572 kernel: random: crng init done
Dec 12 17:38:40.075576 kernel: secureboot: Secure boot disabled
Dec 12 17:38:40.075580 kernel: ACPI: Early table checksum verification disabled
Dec 12 17:38:40.075584 kernel: ACPI: RSDP 0x000000003F979018 000024 (v02 VRTUAL)
Dec 12 17:38:40.075588 kernel: ACPI: XSDT 0x000000003F979F18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:38:40.075592 kernel: ACPI: FACP 0x000000003F979C18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:38:40.075596 kernel: ACPI: DSDT 0x000000003F95A018 01E046 (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Dec 12 17:38:40.075602 kernel: ACPI: DBG2 0x000000003F979B18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:38:40.075606 kernel: ACPI: GTDT 0x000000003F979D98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:38:40.075610 kernel: ACPI: OEM0 0x000000003F979098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:38:40.075614 kernel: ACPI: SPCR 0x000000003F979A98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:38:40.075619 kernel: ACPI: APIC 0x000000003F979818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:38:40.075624 kernel: ACPI: SRAT 0x000000003F979198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:38:40.075628 kernel: ACPI: PPTT 0x000000003F979418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Dec 12 17:38:40.075632 kernel: ACPI: BGRT 0x000000003F979E98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Dec 12 17:38:40.075636 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Dec 12 17:38:40.075640 kernel: ACPI: Use ACPI SPCR as default console: Yes
Dec 12 17:38:40.075644 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Dec 12 17:38:40.075649 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Dec 12 17:38:40.075653 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Dec 12 17:38:40.075657 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Dec 12 17:38:40.075661 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Dec 12 17:38:40.075665 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Dec 12 17:38:40.075670 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Dec 12 17:38:40.075688 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Dec 12 17:38:40.075692 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Dec 12 17:38:40.075696 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Dec 12 17:38:40.075701 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Dec 12 17:38:40.075705 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Dec 12 17:38:40.075709 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Dec 12 17:38:40.075713 kernel: NODE_DATA(0) allocated [mem 0x1bf7ffa00-0x1bf806fff]
Dec 12 17:38:40.075717 kernel: Zone ranges:
Dec 12 17:38:40.075722 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Dec 12 17:38:40.075729 kernel: DMA32 empty
Dec 12 17:38:40.075733 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Dec 12 17:38:40.075738 kernel: Device empty
Dec 12 17:38:40.075742 kernel: Movable zone start for each node
Dec 12 17:38:40.075747 kernel: Early memory node ranges
Dec 12 17:38:40.075751 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Dec 12 17:38:40.075756 kernel: node 0: [mem 0x0000000000824000-0x000000003f38ffff]
Dec 12 17:38:40.075761 kernel: node 0: [mem 0x000000003f390000-0x000000003f93ffff]
Dec 12 17:38:40.075765 kernel: node 0: [mem 0x000000003f940000-0x000000003f9effff]
Dec 12 17:38:40.075769 kernel: node 0: [mem 0x000000003f9f0000-0x000000003fdeffff]
Dec 12 17:38:40.075774 kernel: node 0: [mem 0x000000003fdf0000-0x000000003fffffff]
Dec 12 17:38:40.075778 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Dec 12 17:38:40.075782 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Dec 12 17:38:40.075787 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Dec 12 17:38:40.075791 kernel: cma: Reserved 16 MiB at 0x000000003ca00000 on node -1
Dec 12 17:38:40.075795 kernel: psci: probing for conduit method from ACPI.
Dec 12 17:38:40.075800 kernel: psci: PSCIv1.3 detected in firmware.
Dec 12 17:38:40.075804 kernel: psci: Using standard PSCI v0.2 function IDs
Dec 12 17:38:40.075809 kernel: psci: MIGRATE_INFO_TYPE not supported.
Dec 12 17:38:40.075814 kernel: psci: SMC Calling Convention v1.4
Dec 12 17:38:40.075818 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Dec 12 17:38:40.075823 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Dec 12 17:38:40.075827 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Dec 12 17:38:40.075831 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Dec 12 17:38:40.075836 kernel: pcpu-alloc: [0] 0 [0] 1
Dec 12 17:38:40.075840 kernel: Detected PIPT I-cache on CPU0
Dec 12 17:38:40.075845 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Dec 12 17:38:40.075849 kernel: CPU features: detected: GIC system register CPU interface
Dec 12 17:38:40.075853 kernel: CPU features: detected: Spectre-v4
Dec 12 17:38:40.075858 kernel: CPU features: detected: Spectre-BHB
Dec 12 17:38:40.075863 kernel: CPU features: kernel page table isolation forced ON by KASLR
Dec 12 17:38:40.075867 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Dec 12 17:38:40.075872 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Dec 12 17:38:40.075876 kernel: CPU features: detected: SSBS not fully self-synchronizing
Dec 12 17:38:40.075881 kernel: alternatives: applying boot alternatives
Dec 12 17:38:40.075886 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52
Dec 12 17:38:40.075891 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 12 17:38:40.075895 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 12 17:38:40.075899 kernel: Fallback order for Node 0: 0
Dec 12 17:38:40.075904 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Dec 12 17:38:40.075909 kernel: Policy zone: Normal
Dec 12 17:38:40.075913 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 12 17:38:40.075918 kernel: software IO TLB: area num 2.
Dec 12 17:38:40.075922 kernel: software IO TLB: mapped [mem 0x0000000035900000-0x0000000039900000] (64MB)
Dec 12 17:38:40.075926 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Dec 12 17:38:40.075931 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 12 17:38:40.075936 kernel: rcu: RCU event tracing is enabled.
Dec 12 17:38:40.075940 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Dec 12 17:38:40.075945 kernel: Trampoline variant of Tasks RCU enabled.
Dec 12 17:38:40.075949 kernel: Tracing variant of Tasks RCU enabled.
Dec 12 17:38:40.075954 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 12 17:38:40.075958 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Dec 12 17:38:40.075964 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 12 17:38:40.075968 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 12 17:38:40.075973 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Dec 12 17:38:40.075977 kernel: GICv3: 960 SPIs implemented
Dec 12 17:38:40.075981 kernel: GICv3: 0 Extended SPIs implemented
Dec 12 17:38:40.075985 kernel: Root IRQ handler: gic_handle_irq
Dec 12 17:38:40.075990 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Dec 12 17:38:40.075994 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Dec 12 17:38:40.075998 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Dec 12 17:38:40.076003 kernel: ITS: No ITS available, not enabling LPIs
Dec 12 17:38:40.076007 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 12 17:38:40.076012 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Dec 12 17:38:40.076017 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 12 17:38:40.076021 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Dec 12 17:38:40.076026 kernel: Console: colour dummy device 80x25
Dec 12 17:38:40.076031 kernel: printk: legacy console [tty1] enabled
Dec 12 17:38:40.076035 kernel: ACPI: Core revision 20240827
Dec 12 17:38:40.076040 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Dec 12 17:38:40.076044 kernel: pid_max: default: 32768 minimum: 301
Dec 12 17:38:40.076049 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 12 17:38:40.076053 kernel: landlock: Up and running.
Dec 12 17:38:40.076058 kernel: SELinux: Initializing.
Dec 12 17:38:40.076063 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Dec 12 17:38:40.076068 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Dec 12 17:38:40.076072 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0xa0000e, misc 0x31e1
Dec 12 17:38:40.076077 kernel: Hyper-V: Host Build 10.0.26102.1172-1-0
Dec 12 17:38:40.076085 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Dec 12 17:38:40.076090 kernel: rcu: Hierarchical SRCU implementation.
Dec 12 17:38:40.076095 kernel: rcu: Max phase no-delay instances is 400.
Dec 12 17:38:40.076100 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 12 17:38:40.076105 kernel: Remapping and enabling EFI services.
Dec 12 17:38:40.076109 kernel: smp: Bringing up secondary CPUs ...
Dec 12 17:38:40.076114 kernel: Detected PIPT I-cache on CPU1
Dec 12 17:38:40.076120 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Dec 12 17:38:40.076125 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Dec 12 17:38:40.076129 kernel: smp: Brought up 1 node, 2 CPUs
Dec 12 17:38:40.076134 kernel: SMP: Total of 2 processors activated.
Dec 12 17:38:40.076139 kernel: CPU: All CPU(s) started at EL1
Dec 12 17:38:40.076144 kernel: CPU features: detected: 32-bit EL0 Support
Dec 12 17:38:40.076149 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Dec 12 17:38:40.076154 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Dec 12 17:38:40.076159 kernel: CPU features: detected: Common not Private translations
Dec 12 17:38:40.076163 kernel: CPU features: detected: CRC32 instructions
Dec 12 17:38:40.076168 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Dec 12 17:38:40.076173 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Dec 12 17:38:40.076178 kernel: CPU features: detected: LSE atomic instructions
Dec 12 17:38:40.076183 kernel: CPU features: detected: Privileged Access Never
Dec 12 17:38:40.076188 kernel: CPU features: detected: Speculation barrier (SB)
Dec 12 17:38:40.076193 kernel: CPU features: detected: TLB range maintenance instructions
Dec 12 17:38:40.076198 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Dec 12 17:38:40.076203 kernel: CPU features: detected: Scalable Vector Extension
Dec 12 17:38:40.076207 kernel: alternatives: applying system-wide alternatives
Dec 12 17:38:40.076212 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Dec 12 17:38:40.076217 kernel: SVE: maximum available vector length 16 bytes per vector
Dec 12 17:38:40.076222 kernel: SVE: default vector length 16 bytes per vector
Dec 12 17:38:40.076226 kernel: Memory: 3952828K/4194160K available (11200K kernel code, 2456K rwdata, 9084K rodata, 39552K init, 1038K bss, 220144K reserved, 16384K cma-reserved)
Dec 12 17:38:40.076232 kernel: devtmpfs: initialized
Dec 12 17:38:40.076237 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 12 17:38:40.076242 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Dec 12 17:38:40.076246 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Dec 12 17:38:40.076251 kernel: 0 pages in range for non-PLT usage
Dec 12 17:38:40.076256 kernel: 508400 pages in range for PLT usage
Dec 12 17:38:40.076260 kernel: pinctrl core: initialized pinctrl subsystem
Dec 12 17:38:40.076265 kernel: SMBIOS 3.1.0 present.
Dec 12 17:38:40.076271 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 06/10/2025
Dec 12 17:38:40.076275 kernel: DMI: Memory slots populated: 2/2
Dec 12 17:38:40.076280 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 12 17:38:40.076285 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Dec 12 17:38:40.076290 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 12 17:38:40.076295 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 12 17:38:40.076299 kernel: audit: initializing netlink subsys (disabled)
Dec 12 17:38:40.076304 kernel: audit: type=2000 audit(0.059:1): state=initialized audit_enabled=0 res=1
Dec 12 17:38:40.076309 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 12 17:38:40.076314 kernel: cpuidle: using governor menu
Dec 12 17:38:40.076319 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Dec 12 17:38:40.076324 kernel: ASID allocator initialised with 32768 entries
Dec 12 17:38:40.076329 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 12 17:38:40.076333 kernel: Serial: AMBA PL011 UART driver
Dec 12 17:38:40.076338 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 12 17:38:40.076343 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Dec 12 17:38:40.076347 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Dec 12 17:38:40.076352 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Dec 12 17:38:40.076358 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 12 17:38:40.076362 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Dec 12 17:38:40.076367 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Dec 12 17:38:40.076372 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Dec 12 17:38:40.076376 kernel: ACPI: Added _OSI(Module Device)
Dec 12 17:38:40.076381 kernel: ACPI: Added _OSI(Processor Device)
Dec 12 17:38:40.076386 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 12 17:38:40.076390 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 12 17:38:40.076395 kernel: ACPI: Interpreter enabled
Dec 12 17:38:40.076401 kernel: ACPI: Using GIC for interrupt routing
Dec 12 17:38:40.076405 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Dec 12 17:38:40.076410 kernel: printk: legacy console [ttyAMA0] enabled
Dec 12 17:38:40.076415 kernel: printk: legacy bootconsole [pl11] disabled
Dec 12 17:38:40.076420 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Dec 12 17:38:40.076424 kernel: ACPI: CPU0 has been hot-added
Dec 12 17:38:40.076429 kernel: ACPI: CPU1 has been hot-added
Dec 12 17:38:40.076434 kernel: iommu: Default domain type: Translated
Dec 12 17:38:40.076438 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Dec 12 17:38:40.076444 kernel: efivars: Registered efivars operations
Dec 12 17:38:40.076449 kernel: vgaarb: loaded
Dec 12 17:38:40.076453 kernel: clocksource: Switched to clocksource arch_sys_counter
Dec 12 17:38:40.076458 kernel: VFS: Disk quotas dquot_6.6.0
Dec 12 17:38:40.076463 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 12 17:38:40.076468 kernel: pnp: PnP ACPI init
Dec 12 17:38:40.076472 kernel: pnp: PnP ACPI: found 0 devices
Dec 12 17:38:40.076477 kernel: NET: Registered PF_INET protocol family
Dec 12 17:38:40.076482 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 12 17:38:40.076487 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Dec 12 17:38:40.076492 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 12 17:38:40.076497 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 12 17:38:40.076502 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Dec 12 17:38:40.076507 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Dec 12 17:38:40.076511 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Dec 12 17:38:40.076516 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Dec 12 17:38:40.076521 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 12 17:38:40.076526 kernel: PCI: CLS 0 bytes, default 64
Dec 12 17:38:40.076530 kernel: kvm [1]: HYP mode not available
Dec 12 17:38:40.076536 kernel: Initialise system trusted keyrings
Dec 12 17:38:40.076541 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Dec 12 17:38:40.076545 kernel: Key type asymmetric registered
Dec 12 17:38:40.076550 kernel: Asymmetric key parser 'x509' registered
Dec 12 17:38:40.076555 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Dec 12 17:38:40.076559 kernel: io scheduler mq-deadline registered
Dec 12 17:38:40.076564 kernel: io scheduler kyber registered
Dec 12 17:38:40.076569 kernel: io scheduler bfq registered
Dec 12 17:38:40.076574 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 12 17:38:40.076579 kernel: thunder_xcv, ver 1.0
Dec 12 17:38:40.076584 kernel: thunder_bgx, ver 1.0
Dec 12 17:38:40.076588 kernel: nicpf, ver 1.0
Dec 12 17:38:40.076593 kernel: nicvf, ver 1.0
Dec 12 17:38:40.076717 kernel: rtc-efi rtc-efi.0: registered as rtc0
Dec 12 17:38:40.076771 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-12T17:38:39 UTC (1765561119)
Dec 12 17:38:40.076778 kernel: efifb: probing for efifb
Dec 12 17:38:40.076785 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Dec 12 17:38:40.076790 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Dec 12 17:38:40.076794 kernel: efifb: scrolling: redraw
Dec 12 17:38:40.076799 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Dec 12 17:38:40.076804 kernel: Console: switching to colour frame buffer device 128x48
Dec 12 17:38:40.076809 kernel: fb0: EFI VGA frame buffer device
Dec 12 17:38:40.076813 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Dec 12 17:38:40.076818 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 12 17:38:40.076823 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Dec 12 17:38:40.076829 kernel: watchdog: NMI not fully supported
Dec 12 17:38:40.076834 kernel: watchdog: Hard watchdog permanently disabled
Dec 12 17:38:40.076838 kernel: NET: Registered PF_INET6 protocol family
Dec 12 17:38:40.076843 kernel: Segment Routing with IPv6
Dec 12 17:38:40.076848 kernel: In-situ OAM (IOAM) with IPv6
Dec 12 17:38:40.076853 kernel: NET: Registered PF_PACKET protocol family
Dec 12 17:38:40.076857 kernel: Key type dns_resolver registered
Dec 12 17:38:40.076862 kernel: registered taskstats version 1
Dec 12 17:38:40.076867 kernel: Loading compiled-in X.509 certificates
Dec 12 17:38:40.076872 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 92f3a94fb747a7ba7cbcfde1535be91b86f9429a'
Dec 12 17:38:40.076878 kernel: Demotion targets for Node 0: null
Dec 12 17:38:40.076883 kernel: Key type .fscrypt registered
Dec 12 17:38:40.076887 kernel: Key type fscrypt-provisioning registered
Dec 12 17:38:40.076892 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 12 17:38:40.076897 kernel: ima: Allocated hash algorithm: sha1
Dec 12 17:38:40.076901 kernel: ima: No architecture policies found
Dec 12 17:38:40.076906 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Dec 12 17:38:40.076911 kernel: clk: Disabling unused clocks
Dec 12 17:38:40.076916 kernel: PM: genpd: Disabling unused power domains
Dec 12 17:38:40.076921 kernel: Warning: unable to open an initial console.
Dec 12 17:38:40.076926 kernel: Freeing unused kernel memory: 39552K
Dec 12 17:38:40.076931 kernel: Run /init as init process
Dec 12 17:38:40.076935 kernel: with arguments:
Dec 12 17:38:40.076940 kernel: /init
Dec 12 17:38:40.076945 kernel: with environment:
Dec 12 17:38:40.076949 kernel: HOME=/
Dec 12 17:38:40.076954 kernel: TERM=linux
Dec 12 17:38:40.076960 systemd[1]: Successfully made /usr/ read-only.
Dec 12 17:38:40.076967 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 12 17:38:40.076973 systemd[1]: Detected virtualization microsoft.
Dec 12 17:38:40.076978 systemd[1]: Detected architecture arm64.
Dec 12 17:38:40.076983 systemd[1]: Running in initrd.
Dec 12 17:38:40.076988 systemd[1]: No hostname configured, using default hostname.
Dec 12 17:38:40.076993 systemd[1]: Hostname set to .
Dec 12 17:38:40.076998 systemd[1]: Initializing machine ID from random generator.
Dec 12 17:38:40.077004 systemd[1]: Queued start job for default target initrd.target.
Dec 12 17:38:40.077009 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 12 17:38:40.077015 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 12 17:38:40.077020 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Dec 12 17:38:40.077026 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 12 17:38:40.077031 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Dec 12 17:38:40.077037 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Dec 12 17:38:40.077043 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Dec 12 17:38:40.077049 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Dec 12 17:38:40.077054 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 12 17:38:40.077059 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 12 17:38:40.077064 systemd[1]: Reached target paths.target - Path Units.
Dec 12 17:38:40.077069 systemd[1]: Reached target slices.target - Slice Units.
Dec 12 17:38:40.077074 systemd[1]: Reached target swap.target - Swaps.
Dec 12 17:38:40.077080 systemd[1]: Reached target timers.target - Timer Units.
Dec 12 17:38:40.077086 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Dec 12 17:38:40.077091 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 12 17:38:40.077096 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Dec 12 17:38:40.077101 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Dec 12 17:38:40.077106 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 12 17:38:40.077112 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 12 17:38:40.077117 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 12 17:38:40.077122 systemd[1]: Reached target sockets.target - Socket Units.
Dec 12 17:38:40.077127 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Dec 12 17:38:40.077134 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 12 17:38:40.077139 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Dec 12 17:38:40.077144 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Dec 12 17:38:40.077149 systemd[1]: Starting systemd-fsck-usr.service...
Dec 12 17:38:40.077154 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 12 17:38:40.077160 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 12 17:38:40.077176 systemd-journald[225]: Collecting audit messages is disabled.
Dec 12 17:38:40.077191 systemd-journald[225]: Journal started
Dec 12 17:38:40.077205 systemd-journald[225]: Runtime Journal (/run/log/journal/56c721c0433f4998b7ea65cad9698d07) is 8M, max 78.3M, 70.3M free.
Dec 12 17:38:40.079712 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 17:38:40.084621 systemd-modules-load[227]: Inserted module 'overlay'
Dec 12 17:38:40.102477 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 12 17:38:40.103000 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Dec 12 17:38:40.124771 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 12 17:38:40.124792 kernel: Bridge firewalling registered
Dec 12 17:38:40.114351 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 12 17:38:40.123343 systemd-modules-load[227]: Inserted module 'br_netfilter'
Dec 12 17:38:40.127226 systemd[1]: Finished systemd-fsck-usr.service.
Dec 12 17:38:40.134985 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 12 17:38:40.143046 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 17:38:40.153816 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Dec 12 17:38:40.161201 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 12 17:38:40.183146 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 12 17:38:40.189825 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 12 17:38:40.212046 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 12 17:38:40.224571 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 12 17:38:40.230854 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 12 17:38:40.233924 systemd-tmpfiles[245]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Dec 12 17:38:40.236089 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 17:38:40.247521 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Dec 12 17:38:40.275353 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 12 17:38:40.284668 dracut-cmdline[261]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=361f5baddf90aee3bc7ee7e9be879bc0cc94314f224faa1e2791d9b44cd3ec52
Dec 12 17:38:40.311346 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 12 17:38:40.329430 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 12 17:38:40.350315 systemd-resolved[264]: Positive Trust Anchors:
Dec 12 17:38:40.350335 systemd-resolved[264]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 12 17:38:40.350355 systemd-resolved[264]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 12 17:38:40.352019 systemd-resolved[264]: Defaulting to hostname 'linux'.
Dec 12 17:38:40.353636 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 12 17:38:40.365074 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 12 17:38:40.446708 kernel: SCSI subsystem initialized
Dec 12 17:38:40.452687 kernel: Loading iSCSI transport class v2.0-870.
Dec 12 17:38:40.459686 kernel: iscsi: registered transport (tcp)
Dec 12 17:38:40.472818 kernel: iscsi: registered transport (qla4xxx)
Dec 12 17:38:40.472868 kernel: QLogic iSCSI HBA Driver
Dec 12 17:38:40.486318 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 12 17:38:40.506239 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 12 17:38:40.513351 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 12 17:38:40.558295 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Dec 12 17:38:40.563831 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Dec 12 17:38:40.621693 kernel: raid6: neonx8 gen() 18552 MB/s
Dec 12 17:38:40.640684 kernel: raid6: neonx4 gen() 18569 MB/s
Dec 12 17:38:40.659679 kernel: raid6: neonx2 gen() 17075 MB/s
Dec 12 17:38:40.679681 kernel: raid6: neonx1 gen() 15068 MB/s
Dec 12 17:38:40.698680 kernel: raid6: int64x8 gen() 10545 MB/s
Dec 12 17:38:40.717766 kernel: raid6: int64x4 gen() 10609 MB/s
Dec 12 17:38:40.737700 kernel: raid6: int64x2 gen() 8991 MB/s
Dec 12 17:38:40.759213 kernel: raid6: int64x1 gen() 6993 MB/s
Dec 12 17:38:40.759224 kernel: raid6: using algorithm neonx4 gen() 18569 MB/s
Dec 12 17:38:40.781285 kernel: raid6: .... xor() 15131 MB/s, rmw enabled
Dec 12 17:38:40.781336 kernel: raid6: using neon recovery algorithm
Dec 12 17:38:40.789430 kernel: xor: measuring software checksum speed
Dec 12 17:38:40.789438 kernel: 8regs : 28661 MB/sec
Dec 12 17:38:40.795994 kernel: 32regs : 27054 MB/sec
Dec 12 17:38:40.796003 kernel: arm64_neon : 37676 MB/sec
Dec 12 17:38:40.799159 kernel: xor: using function: arm64_neon (37676 MB/sec)
Dec 12 17:38:40.836692 kernel: Btrfs loaded, zoned=no, fsverity=no
Dec 12 17:38:40.843721 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Dec 12 17:38:40.853155 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 17:38:40.882727 systemd-udevd[473]: Using default interface naming scheme 'v255'.
Dec 12 17:38:40.886941 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 17:38:40.899641 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Dec 12 17:38:40.926932 dracut-pre-trigger[484]: rd.md=0: removing MD RAID activation
Dec 12 17:38:40.947174 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 12 17:38:40.953557 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 12 17:38:41.003911 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 17:38:41.010141 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 12 17:38:41.070690 kernel: hv_vmbus: Vmbus version:5.3 Dec 12 17:38:41.081527 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 17:38:41.103375 kernel: hv_vmbus: registering driver hyperv_keyboard Dec 12 17:38:41.103394 kernel: pps_core: LinuxPPS API ver. 1 registered Dec 12 17:38:41.103401 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Dec 12 17:38:41.103408 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0 Dec 12 17:38:41.103415 kernel: PTP clock support registered Dec 12 17:38:41.099823 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:38:41.118727 kernel: hv_vmbus: registering driver hv_storvsc Dec 12 17:38:41.119691 kernel: hv_utils: Registering HyperV Utility Driver Dec 12 17:38:41.122066 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
Dec 12 17:38:41.146339 kernel: scsi host0: storvsc_host_t Dec 12 17:38:41.165755 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5 Dec 12 17:38:41.165897 kernel: hv_vmbus: registering driver hv_utils Dec 12 17:38:41.165906 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 Dec 12 17:38:41.165977 kernel: scsi host1: storvsc_host_t Dec 12 17:38:41.166056 kernel: hv_utils: Heartbeat IC version 3.0 Dec 12 17:38:41.166063 kernel: hv_vmbus: registering driver hid_hyperv Dec 12 17:38:41.166069 kernel: hv_utils: Shutdown IC version 3.2 Dec 12 17:38:41.166075 kernel: hv_utils: TimeSync IC version 4.0 Dec 12 17:38:41.166081 kernel: hv_vmbus: registering driver hv_netvsc Dec 12 17:38:41.166087 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1 Dec 12 17:38:41.141387 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:38:41.631463 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Dec 12 17:38:41.609710 systemd-resolved[264]: Clock change detected. Flushing caches. Dec 12 17:38:41.618137 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 12 17:38:41.639130 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 17:38:41.667204 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB) Dec 12 17:38:41.667351 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Dec 12 17:38:41.639208 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Dec 12 17:38:41.677173 kernel: sd 0:0:0:0: [sda] Write Protect is off Dec 12 17:38:41.677298 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00 Dec 12 17:38:41.677363 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA Dec 12 17:38:41.668098 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 12 17:38:41.703143 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#257 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001 Dec 12 17:38:41.703294 kernel: hv_netvsc 000d3ac2-b520-000d-3ac2-b520000d3ac2 eth0: VF slot 1 added Dec 12 17:38:41.703359 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#264 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001 Dec 12 17:38:41.668720 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:38:41.722304 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 12 17:38:41.729486 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Dec 12 17:38:41.729632 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Dec 12 17:38:41.732761 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Dec 12 17:38:41.729793 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Dec 12 17:38:41.750583 kernel: hv_vmbus: registering driver hv_pci Dec 12 17:38:41.750597 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Dec 12 17:38:41.750716 kernel: hv_pci 4b69c9a1-9295-49f3-98f9-133e274d1141: PCI VMBus probing: Using version 0x10004 Dec 12 17:38:41.760764 kernel: hv_pci 4b69c9a1-9295-49f3-98f9-133e274d1141: PCI host bridge to bus 9295:00 Dec 12 17:38:41.760861 kernel: pci_bus 9295:00: root bus resource [mem 0xfc0000000-0xfc00fffff window] Dec 12 17:38:41.765443 kernel: pci_bus 9295:00: No busn resource found for root bus, will use [bus 00-ff] Dec 12 17:38:41.767274 kernel: pci 9295:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint Dec 12 17:38:41.778307 kernel: pci 9295:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref] Dec 12 17:38:41.785256 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#282 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Dec 12 17:38:41.789258 kernel: pci 9295:00:02.0: enabling Extended Tags Dec 12 17:38:41.807236 kernel: pci 9295:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 9295:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link) Dec 12 17:38:41.807384 kernel: pci_bus 9295:00: busn_res: [bus 00-ff] end is updated to 00 Dec 12 17:38:41.814807 kernel: pci 9295:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned Dec 12 17:38:41.826234 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#257 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Dec 12 17:38:41.881712 kernel: mlx5_core 9295:00:02.0: enabling device (0000 -> 0002) Dec 12 17:38:41.890747 kernel: mlx5_core 9295:00:02.0: PTM is not supported by PCIe Dec 12 17:38:41.890871 kernel: mlx5_core 9295:00:02.0: firmware version: 16.30.5006 Dec 12 17:38:42.062537 kernel: hv_netvsc 000d3ac2-b520-000d-3ac2-b520000d3ac2 eth0: VF registering: eth1 Dec 12 17:38:42.062719 kernel: mlx5_core 9295:00:02.0 eth1: joined to eth0 Dec 12 17:38:42.069276 kernel: mlx5_core 9295:00:02.0: MLX5E: StrdRq(1) 
RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic) Dec 12 17:38:42.078233 kernel: mlx5_core 9295:00:02.0 enP37525s1: renamed from eth1 Dec 12 17:38:42.275214 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM. Dec 12 17:38:42.374253 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT. Dec 12 17:38:42.386176 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Dec 12 17:38:42.410235 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A. Dec 12 17:38:42.415253 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A. Dec 12 17:38:42.427754 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 12 17:38:42.595271 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 12 17:38:42.600114 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 12 17:38:42.609164 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 17:38:42.618490 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 12 17:38:42.627963 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 12 17:38:42.651174 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 12 17:38:43.463033 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#310 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001 Dec 12 17:38:43.476237 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 12 17:38:43.477046 disk-uuid[648]: The operation has completed successfully. Dec 12 17:38:43.544005 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 12 17:38:43.545240 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 12 17:38:43.576796 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
Dec 12 17:38:43.594340 sh[825]: Success Dec 12 17:38:43.628363 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 12 17:38:43.628406 kernel: device-mapper: uevent: version 1.0.3 Dec 12 17:38:43.633181 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 12 17:38:43.643242 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Dec 12 17:38:43.914758 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Dec 12 17:38:43.919681 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Dec 12 17:38:43.940264 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Dec 12 17:38:43.962734 kernel: BTRFS: device fsid 6d6d314d-b8a1-4727-8a34-8525e276a248 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (843) Dec 12 17:38:43.962779 kernel: BTRFS info (device dm-0): first mount of filesystem 6d6d314d-b8a1-4727-8a34-8525e276a248 Dec 12 17:38:43.967134 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Dec 12 17:38:44.278852 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 12 17:38:44.278934 kernel: BTRFS info (device dm-0): enabling free space tree Dec 12 17:38:44.312064 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Dec 12 17:38:44.316412 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 12 17:38:44.323950 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 12 17:38:44.324620 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 12 17:38:44.350831 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Dec 12 17:38:44.380235 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (866) Dec 12 17:38:44.391575 kernel: BTRFS info (device sda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:38:44.391620 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 12 17:38:44.442918 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 12 17:38:44.453887 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 12 17:38:44.487686 systemd-networkd[1006]: lo: Link UP Dec 12 17:38:44.487699 systemd-networkd[1006]: lo: Gained carrier Dec 12 17:38:44.488418 systemd-networkd[1006]: Enumeration completed Dec 12 17:38:44.490481 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 12 17:38:44.493824 systemd-networkd[1006]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:38:44.493827 systemd-networkd[1006]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 17:38:44.498572 systemd[1]: Reached target network.target - Network. Dec 12 17:38:44.534791 kernel: BTRFS info (device sda6): turning on async discard Dec 12 17:38:44.534824 kernel: BTRFS info (device sda6): enabling free space tree Dec 12 17:38:44.543255 kernel: BTRFS info (device sda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:38:44.544297 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 12 17:38:44.549973 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Dec 12 17:38:44.598236 kernel: mlx5_core 9295:00:02.0 enP37525s1: Link up Dec 12 17:38:44.630510 kernel: hv_netvsc 000d3ac2-b520-000d-3ac2-b520000d3ac2 eth0: Data path switched to VF: enP37525s1 Dec 12 17:38:44.630265 systemd-networkd[1006]: enP37525s1: Link UP Dec 12 17:38:44.630325 systemd-networkd[1006]: eth0: Link UP Dec 12 17:38:44.630392 systemd-networkd[1006]: eth0: Gained carrier Dec 12 17:38:44.630406 systemd-networkd[1006]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:38:44.647530 systemd-networkd[1006]: enP37525s1: Gained carrier Dec 12 17:38:44.664259 systemd-networkd[1006]: eth0: DHCPv4 address 10.200.20.10/24, gateway 10.200.20.1 acquired from 168.63.129.16 Dec 12 17:38:45.947732 ignition[1014]: Ignition 2.22.0 Dec 12 17:38:45.947751 ignition[1014]: Stage: fetch-offline Dec 12 17:38:45.950725 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 12 17:38:45.947841 ignition[1014]: no configs at "/usr/lib/ignition/base.d" Dec 12 17:38:45.960400 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Dec 12 17:38:45.947848 ignition[1014]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Dec 12 17:38:45.947910 ignition[1014]: parsed url from cmdline: "" Dec 12 17:38:45.947913 ignition[1014]: no config URL provided Dec 12 17:38:45.947916 ignition[1014]: reading system config file "/usr/lib/ignition/user.ign" Dec 12 17:38:45.947921 ignition[1014]: no config at "/usr/lib/ignition/user.ign" Dec 12 17:38:45.947924 ignition[1014]: failed to fetch config: resource requires networking Dec 12 17:38:45.948121 ignition[1014]: Ignition finished successfully Dec 12 17:38:45.992988 ignition[1024]: Ignition 2.22.0 Dec 12 17:38:45.992993 ignition[1024]: Stage: fetch Dec 12 17:38:45.993164 ignition[1024]: no configs at "/usr/lib/ignition/base.d" Dec 12 17:38:45.993171 ignition[1024]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Dec 12 17:38:45.993321 ignition[1024]: parsed url from cmdline: "" Dec 12 17:38:45.993324 ignition[1024]: no config URL provided Dec 12 17:38:45.993328 ignition[1024]: reading system config file "/usr/lib/ignition/user.ign" Dec 12 17:38:45.993336 ignition[1024]: no config at "/usr/lib/ignition/user.ign" Dec 12 17:38:45.993353 ignition[1024]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Dec 12 17:38:46.085980 ignition[1024]: GET result: OK Dec 12 17:38:46.086066 ignition[1024]: config has been read from IMDS userdata Dec 12 17:38:46.089301 unknown[1024]: fetched base config from "system" Dec 12 17:38:46.086087 ignition[1024]: parsing config with SHA512: 625d7d027cdff182e9498162f31eed8e323b553b8ac4db95b3fb2d2191bd25072a14c579b9828bc03923d84f906f274dbb73f4b9144213e55292d6ad207dbbb2 Dec 12 17:38:46.089312 unknown[1024]: fetched base config from "system" Dec 12 17:38:46.089620 ignition[1024]: fetch: fetch complete Dec 12 17:38:46.089315 unknown[1024]: fetched user config from "azure" Dec 12 17:38:46.089624 ignition[1024]: fetch: fetch passed Dec 12 17:38:46.091455 
systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 12 17:38:46.089659 ignition[1024]: Ignition finished successfully Dec 12 17:38:46.101928 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 12 17:38:46.144531 ignition[1031]: Ignition 2.22.0 Dec 12 17:38:46.144546 ignition[1031]: Stage: kargs Dec 12 17:38:46.150527 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Dec 12 17:38:46.144706 ignition[1031]: no configs at "/usr/lib/ignition/base.d" Dec 12 17:38:46.156195 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 12 17:38:46.144714 ignition[1031]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Dec 12 17:38:46.145157 ignition[1031]: kargs: kargs passed Dec 12 17:38:46.145193 ignition[1031]: Ignition finished successfully Dec 12 17:38:46.190407 ignition[1037]: Ignition 2.22.0 Dec 12 17:38:46.190421 ignition[1037]: Stage: disks Dec 12 17:38:46.194636 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 12 17:38:46.190591 ignition[1037]: no configs at "/usr/lib/ignition/base.d" Dec 12 17:38:46.201624 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 12 17:38:46.190599 ignition[1037]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Dec 12 17:38:46.209660 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 12 17:38:46.191141 ignition[1037]: disks: disks passed Dec 12 17:38:46.218477 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 12 17:38:46.191184 ignition[1037]: Ignition finished successfully Dec 12 17:38:46.226965 systemd[1]: Reached target sysinit.target - System Initialization. Dec 12 17:38:46.235367 systemd[1]: Reached target basic.target - Basic System. Dec 12 17:38:46.244739 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Dec 12 17:38:46.342415 systemd-networkd[1006]: eth0: Gained IPv6LL Dec 12 17:38:46.350500 systemd-fsck[1045]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks Dec 12 17:38:46.359971 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 12 17:38:46.371709 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 12 17:38:46.618248 kernel: EXT4-fs (sda9): mounted filesystem 895d7845-d0e8-43ae-a778-7804b473b868 r/w with ordered data mode. Quota mode: none. Dec 12 17:38:46.618300 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 12 17:38:46.622395 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 12 17:38:46.647682 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 12 17:38:46.665735 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 12 17:38:46.673939 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Dec 12 17:38:46.684767 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 12 17:38:46.714983 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1059) Dec 12 17:38:46.715005 kernel: BTRFS info (device sda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:38:46.715013 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 12 17:38:46.715019 kernel: BTRFS info (device sda6): turning on async discard Dec 12 17:38:46.684797 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 12 17:38:46.727445 kernel: BTRFS info (device sda6): enabling free space tree Dec 12 17:38:46.727356 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 12 17:38:46.731668 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 12 17:38:46.747720 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Dec 12 17:38:47.406342 coreos-metadata[1061]: Dec 12 17:38:47.406 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Dec 12 17:38:47.415075 coreos-metadata[1061]: Dec 12 17:38:47.415 INFO Fetch successful Dec 12 17:38:47.419263 coreos-metadata[1061]: Dec 12 17:38:47.419 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Dec 12 17:38:47.429243 coreos-metadata[1061]: Dec 12 17:38:47.428 INFO Fetch successful Dec 12 17:38:47.429243 coreos-metadata[1061]: Dec 12 17:38:47.428 INFO wrote hostname ci-4459.2.2-a-260bc0236d to /sysroot/etc/hostname Dec 12 17:38:47.436314 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Dec 12 17:38:47.577858 initrd-setup-root[1089]: cut: /sysroot/etc/passwd: No such file or directory Dec 12 17:38:47.616603 initrd-setup-root[1096]: cut: /sysroot/etc/group: No such file or directory Dec 12 17:38:47.639647 initrd-setup-root[1103]: cut: /sysroot/etc/shadow: No such file or directory Dec 12 17:38:47.647753 initrd-setup-root[1110]: cut: /sysroot/etc/gshadow: No such file or directory Dec 12 17:38:48.532284 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 12 17:38:48.538129 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 12 17:38:48.555834 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 12 17:38:48.564913 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Dec 12 17:38:48.575780 kernel: BTRFS info (device sda6): last unmount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:38:48.597770 ignition[1177]: INFO : Ignition 2.22.0 Dec 12 17:38:48.602671 ignition[1177]: INFO : Stage: mount Dec 12 17:38:48.602671 ignition[1177]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 17:38:48.602671 ignition[1177]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Dec 12 17:38:48.602671 ignition[1177]: INFO : mount: mount passed Dec 12 17:38:48.602671 ignition[1177]: INFO : Ignition finished successfully Dec 12 17:38:48.602601 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 12 17:38:48.606619 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 12 17:38:48.612389 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 12 17:38:48.634340 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 12 17:38:48.667240 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1190) Dec 12 17:38:48.677205 kernel: BTRFS info (device sda6): first mount of filesystem 4b8ce5a5-a2aa-4c44-bc9b-80e30d06d25f Dec 12 17:38:48.677218 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 12 17:38:48.686690 kernel: BTRFS info (device sda6): turning on async discard Dec 12 17:38:48.686716 kernel: BTRFS info (device sda6): enabling free space tree Dec 12 17:38:48.688266 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Dec 12 17:38:48.719757 ignition[1207]: INFO : Ignition 2.22.0 Dec 12 17:38:48.723951 ignition[1207]: INFO : Stage: files Dec 12 17:38:48.723951 ignition[1207]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 17:38:48.723951 ignition[1207]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Dec 12 17:38:48.723951 ignition[1207]: DEBUG : files: compiled without relabeling support, skipping Dec 12 17:38:48.742293 ignition[1207]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 12 17:38:48.742293 ignition[1207]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 12 17:38:48.791451 ignition[1207]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 12 17:38:48.797005 ignition[1207]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 12 17:38:48.797005 ignition[1207]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 12 17:38:48.791822 unknown[1207]: wrote ssh authorized keys file for user: core Dec 12 17:38:48.876533 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Dec 12 17:38:48.876533 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 Dec 12 17:38:48.912951 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 12 17:38:49.045438 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Dec 12 17:38:49.045438 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 12 17:38:49.061179 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" 
Dec 12 17:38:49.061179 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 12 17:38:49.061179 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 12 17:38:49.061179 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 12 17:38:49.061179 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 12 17:38:49.061179 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 12 17:38:49.061179 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 12 17:38:49.114499 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 12 17:38:49.114499 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 12 17:38:49.114499 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Dec 12 17:38:49.114499 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Dec 12 17:38:49.114499 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Dec 12 17:38:49.114499 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1 Dec 12 17:38:49.364920 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 12 17:38:49.603947 ignition[1207]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Dec 12 17:38:49.603947 ignition[1207]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 12 17:38:49.635425 ignition[1207]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 12 17:38:49.648012 ignition[1207]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 12 17:38:49.648012 ignition[1207]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 12 17:38:49.663272 ignition[1207]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Dec 12 17:38:49.663272 ignition[1207]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Dec 12 17:38:49.663272 ignition[1207]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 12 17:38:49.663272 ignition[1207]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 12 17:38:49.663272 ignition[1207]: INFO : files: files passed Dec 12 17:38:49.663272 ignition[1207]: INFO : Ignition finished successfully Dec 12 17:38:49.656761 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 12 17:38:49.669003 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 12 17:38:49.696356 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Dec 12 17:38:49.709549 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 12 17:38:49.709648 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 12 17:38:49.739666 initrd-setup-root-after-ignition[1240]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 12 17:38:49.735806 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 12 17:38:49.766924 initrd-setup-root-after-ignition[1236]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 12 17:38:49.766924 initrd-setup-root-after-ignition[1236]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 12 17:38:49.745461 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 12 17:38:49.756955 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 12 17:38:49.806751 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 12 17:38:49.806848 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Dec 12 17:38:49.816242 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 12 17:38:49.825157 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 12 17:38:49.833587 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 12 17:38:49.834348 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 12 17:38:49.870955 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 12 17:38:49.877350 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 12 17:38:49.901873 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 12 17:38:49.906768 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Dec 12 17:38:49.916047 systemd[1]: Stopped target timers.target - Timer Units. Dec 12 17:38:49.924319 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 12 17:38:49.924428 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 12 17:38:49.936344 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 12 17:38:49.945240 systemd[1]: Stopped target basic.target - Basic System. Dec 12 17:38:49.952794 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 12 17:38:49.961070 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 12 17:38:49.969950 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 12 17:38:49.979000 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 12 17:38:49.987847 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 12 17:38:49.996779 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 12 17:38:50.005898 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 12 17:38:50.015247 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 12 17:38:50.023487 systemd[1]: Stopped target swap.target - Swaps. Dec 12 17:38:50.030567 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 12 17:38:50.030681 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 12 17:38:50.041709 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 12 17:38:50.046274 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 17:38:50.055733 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 12 17:38:50.059732 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 17:38:50.065049 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
Dec 12 17:38:50.065146 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 12 17:38:50.078105 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 12 17:38:50.078196 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 12 17:38:50.083814 systemd[1]: ignition-files.service: Deactivated successfully. Dec 12 17:38:50.083890 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 12 17:38:50.093167 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Dec 12 17:38:50.161697 ignition[1260]: INFO : Ignition 2.22.0 Dec 12 17:38:50.161697 ignition[1260]: INFO : Stage: umount Dec 12 17:38:50.161697 ignition[1260]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 17:38:50.161697 ignition[1260]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Dec 12 17:38:50.161697 ignition[1260]: INFO : umount: umount passed Dec 12 17:38:50.161697 ignition[1260]: INFO : Ignition finished successfully Dec 12 17:38:50.093245 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Dec 12 17:38:50.102978 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 12 17:38:50.118290 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 12 17:38:50.132111 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Dec 12 17:38:50.132311 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 12 17:38:50.146357 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 12 17:38:50.146452 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 12 17:38:50.163448 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 12 17:38:50.163543 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 12 17:38:50.170828 systemd[1]: ignition-mount.service: Deactivated successfully. 
Dec 12 17:38:50.170911 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 12 17:38:50.179833 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 12 17:38:50.181833 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 12 17:38:50.181914 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 12 17:38:50.187838 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 12 17:38:50.187894 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 12 17:38:50.202781 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 12 17:38:50.202842 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 12 17:38:50.210734 systemd[1]: Stopped target network.target - Network. Dec 12 17:38:50.217883 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 12 17:38:50.217945 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 12 17:38:50.227690 systemd[1]: Stopped target paths.target - Path Units. Dec 12 17:38:50.236188 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 12 17:38:50.240242 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 17:38:50.248884 systemd[1]: Stopped target slices.target - Slice Units. Dec 12 17:38:50.258137 systemd[1]: Stopped target sockets.target - Socket Units. Dec 12 17:38:50.266051 systemd[1]: iscsid.socket: Deactivated successfully. Dec 12 17:38:50.266090 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 12 17:38:50.274990 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 12 17:38:50.275017 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 12 17:38:50.283271 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 12 17:38:50.283332 systemd[1]: Stopped ignition-setup.service - Ignition (setup). 
Dec 12 17:38:50.292630 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 12 17:38:50.292666 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 12 17:38:50.301105 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Dec 12 17:38:50.309141 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 12 17:38:50.321690 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 12 17:38:50.321819 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 12 17:38:50.336201 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Dec 12 17:38:50.550017 kernel: hv_netvsc 000d3ac2-b520-000d-3ac2-b520000d3ac2 eth0: Data path switched from VF: enP37525s1 Dec 12 17:38:50.336539 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 12 17:38:50.336642 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 12 17:38:50.348601 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Dec 12 17:38:50.349094 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 12 17:38:50.357160 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 12 17:38:50.357196 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 12 17:38:50.369459 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 12 17:38:50.383442 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 12 17:38:50.383514 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 12 17:38:50.392556 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 12 17:38:50.392603 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 12 17:38:50.403826 systemd[1]: systemd-modules-load.service: Deactivated successfully. 
Dec 12 17:38:50.403865 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 12 17:38:50.408328 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 12 17:38:50.408362 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 17:38:50.421796 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 17:38:50.434192 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 12 17:38:50.434261 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Dec 12 17:38:50.457037 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 12 17:38:50.457181 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 12 17:38:50.462480 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 12 17:38:50.462586 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 17:38:50.512434 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 12 17:38:50.512512 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 12 17:38:50.521229 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 12 17:38:50.521262 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 17:38:50.535116 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 12 17:38:50.535166 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 12 17:38:50.550071 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 12 17:38:50.550124 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 12 17:38:50.558681 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 12 17:38:50.558721 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Dec 12 17:38:50.574370 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 12 17:38:50.574420 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 12 17:38:50.585019 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Dec 12 17:38:50.600187 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 12 17:38:50.600263 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 17:38:50.615823 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 12 17:38:50.615883 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 17:38:50.627987 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Dec 12 17:38:50.628034 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 12 17:38:50.821631 systemd-journald[225]: Received SIGTERM from PID 1 (systemd). Dec 12 17:38:50.637570 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Dec 12 17:38:50.637621 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 17:38:50.643871 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 17:38:50.643908 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:38:50.659906 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Dec 12 17:38:50.659949 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Dec 12 17:38:50.659971 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Dec 12 17:38:50.660005 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Dec 12 17:38:50.660322 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 12 17:38:50.660422 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 12 17:38:50.674411 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 12 17:38:50.674563 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 12 17:38:50.685280 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 12 17:38:50.697061 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 12 17:38:50.722424 systemd[1]: Switching root. Dec 12 17:38:50.889658 systemd-journald[225]: Journal stopped Dec 12 17:38:55.601009 kernel: SELinux: policy capability network_peer_controls=1 Dec 12 17:38:55.601028 kernel: SELinux: policy capability open_perms=1 Dec 12 17:38:55.601035 kernel: SELinux: policy capability extended_socket_class=1 Dec 12 17:38:55.601041 kernel: SELinux: policy capability always_check_network=0 Dec 12 17:38:55.601046 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 12 17:38:55.601052 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 12 17:38:55.601058 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 12 17:38:55.601064 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 12 17:38:55.601069 kernel: SELinux: policy capability userspace_initial_context=0 Dec 12 17:38:55.601074 kernel: audit: type=1403 audit(1765561131.868:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Dec 12 17:38:55.601081 systemd[1]: Successfully loaded SELinux policy in 185.028ms. Dec 12 17:38:55.601089 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.410ms. 
Dec 12 17:38:55.601095 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 12 17:38:55.601101 systemd[1]: Detected virtualization microsoft. Dec 12 17:38:55.601107 systemd[1]: Detected architecture arm64. Dec 12 17:38:55.601113 systemd[1]: Detected first boot. Dec 12 17:38:55.601122 systemd[1]: Hostname set to . Dec 12 17:38:55.601128 systemd[1]: Initializing machine ID from random generator. Dec 12 17:38:55.601134 zram_generator::config[1305]: No configuration found. Dec 12 17:38:55.601140 kernel: NET: Registered PF_VSOCK protocol family Dec 12 17:38:55.601146 systemd[1]: Populated /etc with preset unit settings. Dec 12 17:38:55.601152 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Dec 12 17:38:55.601158 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 12 17:38:55.601165 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 12 17:38:55.601171 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 12 17:38:55.601177 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 12 17:38:55.601183 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 12 17:38:55.601189 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 12 17:38:55.601195 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 12 17:38:55.601201 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 12 17:38:55.601208 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
Dec 12 17:38:55.601215 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 12 17:38:55.601242 systemd[1]: Created slice user.slice - User and Session Slice. Dec 12 17:38:55.601248 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 17:38:55.601255 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 17:38:55.601261 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 12 17:38:55.601267 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 12 17:38:55.601274 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 12 17:38:55.601281 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 12 17:38:55.601287 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Dec 12 17:38:55.601295 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 17:38:55.601301 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 12 17:38:55.601307 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 12 17:38:55.601313 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 12 17:38:55.601320 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 12 17:38:55.601326 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 12 17:38:55.601333 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 17:38:55.601339 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 12 17:38:55.601345 systemd[1]: Reached target slices.target - Slice Units. Dec 12 17:38:55.601351 systemd[1]: Reached target swap.target - Swaps. 
Dec 12 17:38:55.601357 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Dec 12 17:38:55.601363 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 12 17:38:55.601371 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 12 17:38:55.601377 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 12 17:38:55.601383 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 12 17:38:55.601389 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 17:38:55.601396 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 12 17:38:55.601402 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 12 17:38:55.601408 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 12 17:38:55.601416 systemd[1]: Mounting media.mount - External Media Directory... Dec 12 17:38:55.601422 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 12 17:38:55.601428 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 12 17:38:55.601434 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 12 17:38:55.601441 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 12 17:38:55.601447 systemd[1]: Reached target machines.target - Containers. Dec 12 17:38:55.601453 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 12 17:38:55.601459 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:38:55.601466 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Dec 12 17:38:55.601473 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 12 17:38:55.601479 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:38:55.601485 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 12 17:38:55.601491 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 17:38:55.601497 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 12 17:38:55.601503 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 17:38:55.601510 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 12 17:38:55.601516 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 12 17:38:55.601524 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 12 17:38:55.601530 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 12 17:38:55.601536 systemd[1]: Stopped systemd-fsck-usr.service. Dec 12 17:38:55.601543 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:38:55.601549 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 12 17:38:55.601555 kernel: fuse: init (API version 7.41) Dec 12 17:38:55.601561 kernel: loop: module loaded Dec 12 17:38:55.601567 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 12 17:38:55.601574 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 12 17:38:55.601580 kernel: ACPI: bus type drm_connector registered Dec 12 17:38:55.601586 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... 
Dec 12 17:38:55.601606 systemd-journald[1402]: Collecting audit messages is disabled. Dec 12 17:38:55.601623 systemd-journald[1402]: Journal started Dec 12 17:38:55.601638 systemd-journald[1402]: Runtime Journal (/run/log/journal/4c9eab79c9c8458d89cedf62bdbdd32e) is 8M, max 78.3M, 70.3M free. Dec 12 17:38:54.787766 systemd[1]: Queued start job for default target multi-user.target. Dec 12 17:38:54.794711 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Dec 12 17:38:54.795107 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 12 17:38:54.797379 systemd[1]: systemd-journald.service: Consumed 2.466s CPU time. Dec 12 17:38:55.616698 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 12 17:38:55.624667 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 12 17:38:55.635487 systemd[1]: verity-setup.service: Deactivated successfully. Dec 12 17:38:55.635536 systemd[1]: Stopped verity-setup.service. Dec 12 17:38:55.649982 systemd[1]: Started systemd-journald.service - Journal Service. Dec 12 17:38:55.650716 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 12 17:38:55.655432 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 12 17:38:55.660430 systemd[1]: Mounted media.mount - External Media Directory. Dec 12 17:38:55.664955 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 12 17:38:55.669876 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 12 17:38:55.674797 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 12 17:38:55.679501 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 12 17:38:55.684606 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 17:38:55.690345 systemd[1]: modprobe@configfs.service: Deactivated successfully. 
Dec 12 17:38:55.690474 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 12 17:38:55.696085 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 17:38:55.696223 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 17:38:55.701323 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 12 17:38:55.701441 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 12 17:38:55.706059 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 17:38:55.706187 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 17:38:55.711722 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 12 17:38:55.711855 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 12 17:38:55.716586 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 17:38:55.716706 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 17:38:55.721579 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 12 17:38:55.726709 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 17:38:55.732443 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 12 17:38:55.737879 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 12 17:38:55.743706 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 12 17:38:55.757081 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 12 17:38:55.763135 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 12 17:38:55.769811 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
Dec 12 17:38:55.774723 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 12 17:38:55.774749 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 12 17:38:55.779687 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 12 17:38:55.785976 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 12 17:38:55.790347 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:38:55.797893 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 12 17:38:55.803656 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 12 17:38:55.808505 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 12 17:38:55.809867 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 12 17:38:55.814595 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 17:38:55.820968 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 12 17:38:55.827960 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 12 17:38:55.834064 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 12 17:38:55.840547 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 12 17:38:55.846043 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 12 17:38:55.851515 systemd-journald[1402]: Time spent on flushing to /var/log/journal/4c9eab79c9c8458d89cedf62bdbdd32e is 13.004ms for 936 entries. 
Dec 12 17:38:55.851515 systemd-journald[1402]: System Journal (/var/log/journal/4c9eab79c9c8458d89cedf62bdbdd32e) is 8M, max 2.6G, 2.6G free. Dec 12 17:38:55.920468 systemd-journald[1402]: Received client request to flush runtime journal. Dec 12 17:38:55.920527 kernel: loop0: detected capacity change from 0 to 27936 Dec 12 17:38:55.865912 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 12 17:38:55.871807 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 12 17:38:55.877978 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Dec 12 17:38:55.902679 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 12 17:38:55.923053 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 12 17:38:55.957948 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 12 17:38:55.959706 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Dec 12 17:38:55.959742 systemd-tmpfiles[1446]: ACLs are not supported, ignoring. Dec 12 17:38:55.959750 systemd-tmpfiles[1446]: ACLs are not supported, ignoring. Dec 12 17:38:55.965591 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 12 17:38:55.975399 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 12 17:38:56.100559 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 12 17:38:56.106749 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 12 17:38:56.127975 systemd-tmpfiles[1461]: ACLs are not supported, ignoring. Dec 12 17:38:56.128274 systemd-tmpfiles[1461]: ACLs are not supported, ignoring. Dec 12 17:38:56.130878 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Dec 12 17:38:56.279241 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 12 17:38:56.373258 kernel: loop1: detected capacity change from 0 to 100632 Dec 12 17:38:56.395166 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 12 17:38:56.401727 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 17:38:56.431632 systemd-udevd[1467]: Using default interface naming scheme 'v255'. Dec 12 17:38:56.685970 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 17:38:56.696323 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 12 17:38:56.727876 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Dec 12 17:38:56.768406 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 12 17:38:56.826246 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#294 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Dec 12 17:38:56.839294 systemd[1]: Started systemd-userdbd.service - User Database Manager. 
Dec 12 17:38:56.855500 kernel: loop2: detected capacity change from 0 to 207008 Dec 12 17:38:56.855587 kernel: mousedev: PS/2 mouse device common for all mice Dec 12 17:38:56.890274 kernel: hv_vmbus: registering driver hv_balloon Dec 12 17:38:56.890368 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0 Dec 12 17:38:56.895842 kernel: hv_balloon: Memory hot add disabled on ARM64 Dec 12 17:38:56.935268 kernel: hv_vmbus: registering driver hyperv_fb Dec 12 17:38:56.935347 kernel: loop3: detected capacity change from 0 to 119840 Dec 12 17:38:56.956237 kernel: hyperv_fb: Synthvid Version major 3, minor 5 Dec 12 17:38:56.963535 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608 Dec 12 17:38:56.965257 systemd-networkd[1491]: lo: Link UP Dec 12 17:38:56.965264 systemd-networkd[1491]: lo: Gained carrier Dec 12 17:38:56.966379 systemd-networkd[1491]: Enumeration completed Dec 12 17:38:56.966464 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 12 17:38:56.974342 kernel: Console: switching to colour dummy device 80x25 Dec 12 17:38:56.971260 systemd-networkd[1491]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:38:56.971264 systemd-networkd[1491]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 17:38:56.975963 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 12 17:38:56.986551 kernel: Console: switching to colour frame buffer device 128x48 Dec 12 17:38:56.991482 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 12 17:38:57.008479 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:38:57.024431 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Dec 12 17:38:57.025810 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:38:57.036637 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:38:57.052350 kernel: mlx5_core 9295:00:02.0 enP37525s1: Link up Dec 12 17:38:57.053249 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 17:38:57.053470 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:38:57.064351 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 17:38:57.082240 kernel: hv_netvsc 000d3ac2-b520-000d-3ac2-b520000d3ac2 eth0: Data path switched to VF: enP37525s1 Dec 12 17:38:57.082588 systemd-networkd[1491]: enP37525s1: Link UP Dec 12 17:38:57.082952 systemd-networkd[1491]: eth0: Link UP Dec 12 17:38:57.083026 systemd-networkd[1491]: eth0: Gained carrier Dec 12 17:38:57.083081 systemd-networkd[1491]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:38:57.087630 systemd-networkd[1491]: enP37525s1: Gained carrier Dec 12 17:38:57.089723 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 12 17:38:57.102489 systemd-networkd[1491]: eth0: DHCPv4 address 10.200.20.10/24, gateway 10.200.20.1 acquired from 168.63.129.16 Dec 12 17:38:57.126249 kernel: MACsec IEEE 802.1AE Dec 12 17:38:57.127715 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Dec 12 17:38:57.134008 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 12 17:38:57.181148 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. 
Dec 12 17:38:57.338255 kernel: loop4: detected capacity change from 0 to 27936 Dec 12 17:38:57.355238 kernel: loop5: detected capacity change from 0 to 100632 Dec 12 17:38:57.373244 kernel: loop6: detected capacity change from 0 to 207008 Dec 12 17:38:57.390266 kernel: loop7: detected capacity change from 0 to 119840 Dec 12 17:38:57.400507 (sd-merge)[1611]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'. Dec 12 17:38:57.400889 (sd-merge)[1611]: Merged extensions into '/usr'. Dec 12 17:38:57.403367 systemd[1]: Reload requested from client PID 1445 ('systemd-sysext') (unit systemd-sysext.service)... Dec 12 17:38:57.403379 systemd[1]: Reloading... Dec 12 17:38:57.454279 zram_generator::config[1641]: No configuration found. Dec 12 17:38:57.623214 systemd[1]: Reloading finished in 219 ms. Dec 12 17:38:57.649053 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 12 17:38:57.655040 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 17:38:57.671269 systemd[1]: Starting ensure-sysext.service... Dec 12 17:38:57.676249 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 12 17:38:57.692148 systemd[1]: Reload requested from client PID 1699 ('systemctl') (unit ensure-sysext.service)... Dec 12 17:38:57.692160 systemd[1]: Reloading... Dec 12 17:38:57.723688 systemd-tmpfiles[1700]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Dec 12 17:38:57.723710 systemd-tmpfiles[1700]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 12 17:38:57.723917 systemd-tmpfiles[1700]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 12 17:38:57.724054 systemd-tmpfiles[1700]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. 
Dec 12 17:38:57.725940 systemd-tmpfiles[1700]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Dec 12 17:38:57.726196 systemd-tmpfiles[1700]: ACLs are not supported, ignoring. Dec 12 17:38:57.730954 systemd-tmpfiles[1700]: ACLs are not supported, ignoring. Dec 12 17:38:57.739246 zram_generator::config[1728]: No configuration found. Dec 12 17:38:57.755116 systemd-tmpfiles[1700]: Detected autofs mount point /boot during canonicalization of boot. Dec 12 17:38:57.755126 systemd-tmpfiles[1700]: Skipping /boot Dec 12 17:38:57.760933 systemd-tmpfiles[1700]: Detected autofs mount point /boot during canonicalization of boot. Dec 12 17:38:57.761040 systemd-tmpfiles[1700]: Skipping /boot Dec 12 17:38:57.899831 systemd[1]: Reloading finished in 207 ms. Dec 12 17:38:57.910649 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 17:38:57.933118 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 17:38:57.945994 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 12 17:38:57.953424 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 12 17:38:57.961490 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 12 17:38:57.970299 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 12 17:38:57.977141 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:38:57.979583 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:38:57.988150 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 17:38:57.999424 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Dec 12 17:38:58.004756 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:38:58.004854 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:38:58.007909 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 17:38:58.009261 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 17:38:58.015106 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 17:38:58.015272 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 17:38:58.021267 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 17:38:58.021395 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 17:38:58.031021 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 17:38:58.034488 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:38:58.046275 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 17:38:58.054583 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 17:38:58.060748 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:38:58.062416 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:38:58.064363 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. 
Dec 12 17:38:58.075714 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 17:38:58.075945 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 17:38:58.082606 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 17:38:58.083011 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 17:38:58.084554 systemd-resolved[1791]: Positive Trust Anchors: Dec 12 17:38:58.084801 systemd-resolved[1791]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 12 17:38:58.084825 systemd-resolved[1791]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 12 17:38:58.088022 systemd-resolved[1791]: Using system hostname 'ci-4459.2.2-a-260bc0236d'. Dec 12 17:38:58.089243 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 12 17:38:58.095616 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 12 17:38:58.100869 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 17:38:58.102257 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 17:38:58.113660 systemd[1]: Reached target network.target - Network. Dec 12 17:38:58.117589 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 12 17:38:58.122976 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. 
Dec 12 17:38:58.124126 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 17:38:58.135434 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 12 17:38:58.149422 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 17:38:58.159064 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 17:38:58.163822 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 17:38:58.163920 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 17:38:58.164015 systemd[1]: Reached target time-set.target - System Time Set. Dec 12 17:38:58.170250 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 17:38:58.170398 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 17:38:58.171249 augenrules[1831]: No rules Dec 12 17:38:58.175538 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 17:38:58.177668 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 17:38:58.182622 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 12 17:38:58.182748 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 12 17:38:58.187774 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 17:38:58.187898 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 17:38:58.193882 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 17:38:58.194022 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 17:38:58.202432 systemd[1]: Finished ensure-sysext.service. 
Dec 12 17:38:58.207827 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 12 17:38:58.207888 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 17:38:58.502331 systemd-networkd[1491]: eth0: Gained IPv6LL Dec 12 17:38:58.504289 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 12 17:38:58.510146 systemd[1]: Reached target network-online.target - Network is Online. Dec 12 17:38:58.782536 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 12 17:38:58.788285 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Dec 12 17:39:01.713032 ldconfig[1439]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 12 17:39:01.724898 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 12 17:39:01.731588 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 12 17:39:01.743483 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 12 17:39:01.748983 systemd[1]: Reached target sysinit.target - System Initialization. Dec 12 17:39:01.753575 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 12 17:39:01.758433 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Dec 12 17:39:01.763797 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 12 17:39:01.768249 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. 
Dec 12 17:39:01.773418 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Dec 12 17:39:01.779543 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 12 17:39:01.779572 systemd[1]: Reached target paths.target - Path Units. Dec 12 17:39:01.783758 systemd[1]: Reached target timers.target - Timer Units. Dec 12 17:39:01.803583 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 12 17:39:01.809843 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 12 17:39:01.815335 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 12 17:39:01.820646 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 12 17:39:01.826249 systemd[1]: Reached target ssh-access.target - SSH Access Available. Dec 12 17:39:01.832310 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 12 17:39:01.836762 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 12 17:39:01.842854 systemd[1]: Listening on docker.socket - Docker Socket for the API. Dec 12 17:39:01.848179 systemd[1]: Reached target sockets.target - Socket Units. Dec 12 17:39:01.852965 systemd[1]: Reached target basic.target - Basic System. Dec 12 17:39:01.856967 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Dec 12 17:39:01.857071 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 12 17:39:01.859241 systemd[1]: Starting chronyd.service - NTP client/server... Dec 12 17:39:01.878024 systemd[1]: Starting containerd.service - containerd container runtime... Dec 12 17:39:01.886393 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... 
Dec 12 17:39:01.900359 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Dec 12 17:39:01.906354 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 12 17:39:01.920102 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 12 17:39:01.926442 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Dec 12 17:39:01.927480 chronyd[1849]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Dec 12 17:39:01.931127 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 12 17:39:01.934355 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon. Dec 12 17:39:01.939572 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss). Dec 12 17:39:01.940742 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:39:01.941175 KVP[1859]: KVP starting; pid is:1859 Dec 12 17:39:01.954715 kernel: hv_utils: KVP IC version 4.0 Dec 12 17:39:01.954763 jq[1857]: false Dec 12 17:39:01.950619 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 12 17:39:01.949709 KVP[1859]: KVP LIC Version: 3.1 Dec 12 17:39:01.951450 chronyd[1849]: Timezone right/UTC failed leap second check, ignoring Dec 12 17:39:01.954886 chronyd[1849]: Loaded seccomp filter (level 2) Dec 12 17:39:01.956371 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 12 17:39:01.966095 extend-filesystems[1858]: Found /dev/sda6 Dec 12 17:39:01.969269 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Dec 12 17:39:01.976604 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
Dec 12 17:39:01.985049 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 12 17:39:01.990003 extend-filesystems[1858]: Found /dev/sda9 Dec 12 17:39:01.993412 extend-filesystems[1858]: Checking size of /dev/sda9 Dec 12 17:39:01.998340 systemd[1]: Starting systemd-logind.service - User Login Management... Dec 12 17:39:02.003758 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 12 17:39:02.004173 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 12 17:39:02.005856 systemd[1]: Starting update-engine.service - Update Engine... Dec 12 17:39:02.012659 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 12 17:39:02.018481 systemd[1]: Started chronyd.service - NTP client/server. Dec 12 17:39:02.026017 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 12 17:39:02.030549 jq[1882]: true Dec 12 17:39:02.033893 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 12 17:39:02.034403 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 12 17:39:02.036747 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 12 17:39:02.036906 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Dec 12 17:39:02.049594 extend-filesystems[1858]: Old size kept for /dev/sda9 Dec 12 17:39:02.058131 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 12 17:39:02.058323 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 12 17:39:02.065955 systemd[1]: motdgen.service: Deactivated successfully. Dec 12 17:39:02.066244 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Dec 12 17:39:02.070708 jq[1892]: true Dec 12 17:39:02.073407 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 12 17:39:02.107642 (ntainerd)[1898]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Dec 12 17:39:02.134109 update_engine[1879]: I20251212 17:39:02.134027 1879 main.cc:92] Flatcar Update Engine starting Dec 12 17:39:02.153560 systemd-logind[1875]: New seat seat0. Dec 12 17:39:02.158373 systemd-logind[1875]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Dec 12 17:39:02.158549 systemd[1]: Started systemd-logind.service - User Login Management. Dec 12 17:39:02.189543 tar[1888]: linux-arm64/LICENSE Dec 12 17:39:02.193150 tar[1888]: linux-arm64/helm Dec 12 17:39:02.215678 bash[1933]: Updated "/home/core/.ssh/authorized_keys" Dec 12 17:39:02.218377 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 12 17:39:02.228545 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Dec 12 17:39:02.366519 dbus-daemon[1852]: [system] SELinux support is enabled Dec 12 17:39:02.366915 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 12 17:39:02.376232 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 12 17:39:02.376261 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 12 17:39:02.385992 update_engine[1879]: I20251212 17:39:02.384001 1879 update_check_scheduler.cc:74] Next update check in 9m25s Dec 12 17:39:02.385917 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). 
Dec 12 17:39:02.385932 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 12 17:39:02.395129 dbus-daemon[1852]: [system] Successfully activated service 'org.freedesktop.systemd1' Dec 12 17:39:02.395321 systemd[1]: Started update-engine.service - Update Engine. Dec 12 17:39:02.406079 systemd[1]: Started locksmithd.service - Cluster reboot manager. Dec 12 17:39:02.448276 coreos-metadata[1851]: Dec 12 17:39:02.448 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Dec 12 17:39:02.452518 coreos-metadata[1851]: Dec 12 17:39:02.452 INFO Fetch successful Dec 12 17:39:02.452518 coreos-metadata[1851]: Dec 12 17:39:02.452 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 Dec 12 17:39:02.458120 coreos-metadata[1851]: Dec 12 17:39:02.458 INFO Fetch successful Dec 12 17:39:02.458533 coreos-metadata[1851]: Dec 12 17:39:02.458 INFO Fetching http://168.63.129.16/machine/e4e56173-ae60-41f9-bf77-937668a35480/1785fc4d%2D133f%2D47cc%2Db07a%2D887dd3b9d4cd.%5Fci%2D4459.2.2%2Da%2D260bc0236d?comp=config&type=sharedConfig&incarnation=1: Attempt #1 Dec 12 17:39:02.495651 sshd_keygen[1880]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 12 17:39:02.500998 coreos-metadata[1851]: Dec 12 17:39:02.500 INFO Fetch successful Dec 12 17:39:02.500998 coreos-metadata[1851]: Dec 12 17:39:02.500 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 Dec 12 17:39:02.511328 coreos-metadata[1851]: Dec 12 17:39:02.511 INFO Fetch successful Dec 12 17:39:02.523264 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 12 17:39:02.536097 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 12 17:39:02.542341 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... Dec 12 17:39:02.556778 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. 
Dec 12 17:39:02.566055 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 12 17:39:02.571123 systemd[1]: issuegen.service: Deactivated successfully. Dec 12 17:39:02.576103 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 12 17:39:02.584839 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 12 17:39:02.593557 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. Dec 12 17:39:02.619926 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 12 17:39:02.629833 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 12 17:39:02.636013 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Dec 12 17:39:02.642552 systemd[1]: Reached target getty.target - Login Prompts. Dec 12 17:39:02.672330 locksmithd[1997]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 12 17:39:02.676582 tar[1888]: linux-arm64/README.md Dec 12 17:39:02.687003 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 12 17:39:02.906098 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:39:02.912669 (kubelet)[2047]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:39:02.987708 containerd[1898]: time="2025-12-12T17:39:02Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 12 17:39:02.988650 containerd[1898]: time="2025-12-12T17:39:02.988611860Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Dec 12 17:39:02.996279 containerd[1898]: time="2025-12-12T17:39:02.996249556Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.648µs" Dec 12 17:39:02.997176 containerd[1898]: time="2025-12-12T17:39:02.996478324Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 12 17:39:02.997176 containerd[1898]: time="2025-12-12T17:39:02.996940756Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 12 17:39:02.997176 containerd[1898]: time="2025-12-12T17:39:02.997104100Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 12 17:39:02.997176 containerd[1898]: time="2025-12-12T17:39:02.997120164Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 12 17:39:02.997176 containerd[1898]: time="2025-12-12T17:39:02.997140420Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 12 17:39:02.997320 containerd[1898]: time="2025-12-12T17:39:02.997183556Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 12 17:39:02.997320 
containerd[1898]: time="2025-12-12T17:39:02.997191084Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 12 17:39:02.997424 containerd[1898]: time="2025-12-12T17:39:02.997396508Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 12 17:39:02.997424 containerd[1898]: time="2025-12-12T17:39:02.997417852Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 12 17:39:02.997457 containerd[1898]: time="2025-12-12T17:39:02.997426404Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 12 17:39:02.997457 containerd[1898]: time="2025-12-12T17:39:02.997433452Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 12 17:39:02.997516 containerd[1898]: time="2025-12-12T17:39:02.997502980Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 12 17:39:02.997680 containerd[1898]: time="2025-12-12T17:39:02.997665004Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 12 17:39:02.997703 containerd[1898]: time="2025-12-12T17:39:02.997690716Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 12 17:39:02.997703 containerd[1898]: time="2025-12-12T17:39:02.997700228Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 12 17:39:02.997733 containerd[1898]: 
time="2025-12-12T17:39:02.997726988Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 12 17:39:02.997891 containerd[1898]: time="2025-12-12T17:39:02.997875724Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 12 17:39:02.997949 containerd[1898]: time="2025-12-12T17:39:02.997933772Z" level=info msg="metadata content store policy set" policy=shared Dec 12 17:39:03.013344 containerd[1898]: time="2025-12-12T17:39:03.013299556Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 12 17:39:03.013475 containerd[1898]: time="2025-12-12T17:39:03.013370820Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 12 17:39:03.013475 containerd[1898]: time="2025-12-12T17:39:03.013381820Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 12 17:39:03.013475 containerd[1898]: time="2025-12-12T17:39:03.013425652Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 12 17:39:03.013475 containerd[1898]: time="2025-12-12T17:39:03.013435132Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 12 17:39:03.013475 containerd[1898]: time="2025-12-12T17:39:03.013443380Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 12 17:39:03.013475 containerd[1898]: time="2025-12-12T17:39:03.013455108Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 12 17:39:03.013475 containerd[1898]: time="2025-12-12T17:39:03.013469068Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 12 17:39:03.013475 containerd[1898]: time="2025-12-12T17:39:03.013477172Z" 
level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 12 17:39:03.013580 containerd[1898]: time="2025-12-12T17:39:03.013483844Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 12 17:39:03.013580 containerd[1898]: time="2025-12-12T17:39:03.013491412Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 12 17:39:03.013580 containerd[1898]: time="2025-12-12T17:39:03.013506748Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 12 17:39:03.013671 containerd[1898]: time="2025-12-12T17:39:03.013651780Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 12 17:39:03.013686 containerd[1898]: time="2025-12-12T17:39:03.013671052Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 12 17:39:03.013686 containerd[1898]: time="2025-12-12T17:39:03.013681204Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 12 17:39:03.013708 containerd[1898]: time="2025-12-12T17:39:03.013689540Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 12 17:39:03.013708 containerd[1898]: time="2025-12-12T17:39:03.013697292Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 12 17:39:03.013708 containerd[1898]: time="2025-12-12T17:39:03.013705132Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 12 17:39:03.013747 containerd[1898]: time="2025-12-12T17:39:03.013712036Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 12 17:39:03.013747 containerd[1898]: time="2025-12-12T17:39:03.013719044Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases 
type=io.containerd.grpc.v1 Dec 12 17:39:03.013747 containerd[1898]: time="2025-12-12T17:39:03.013728900Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 12 17:39:03.013747 containerd[1898]: time="2025-12-12T17:39:03.013736652Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 12 17:39:03.013747 containerd[1898]: time="2025-12-12T17:39:03.013743612Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 12 17:39:03.013802 containerd[1898]: time="2025-12-12T17:39:03.013786732Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 12 17:39:03.013802 containerd[1898]: time="2025-12-12T17:39:03.013796868Z" level=info msg="Start snapshots syncer" Dec 12 17:39:03.013837 containerd[1898]: time="2025-12-12T17:39:03.013817924Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 12 17:39:03.014036 containerd[1898]: time="2025-12-12T17:39:03.014004100Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 12 17:39:03.014130 containerd[1898]: time="2025-12-12T17:39:03.014050444Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 12 17:39:03.014130 containerd[1898]: time="2025-12-12T17:39:03.014087284Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 12 17:39:03.014205 containerd[1898]: time="2025-12-12T17:39:03.014186388Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 12 17:39:03.014295 containerd[1898]: time="2025-12-12T17:39:03.014207836Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 12 17:39:03.014295 containerd[1898]: time="2025-12-12T17:39:03.014236108Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 12 17:39:03.014295 containerd[1898]: time="2025-12-12T17:39:03.014244460Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 12 17:39:03.014295 containerd[1898]: time="2025-12-12T17:39:03.014252412Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 12 17:39:03.014295 containerd[1898]: time="2025-12-12T17:39:03.014259076Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 12 17:39:03.014295 containerd[1898]: time="2025-12-12T17:39:03.014266044Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 12 17:39:03.014295 containerd[1898]: time="2025-12-12T17:39:03.014284492Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 12 17:39:03.014295 containerd[1898]: time="2025-12-12T17:39:03.014292564Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 12 17:39:03.014457 containerd[1898]: time="2025-12-12T17:39:03.014313908Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 12 17:39:03.014457 containerd[1898]: time="2025-12-12T17:39:03.014336660Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 12 17:39:03.014457 containerd[1898]: time="2025-12-12T17:39:03.014346628Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 12 17:39:03.014457 containerd[1898]: time="2025-12-12T17:39:03.014351972Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 12 17:39:03.014457 containerd[1898]: time="2025-12-12T17:39:03.014357460Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 12 17:39:03.014457 containerd[1898]: time="2025-12-12T17:39:03.014361692Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 12 17:39:03.014457 containerd[1898]: time="2025-12-12T17:39:03.014367332Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 12 17:39:03.014457 containerd[1898]: time="2025-12-12T17:39:03.014374716Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 12 17:39:03.014457 containerd[1898]: time="2025-12-12T17:39:03.014387060Z" level=info msg="runtime interface created" Dec 12 17:39:03.014457 containerd[1898]: time="2025-12-12T17:39:03.014390348Z" level=info msg="created NRI interface" Dec 12 17:39:03.014457 containerd[1898]: time="2025-12-12T17:39:03.014395188Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 12 17:39:03.014457 containerd[1898]: time="2025-12-12T17:39:03.014403212Z" level=info msg="Connect containerd service" Dec 12 17:39:03.014457 containerd[1898]: time="2025-12-12T17:39:03.014417980Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 12 17:39:03.014997 
containerd[1898]: time="2025-12-12T17:39:03.014972068Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 12 17:39:03.255949 kubelet[2047]: E1212 17:39:03.255826 2047 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:39:03.258131 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:39:03.258395 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:39:03.258747 systemd[1]: kubelet.service: Consumed 546ms CPU time, 253.4M memory peak. Dec 12 17:39:03.314161 containerd[1898]: time="2025-12-12T17:39:03.314097436Z" level=info msg="Start subscribing containerd event" Dec 12 17:39:03.314385 containerd[1898]: time="2025-12-12T17:39:03.314233532Z" level=info msg="Start recovering state" Dec 12 17:39:03.314385 containerd[1898]: time="2025-12-12T17:39:03.314277388Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 12 17:39:03.314385 containerd[1898]: time="2025-12-12T17:39:03.314320988Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Dec 12 17:39:03.314596 containerd[1898]: time="2025-12-12T17:39:03.314494836Z" level=info msg="Start event monitor" Dec 12 17:39:03.314596 containerd[1898]: time="2025-12-12T17:39:03.314528876Z" level=info msg="Start cni network conf syncer for default" Dec 12 17:39:03.314596 containerd[1898]: time="2025-12-12T17:39:03.314534844Z" level=info msg="Start streaming server" Dec 12 17:39:03.314596 containerd[1898]: time="2025-12-12T17:39:03.314542076Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 12 17:39:03.314596 containerd[1898]: time="2025-12-12T17:39:03.314549196Z" level=info msg="runtime interface starting up..." Dec 12 17:39:03.314596 containerd[1898]: time="2025-12-12T17:39:03.314553980Z" level=info msg="starting plugins..." Dec 12 17:39:03.314596 containerd[1898]: time="2025-12-12T17:39:03.314566652Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 12 17:39:03.314869 containerd[1898]: time="2025-12-12T17:39:03.314813804Z" level=info msg="containerd successfully booted in 0.327438s" Dec 12 17:39:03.315082 systemd[1]: Started containerd.service - containerd container runtime. Dec 12 17:39:03.320554 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 12 17:39:03.328663 systemd[1]: Startup finished in 1.629s (kernel) + 11.617s (initrd) + 11.643s (userspace) = 24.889s. Dec 12 17:39:03.683979 login[2027]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying Dec 12 17:39:03.685709 login[2028]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:39:03.692029 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 12 17:39:03.694412 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 12 17:39:03.700468 systemd-logind[1875]: New session 1 of user core. 
Dec 12 17:39:03.711351 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 12 17:39:03.713766 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 12 17:39:03.725203 (systemd)[2074]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 12 17:39:03.727102 systemd-logind[1875]: New session c1 of user core. Dec 12 17:39:03.848840 systemd[2074]: Queued start job for default target default.target. Dec 12 17:39:03.857956 systemd[2074]: Created slice app.slice - User Application Slice. Dec 12 17:39:03.857984 systemd[2074]: Reached target paths.target - Paths. Dec 12 17:39:03.858015 systemd[2074]: Reached target timers.target - Timers. Dec 12 17:39:03.859112 systemd[2074]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 12 17:39:03.866919 systemd[2074]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 12 17:39:03.867086 systemd[2074]: Reached target sockets.target - Sockets. Dec 12 17:39:03.867201 systemd[2074]: Reached target basic.target - Basic System. Dec 12 17:39:03.867325 systemd[2074]: Reached target default.target - Main User Target. Dec 12 17:39:03.867402 systemd[2074]: Startup finished in 135ms. Dec 12 17:39:03.868142 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 12 17:39:03.876381 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 12 17:39:04.684349 login[2027]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:39:04.689690 systemd-logind[1875]: New session 2 of user core. Dec 12 17:39:04.693369 systemd[1]: Started session-2.scope - Session 2 of User core. 
Dec 12 17:39:04.710934 waagent[2024]: 2025-12-12T17:39:04.710861Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Dec 12 17:39:04.717236 waagent[2024]: 2025-12-12T17:39:04.715596Z INFO Daemon Daemon OS: flatcar 4459.2.2 Dec 12 17:39:04.720264 waagent[2024]: 2025-12-12T17:39:04.720197Z INFO Daemon Daemon Python: 3.11.13 Dec 12 17:39:04.724250 waagent[2024]: 2025-12-12T17:39:04.724192Z INFO Daemon Daemon Run daemon Dec 12 17:39:04.728350 waagent[2024]: 2025-12-12T17:39:04.728311Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4459.2.2' Dec 12 17:39:04.735210 waagent[2024]: 2025-12-12T17:39:04.735176Z INFO Daemon Daemon Using waagent for provisioning Dec 12 17:39:04.739932 waagent[2024]: 2025-12-12T17:39:04.739890Z INFO Daemon Daemon Activate resource disk Dec 12 17:39:04.743570 waagent[2024]: 2025-12-12T17:39:04.743539Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Dec 12 17:39:04.752321 waagent[2024]: 2025-12-12T17:39:04.752280Z INFO Daemon Daemon Found device: None Dec 12 17:39:04.755879 waagent[2024]: 2025-12-12T17:39:04.755849Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Dec 12 17:39:04.762336 waagent[2024]: 2025-12-12T17:39:04.762309Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Dec 12 17:39:04.772156 waagent[2024]: 2025-12-12T17:39:04.772116Z INFO Daemon Daemon Clean protocol and wireserver endpoint Dec 12 17:39:04.776445 waagent[2024]: 2025-12-12T17:39:04.776414Z INFO Daemon Daemon Running default provisioning handler Dec 12 17:39:04.785441 waagent[2024]: 2025-12-12T17:39:04.785388Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. 
Dec 12 17:39:04.796151 waagent[2024]: 2025-12-12T17:39:04.796109Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Dec 12 17:39:04.803487 waagent[2024]: 2025-12-12T17:39:04.803452Z INFO Daemon Daemon cloud-init is enabled: False Dec 12 17:39:04.807428 waagent[2024]: 2025-12-12T17:39:04.807399Z INFO Daemon Daemon Copying ovf-env.xml Dec 12 17:39:04.858501 waagent[2024]: 2025-12-12T17:39:04.858430Z INFO Daemon Daemon Successfully mounted dvd Dec 12 17:39:04.891470 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Dec 12 17:39:04.893746 waagent[2024]: 2025-12-12T17:39:04.893681Z INFO Daemon Daemon Detect protocol endpoint Dec 12 17:39:04.897559 waagent[2024]: 2025-12-12T17:39:04.897519Z INFO Daemon Daemon Clean protocol and wireserver endpoint Dec 12 17:39:04.901832 waagent[2024]: 2025-12-12T17:39:04.901798Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler Dec 12 17:39:04.906527 waagent[2024]: 2025-12-12T17:39:04.906501Z INFO Daemon Daemon Test for route to 168.63.129.16 Dec 12 17:39:04.910603 waagent[2024]: 2025-12-12T17:39:04.910572Z INFO Daemon Daemon Route to 168.63.129.16 exists Dec 12 17:39:04.914422 waagent[2024]: 2025-12-12T17:39:04.914396Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Dec 12 17:39:04.959723 waagent[2024]: 2025-12-12T17:39:04.959649Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Dec 12 17:39:04.964851 waagent[2024]: 2025-12-12T17:39:04.964827Z INFO Daemon Daemon Wire protocol version:2012-11-30 Dec 12 17:39:04.968859 waagent[2024]: 2025-12-12T17:39:04.968828Z INFO Daemon Daemon Server preferred version:2015-04-05 Dec 12 17:39:05.064614 waagent[2024]: 2025-12-12T17:39:05.064537Z INFO Daemon Daemon Initializing goal state during protocol detection Dec 12 17:39:05.069759 waagent[2024]: 2025-12-12T17:39:05.069696Z INFO Daemon Daemon Forcing an update of the goal state. 
Dec 12 17:39:05.077359 waagent[2024]: 2025-12-12T17:39:05.077319Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Dec 12 17:39:05.096534 waagent[2024]: 2025-12-12T17:39:05.096493Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.177 Dec 12 17:39:05.101138 waagent[2024]: 2025-12-12T17:39:05.101100Z INFO Daemon Dec 12 17:39:05.103328 waagent[2024]: 2025-12-12T17:39:05.103300Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 8e04f142-8adf-47df-a7e8-540bdf171434 eTag: 9708546817582386382 source: Fabric] Dec 12 17:39:05.112044 waagent[2024]: 2025-12-12T17:39:05.112013Z INFO Daemon The vmSettings originated via Fabric; will ignore them. Dec 12 17:39:05.117221 waagent[2024]: 2025-12-12T17:39:05.117192Z INFO Daemon Dec 12 17:39:05.119434 waagent[2024]: 2025-12-12T17:39:05.119406Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Dec 12 17:39:05.128816 waagent[2024]: 2025-12-12T17:39:05.128787Z INFO Daemon Daemon Downloading artifacts profile blob Dec 12 17:39:05.250069 waagent[2024]: 2025-12-12T17:39:05.249948Z INFO Daemon Downloaded certificate {'thumbprint': '9DCFA955A4FCA51FD37DBF67EFCDF13B110E7E5E', 'hasPrivateKey': True} Dec 12 17:39:05.257595 waagent[2024]: 2025-12-12T17:39:05.257557Z INFO Daemon Fetch goal state completed Dec 12 17:39:05.267780 waagent[2024]: 2025-12-12T17:39:05.267749Z INFO Daemon Daemon Starting provisioning Dec 12 17:39:05.271737 waagent[2024]: 2025-12-12T17:39:05.271703Z INFO Daemon Daemon Handle ovf-env.xml. 
Dec 12 17:39:05.275407 waagent[2024]: 2025-12-12T17:39:05.275382Z INFO Daemon Daemon Set hostname [ci-4459.2.2-a-260bc0236d] Dec 12 17:39:05.297149 waagent[2024]: 2025-12-12T17:39:05.297101Z INFO Daemon Daemon Publish hostname [ci-4459.2.2-a-260bc0236d] Dec 12 17:39:05.301827 waagent[2024]: 2025-12-12T17:39:05.301788Z INFO Daemon Daemon Examine /proc/net/route for primary interface Dec 12 17:39:05.306765 waagent[2024]: 2025-12-12T17:39:05.306731Z INFO Daemon Daemon Primary interface is [eth0] Dec 12 17:39:05.316437 systemd-networkd[1491]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 17:39:05.316443 systemd-networkd[1491]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 17:39:05.316494 systemd-networkd[1491]: eth0: DHCP lease lost Dec 12 17:39:05.317392 waagent[2024]: 2025-12-12T17:39:05.317340Z INFO Daemon Daemon Create user account if not exists Dec 12 17:39:05.321489 waagent[2024]: 2025-12-12T17:39:05.321456Z INFO Daemon Daemon User core already exists, skip useradd Dec 12 17:39:05.325743 waagent[2024]: 2025-12-12T17:39:05.325701Z INFO Daemon Daemon Configure sudoer Dec 12 17:39:05.334436 waagent[2024]: 2025-12-12T17:39:05.334390Z INFO Daemon Daemon Configure sshd Dec 12 17:39:05.340970 waagent[2024]: 2025-12-12T17:39:05.340926Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Dec 12 17:39:05.350543 waagent[2024]: 2025-12-12T17:39:05.350507Z INFO Daemon Daemon Deploy ssh public key. 
Dec 12 17:39:05.351272 systemd-networkd[1491]: eth0: DHCPv4 address 10.200.20.10/24, gateway 10.200.20.1 acquired from 168.63.129.16 Dec 12 17:39:06.452652 waagent[2024]: 2025-12-12T17:39:06.452604Z INFO Daemon Daemon Provisioning complete Dec 12 17:39:06.466830 waagent[2024]: 2025-12-12T17:39:06.466789Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Dec 12 17:39:06.472196 waagent[2024]: 2025-12-12T17:39:06.472160Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. Dec 12 17:39:06.480137 waagent[2024]: 2025-12-12T17:39:06.480105Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Dec 12 17:39:06.578258 waagent[2124]: 2025-12-12T17:39:06.577726Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Dec 12 17:39:06.578258 waagent[2124]: 2025-12-12T17:39:06.577846Z INFO ExtHandler ExtHandler OS: flatcar 4459.2.2 Dec 12 17:39:06.578258 waagent[2124]: 2025-12-12T17:39:06.577884Z INFO ExtHandler ExtHandler Python: 3.11.13 Dec 12 17:39:06.578258 waagent[2124]: 2025-12-12T17:39:06.577918Z INFO ExtHandler ExtHandler CPU Arch: aarch64 Dec 12 17:39:06.616001 waagent[2124]: 2025-12-12T17:39:06.615953Z INFO ExtHandler ExtHandler Distro: flatcar-4459.2.2; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Dec 12 17:39:06.616292 waagent[2124]: 2025-12-12T17:39:06.616260Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Dec 12 17:39:06.616402 waagent[2124]: 2025-12-12T17:39:06.616381Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Dec 12 17:39:06.621999 waagent[2124]: 2025-12-12T17:39:06.621953Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Dec 12 17:39:06.628258 waagent[2124]: 2025-12-12T17:39:06.627477Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.177 Dec 12 
17:39:06.628258 waagent[2124]: 2025-12-12T17:39:06.627829Z INFO ExtHandler Dec 12 17:39:06.628258 waagent[2124]: 2025-12-12T17:39:06.627882Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 481fe526-d7ef-4de5-856c-24c2f2205726 eTag: 9708546817582386382 source: Fabric] Dec 12 17:39:06.628258 waagent[2124]: 2025-12-12T17:39:06.628081Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Dec 12 17:39:06.628555 waagent[2124]: 2025-12-12T17:39:06.628519Z INFO ExtHandler Dec 12 17:39:06.628587 waagent[2124]: 2025-12-12T17:39:06.628572Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Dec 12 17:39:06.631706 waagent[2124]: 2025-12-12T17:39:06.631678Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Dec 12 17:39:06.683032 waagent[2124]: 2025-12-12T17:39:06.682973Z INFO ExtHandler Downloaded certificate {'thumbprint': '9DCFA955A4FCA51FD37DBF67EFCDF13B110E7E5E', 'hasPrivateKey': True} Dec 12 17:39:06.683403 waagent[2124]: 2025-12-12T17:39:06.683370Z INFO ExtHandler Fetch goal state completed Dec 12 17:39:06.695471 waagent[2124]: 2025-12-12T17:39:06.695422Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.2 1 Jul 2025 (Library: OpenSSL 3.4.2 1 Jul 2025) Dec 12 17:39:06.698665 waagent[2124]: 2025-12-12T17:39:06.698622Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2124 Dec 12 17:39:06.698764 waagent[2124]: 2025-12-12T17:39:06.698738Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Dec 12 17:39:06.698996 waagent[2124]: 2025-12-12T17:39:06.698968Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Dec 12 17:39:06.700049 waagent[2124]: 2025-12-12T17:39:06.700013Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4459.2.2', '', 'Flatcar Container Linux by Kinvolk'] Dec 12 17:39:06.700395 waagent[2124]: 
2025-12-12T17:39:06.700364Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4459.2.2', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Dec 12 17:39:06.700506 waagent[2124]: 2025-12-12T17:39:06.700484Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Dec 12 17:39:06.700916 waagent[2124]: 2025-12-12T17:39:06.700887Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Dec 12 17:39:06.742489 waagent[2124]: 2025-12-12T17:39:06.742402Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Dec 12 17:39:06.742594 waagent[2124]: 2025-12-12T17:39:06.742565Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Dec 12 17:39:06.746935 waagent[2124]: 2025-12-12T17:39:06.746909Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Dec 12 17:39:06.751809 systemd[1]: Reload requested from client PID 2139 ('systemctl') (unit waagent.service)... Dec 12 17:39:06.751823 systemd[1]: Reloading... Dec 12 17:39:06.816412 zram_generator::config[2178]: No configuration found. Dec 12 17:39:06.968150 systemd[1]: Reloading finished in 216 ms. Dec 12 17:39:06.985511 waagent[2124]: 2025-12-12T17:39:06.985417Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Dec 12 17:39:06.985602 waagent[2124]: 2025-12-12T17:39:06.985554Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Dec 12 17:39:07.657393 waagent[2124]: 2025-12-12T17:39:07.657320Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. Dec 12 17:39:07.657657 waagent[2124]: 2025-12-12T17:39:07.657623Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. 
configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Dec 12 17:39:07.658267 waagent[2124]: 2025-12-12T17:39:07.658224Z INFO ExtHandler ExtHandler Starting env monitor service. Dec 12 17:39:07.658579 waagent[2124]: 2025-12-12T17:39:07.658507Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Dec 12 17:39:07.659293 waagent[2124]: 2025-12-12T17:39:07.658739Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Dec 12 17:39:07.659293 waagent[2124]: 2025-12-12T17:39:07.658820Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Dec 12 17:39:07.659293 waagent[2124]: 2025-12-12T17:39:07.658983Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Dec 12 17:39:07.659293 waagent[2124]: 2025-12-12T17:39:07.659109Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Dec 12 17:39:07.659293 waagent[2124]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Dec 12 17:39:07.659293 waagent[2124]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0 Dec 12 17:39:07.659293 waagent[2124]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Dec 12 17:39:07.659293 waagent[2124]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Dec 12 17:39:07.659293 waagent[2124]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Dec 12 17:39:07.659293 waagent[2124]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Dec 12 17:39:07.659668 waagent[2124]: 2025-12-12T17:39:07.659568Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Dec 12 17:39:07.659752 waagent[2124]: 2025-12-12T17:39:07.659726Z INFO ExtHandler ExtHandler Start Extension Telemetry service. 
Dec 12 17:39:07.660016 waagent[2124]: 2025-12-12T17:39:07.659987Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Dec 12 17:39:07.660102 waagent[2124]: 2025-12-12T17:39:07.660065Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Dec 12 17:39:07.660102 waagent[2124]: 2025-12-12T17:39:07.660130Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. Dec 12 17:39:07.660279 waagent[2124]: 2025-12-12T17:39:07.660251Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Dec 12 17:39:07.661091 waagent[2124]: 2025-12-12T17:39:07.661066Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Dec 12 17:39:07.661604 waagent[2124]: 2025-12-12T17:39:07.661566Z INFO EnvHandler ExtHandler Configure routes Dec 12 17:39:07.662109 waagent[2124]: 2025-12-12T17:39:07.662065Z INFO EnvHandler ExtHandler Gateway:None Dec 12 17:39:07.662355 waagent[2124]: 2025-12-12T17:39:07.662321Z INFO EnvHandler ExtHandler Routes:None Dec 12 17:39:07.666270 waagent[2124]: 2025-12-12T17:39:07.666185Z INFO ExtHandler ExtHandler Dec 12 17:39:07.667105 waagent[2124]: 2025-12-12T17:39:07.667069Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 88792abc-6d78-443a-8236-7f17f4d12097 correlation 29b80827-28e7-40f3-ac17-11e42401e521 created: 2025-12-12T17:38:02.674866Z] Dec 12 17:39:07.667490 waagent[2124]: 2025-12-12T17:39:07.667449Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Dec 12 17:39:07.667960 waagent[2124]: 2025-12-12T17:39:07.667933Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 1 ms] Dec 12 17:39:07.689553 waagent[2124]: 2025-12-12T17:39:07.689504Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Dec 12 17:39:07.689553 waagent[2124]: Try `iptables -h' or 'iptables --help' for more information.) Dec 12 17:39:07.689844 waagent[2124]: 2025-12-12T17:39:07.689812Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: F351570D-264A-4EB5-827B-46F9F86A2C15;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Dec 12 17:39:07.730759 waagent[2124]: 2025-12-12T17:39:07.729977Z INFO MonitorHandler ExtHandler Network interfaces: Dec 12 17:39:07.730759 waagent[2124]: Executing ['ip', '-a', '-o', 'link']: Dec 12 17:39:07.730759 waagent[2124]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Dec 12 17:39:07.730759 waagent[2124]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:c2:b5:20 brd ff:ff:ff:ff:ff:ff Dec 12 17:39:07.730759 waagent[2124]: 3: enP37525s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:c2:b5:20 brd ff:ff:ff:ff:ff:ff\ altname enP37525p0s2 Dec 12 17:39:07.730759 waagent[2124]: Executing ['ip', '-4', '-a', '-o', 'address']: Dec 12 17:39:07.730759 waagent[2124]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Dec 12 17:39:07.730759 waagent[2124]: 2: eth0 inet 10.200.20.10/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever Dec 12 17:39:07.730759 waagent[2124]: Executing ['ip', '-6', '-a', 
'-o', 'address']: Dec 12 17:39:07.730759 waagent[2124]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Dec 12 17:39:07.730759 waagent[2124]: 2: eth0 inet6 fe80::20d:3aff:fec2:b520/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Dec 12 17:39:07.781429 waagent[2124]: 2025-12-12T17:39:07.781386Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Dec 12 17:39:07.781429 waagent[2124]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Dec 12 17:39:07.781429 waagent[2124]: pkts bytes target prot opt in out source destination Dec 12 17:39:07.781429 waagent[2124]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Dec 12 17:39:07.781429 waagent[2124]: pkts bytes target prot opt in out source destination Dec 12 17:39:07.781429 waagent[2124]: Chain OUTPUT (policy ACCEPT 5 packets, 646 bytes) Dec 12 17:39:07.781429 waagent[2124]: pkts bytes target prot opt in out source destination Dec 12 17:39:07.781429 waagent[2124]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Dec 12 17:39:07.781429 waagent[2124]: 4 416 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Dec 12 17:39:07.781429 waagent[2124]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Dec 12 17:39:07.785634 waagent[2124]: 2025-12-12T17:39:07.785349Z INFO EnvHandler ExtHandler Current Firewall rules: Dec 12 17:39:07.785634 waagent[2124]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Dec 12 17:39:07.785634 waagent[2124]: pkts bytes target prot opt in out source destination Dec 12 17:39:07.785634 waagent[2124]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Dec 12 17:39:07.785634 waagent[2124]: pkts bytes target prot opt in out source destination Dec 12 17:39:07.785634 waagent[2124]: Chain OUTPUT (policy ACCEPT 5 packets, 646 bytes) Dec 12 17:39:07.785634 waagent[2124]: pkts bytes target prot opt in out source destination Dec 12 17:39:07.785634 waagent[2124]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 
tcp dpt:53 Dec 12 17:39:07.785634 waagent[2124]: 11 928 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Dec 12 17:39:07.785634 waagent[2124]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Dec 12 17:39:07.785634 waagent[2124]: 2025-12-12T17:39:07.785554Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Dec 12 17:39:13.509157 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 12 17:39:13.510901 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:39:13.616539 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:39:13.626422 (kubelet)[2274]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 17:39:13.712514 kubelet[2274]: E1212 17:39:13.712458 2274 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 17:39:13.715170 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 17:39:13.715359 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 17:39:13.715982 systemd[1]: kubelet.service: Consumed 109ms CPU time, 107.7M memory peak. Dec 12 17:39:18.783879 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 12 17:39:18.785272 systemd[1]: Started sshd@0-10.200.20.10:22-10.200.16.10:55998.service - OpenSSH per-connection server daemon (10.200.16.10:55998). 
Dec 12 17:39:19.381039 sshd[2281]: Accepted publickey for core from 10.200.16.10 port 55998 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:39:19.382083 sshd-session[2281]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:39:19.385755 systemd-logind[1875]: New session 3 of user core. Dec 12 17:39:19.397331 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 12 17:39:19.812846 systemd[1]: Started sshd@1-10.200.20.10:22-10.200.16.10:56000.service - OpenSSH per-connection server daemon (10.200.16.10:56000). Dec 12 17:39:20.266094 sshd[2287]: Accepted publickey for core from 10.200.16.10 port 56000 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:39:20.267122 sshd-session[2287]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:39:20.270863 systemd-logind[1875]: New session 4 of user core. Dec 12 17:39:20.281347 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 12 17:39:20.600331 sshd[2290]: Connection closed by 10.200.16.10 port 56000 Dec 12 17:39:20.601074 sshd-session[2287]: pam_unix(sshd:session): session closed for user core Dec 12 17:39:20.604124 systemd[1]: sshd@1-10.200.20.10:22-10.200.16.10:56000.service: Deactivated successfully. Dec 12 17:39:20.605473 systemd[1]: session-4.scope: Deactivated successfully. Dec 12 17:39:20.606074 systemd-logind[1875]: Session 4 logged out. Waiting for processes to exit. Dec 12 17:39:20.607080 systemd-logind[1875]: Removed session 4. Dec 12 17:39:20.699713 systemd[1]: Started sshd@2-10.200.20.10:22-10.200.16.10:49796.service - OpenSSH per-connection server daemon (10.200.16.10:49796). 
Dec 12 17:39:21.189819 sshd[2296]: Accepted publickey for core from 10.200.16.10 port 49796 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:39:21.190863 sshd-session[2296]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:39:21.194480 systemd-logind[1875]: New session 5 of user core. Dec 12 17:39:21.201526 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 12 17:39:21.539730 sshd[2299]: Connection closed by 10.200.16.10 port 49796 Dec 12 17:39:21.540389 sshd-session[2296]: pam_unix(sshd:session): session closed for user core Dec 12 17:39:21.543757 systemd[1]: sshd@2-10.200.20.10:22-10.200.16.10:49796.service: Deactivated successfully. Dec 12 17:39:21.545120 systemd[1]: session-5.scope: Deactivated successfully. Dec 12 17:39:21.545786 systemd-logind[1875]: Session 5 logged out. Waiting for processes to exit. Dec 12 17:39:21.546936 systemd-logind[1875]: Removed session 5. Dec 12 17:39:21.631664 systemd[1]: Started sshd@3-10.200.20.10:22-10.200.16.10:49812.service - OpenSSH per-connection server daemon (10.200.16.10:49812). Dec 12 17:39:22.130008 sshd[2305]: Accepted publickey for core from 10.200.16.10 port 49812 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:39:22.131039 sshd-session[2305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:39:22.135832 systemd-logind[1875]: New session 6 of user core. Dec 12 17:39:22.141363 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 12 17:39:22.483192 sshd[2308]: Connection closed by 10.200.16.10 port 49812 Dec 12 17:39:22.483643 sshd-session[2305]: pam_unix(sshd:session): session closed for user core Dec 12 17:39:22.487068 systemd[1]: sshd@3-10.200.20.10:22-10.200.16.10:49812.service: Deactivated successfully. Dec 12 17:39:22.488532 systemd[1]: session-6.scope: Deactivated successfully. Dec 12 17:39:22.489137 systemd-logind[1875]: Session 6 logged out. 
Waiting for processes to exit. Dec 12 17:39:22.490481 systemd-logind[1875]: Removed session 6. Dec 12 17:39:22.575784 systemd[1]: Started sshd@4-10.200.20.10:22-10.200.16.10:49822.service - OpenSSH per-connection server daemon (10.200.16.10:49822). Dec 12 17:39:23.072560 sshd[2314]: Accepted publickey for core from 10.200.16.10 port 49822 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:39:23.073612 sshd-session[2314]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:39:23.077144 systemd-logind[1875]: New session 7 of user core. Dec 12 17:39:23.084383 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 12 17:39:23.504613 sudo[2318]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 12 17:39:23.504858 sudo[2318]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 17:39:23.530901 sudo[2318]: pam_unix(sudo:session): session closed for user root Dec 12 17:39:23.602263 sshd[2317]: Connection closed by 10.200.16.10 port 49822 Dec 12 17:39:23.602946 sshd-session[2314]: pam_unix(sshd:session): session closed for user core Dec 12 17:39:23.606780 systemd-logind[1875]: Session 7 logged out. Waiting for processes to exit. Dec 12 17:39:23.607023 systemd[1]: sshd@4-10.200.20.10:22-10.200.16.10:49822.service: Deactivated successfully. Dec 12 17:39:23.609364 systemd[1]: session-7.scope: Deactivated successfully. Dec 12 17:39:23.610553 systemd-logind[1875]: Removed session 7. Dec 12 17:39:23.689942 systemd[1]: Started sshd@5-10.200.20.10:22-10.200.16.10:49834.service - OpenSSH per-connection server daemon (10.200.16.10:49834). Dec 12 17:39:23.965814 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 12 17:39:23.967528 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:39:24.075375 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:39:24.078308 (kubelet)[2335]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:39:24.160927 kubelet[2335]: E1212 17:39:24.160870 2335 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:39:24.163142 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:39:24.163393 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:39:24.165358 systemd[1]: kubelet.service: Consumed 105ms CPU time, 107.3M memory peak.
Dec 12 17:39:24.181563 sshd[2324]: Accepted publickey for core from 10.200.16.10 port 49834 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:39:24.182987 sshd-session[2324]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:39:24.186466 systemd-logind[1875]: New session 8 of user core.
Dec 12 17:39:24.193341 systemd[1]: Started session-8.scope - Session 8 of User core.
Dec 12 17:39:24.456104 sudo[2343]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Dec 12 17:39:24.456580 sudo[2343]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Dec 12 17:39:24.510550 sudo[2343]: pam_unix(sudo:session): session closed for user root
Dec 12 17:39:24.514608 sudo[2342]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Dec 12 17:39:24.514821 sudo[2342]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Dec 12 17:39:24.522314 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Dec 12 17:39:24.550796 augenrules[2365]: No rules
Dec 12 17:39:24.551896 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 12 17:39:24.552301 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 12 17:39:24.553282 sudo[2342]: pam_unix(sudo:session): session closed for user root
Dec 12 17:39:24.630718 sshd[2341]: Connection closed by 10.200.16.10 port 49834
Dec 12 17:39:24.631556 sshd-session[2324]: pam_unix(sshd:session): session closed for user core
Dec 12 17:39:24.634778 systemd-logind[1875]: Session 8 logged out. Waiting for processes to exit.
Dec 12 17:39:24.635316 systemd[1]: sshd@5-10.200.20.10:22-10.200.16.10:49834.service: Deactivated successfully.
Dec 12 17:39:24.636865 systemd[1]: session-8.scope: Deactivated successfully.
Dec 12 17:39:24.638331 systemd-logind[1875]: Removed session 8.
Dec 12 17:39:24.716030 systemd[1]: Started sshd@6-10.200.20.10:22-10.200.16.10:49836.service - OpenSSH per-connection server daemon (10.200.16.10:49836).
Dec 12 17:39:25.169700 sshd[2374]: Accepted publickey for core from 10.200.16.10 port 49836 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:39:25.170851 sshd-session[2374]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:39:25.174987 systemd-logind[1875]: New session 9 of user core.
Dec 12 17:39:25.184344 systemd[1]: Started session-9.scope - Session 9 of User core.
Dec 12 17:39:25.425809 sudo[2378]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Dec 12 17:39:25.426016 sudo[2378]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Dec 12 17:39:25.744286 chronyd[1849]: Selected source PHC0
Dec 12 17:39:26.851421 systemd[1]: Starting docker.service - Docker Application Container Engine...
Dec 12 17:39:26.863655 (dockerd)[2395]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Dec 12 17:39:27.682246 dockerd[2395]: time="2025-12-12T17:39:27.681908707Z" level=info msg="Starting up"
Dec 12 17:39:27.682758 dockerd[2395]: time="2025-12-12T17:39:27.682722371Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Dec 12 17:39:27.691320 dockerd[2395]: time="2025-12-12T17:39:27.691282354Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Dec 12 17:39:27.728408 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1824473974-merged.mount: Deactivated successfully.
Dec 12 17:39:27.761386 systemd[1]: var-lib-docker-metacopy\x2dcheck2873416596-merged.mount: Deactivated successfully.
Dec 12 17:39:27.776561 dockerd[2395]: time="2025-12-12T17:39:27.776520095Z" level=info msg="Loading containers: start."
Dec 12 17:39:27.804799 kernel: Initializing XFRM netlink socket
Dec 12 17:39:28.095122 systemd-networkd[1491]: docker0: Link UP
Dec 12 17:39:28.110133 dockerd[2395]: time="2025-12-12T17:39:28.110087031Z" level=info msg="Loading containers: done."
Dec 12 17:39:28.131090 dockerd[2395]: time="2025-12-12T17:39:28.131045719Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Dec 12 17:39:28.131257 dockerd[2395]: time="2025-12-12T17:39:28.131123735Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Dec 12 17:39:28.131257 dockerd[2395]: time="2025-12-12T17:39:28.131199743Z" level=info msg="Initializing buildkit"
Dec 12 17:39:28.174852 dockerd[2395]: time="2025-12-12T17:39:28.174808423Z" level=info msg="Completed buildkit initialization"
Dec 12 17:39:28.180926 dockerd[2395]: time="2025-12-12T17:39:28.180887335Z" level=info msg="Daemon has completed initialization"
Dec 12 17:39:28.181402 dockerd[2395]: time="2025-12-12T17:39:28.181174719Z" level=info msg="API listen on /run/docker.sock"
Dec 12 17:39:28.181085 systemd[1]: Started docker.service - Docker Application Container Engine.
Dec 12 17:39:28.726091 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2783690997-merged.mount: Deactivated successfully.
Dec 12 17:39:29.160837 containerd[1898]: time="2025-12-12T17:39:29.160614407Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\""
Dec 12 17:39:30.052196 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1650732192.mount: Deactivated successfully.
Dec 12 17:39:31.196202 containerd[1898]: time="2025-12-12T17:39:31.196092359Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:31.198786 containerd[1898]: time="2025-12-12T17:39:31.198749399Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.10: active requests=0, bytes read=26431959"
Dec 12 17:39:31.203206 containerd[1898]: time="2025-12-12T17:39:31.203161407Z" level=info msg="ImageCreate event name:\"sha256:03aec5fd5841efdd990b8fe285e036fc1386e2f8851378ce2c9dfd1b331897ea\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:31.208279 containerd[1898]: time="2025-12-12T17:39:31.207628543Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:31.208279 containerd[1898]: time="2025-12-12T17:39:31.208106559Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.10\" with image id \"sha256:03aec5fd5841efdd990b8fe285e036fc1386e2f8851378ce2c9dfd1b331897ea\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\", size \"26428558\" in 2.047387416s"
Dec 12 17:39:31.208279 containerd[1898]: time="2025-12-12T17:39:31.208134255Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\" returns image reference \"sha256:03aec5fd5841efdd990b8fe285e036fc1386e2f8851378ce2c9dfd1b331897ea\""
Dec 12 17:39:31.208767 containerd[1898]: time="2025-12-12T17:39:31.208731967Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\""
Dec 12 17:39:32.531505 containerd[1898]: time="2025-12-12T17:39:32.531436031Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:32.534850 containerd[1898]: time="2025-12-12T17:39:32.534819695Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.10: active requests=0, bytes read=22618955"
Dec 12 17:39:32.538613 containerd[1898]: time="2025-12-12T17:39:32.538591943Z" level=info msg="ImageCreate event name:\"sha256:66490a6490dde2df4a78eba21320da67070ad88461899536880edb5301ec2ba3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:32.543101 containerd[1898]: time="2025-12-12T17:39:32.543051583Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:32.543847 containerd[1898]: time="2025-12-12T17:39:32.543565783Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.10\" with image id \"sha256:66490a6490dde2df4a78eba21320da67070ad88461899536880edb5301ec2ba3\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\", size \"24203439\" in 1.334646104s"
Dec 12 17:39:32.543847 containerd[1898]: time="2025-12-12T17:39:32.543592247Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\" returns image reference \"sha256:66490a6490dde2df4a78eba21320da67070ad88461899536880edb5301ec2ba3\""
Dec 12 17:39:32.544058 containerd[1898]: time="2025-12-12T17:39:32.543995783Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\""
Dec 12 17:39:33.751510 containerd[1898]: time="2025-12-12T17:39:33.751454735Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:33.755514 containerd[1898]: time="2025-12-12T17:39:33.755480999Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.10: active requests=0, bytes read=17618436"
Dec 12 17:39:33.758372 containerd[1898]: time="2025-12-12T17:39:33.758344654Z" level=info msg="ImageCreate event name:\"sha256:fcf368a1abd0b48cff2fd3cca12fcc008aaf52eeab885656f11e7773c6a188a3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:33.765244 containerd[1898]: time="2025-12-12T17:39:33.765205259Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:33.766256 containerd[1898]: time="2025-12-12T17:39:33.766204365Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.10\" with image id \"sha256:fcf368a1abd0b48cff2fd3cca12fcc008aaf52eeab885656f11e7773c6a188a3\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\", size \"19202938\" in 1.222161292s"
Dec 12 17:39:33.766283 containerd[1898]: time="2025-12-12T17:39:33.766261610Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\" returns image reference \"sha256:fcf368a1abd0b48cff2fd3cca12fcc008aaf52eeab885656f11e7773c6a188a3\""
Dec 12 17:39:33.766673 containerd[1898]: time="2025-12-12T17:39:33.766649679Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\""
Dec 12 17:39:34.413641 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Dec 12 17:39:34.415315 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:39:34.528958 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:39:34.536589 (kubelet)[2675]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 17:39:34.764902 kubelet[2675]: E1212 17:39:34.764778 2675 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 17:39:34.767204 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 17:39:34.767336 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 17:39:34.767629 systemd[1]: kubelet.service: Consumed 306ms CPU time, 105.3M memory peak.
Dec 12 17:39:35.317449 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3141146745.mount: Deactivated successfully.
Dec 12 17:39:35.592889 containerd[1898]: time="2025-12-12T17:39:35.592371684Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:35.594950 containerd[1898]: time="2025-12-12T17:39:35.594913917Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.10: active requests=0, bytes read=27561799"
Dec 12 17:39:35.597914 containerd[1898]: time="2025-12-12T17:39:35.597869659Z" level=info msg="ImageCreate event name:\"sha256:8b57c1f8bd2ddfa793889457b41e87132f192046e262b32ab0514f32d28be47d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:35.607827 containerd[1898]: time="2025-12-12T17:39:35.607770015Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:35.608241 containerd[1898]: time="2025-12-12T17:39:35.608067569Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.10\" with image id \"sha256:8b57c1f8bd2ddfa793889457b41e87132f192046e262b32ab0514f32d28be47d\", repo tag \"registry.k8s.io/kube-proxy:v1.32.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\", size \"27560818\" in 1.841392574s"
Dec 12 17:39:35.608241 containerd[1898]: time="2025-12-12T17:39:35.608095586Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\" returns image reference \"sha256:8b57c1f8bd2ddfa793889457b41e87132f192046e262b32ab0514f32d28be47d\""
Dec 12 17:39:35.608740 containerd[1898]: time="2025-12-12T17:39:35.608717261Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Dec 12 17:39:36.313473 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1803907695.mount: Deactivated successfully.
Dec 12 17:39:37.746001 containerd[1898]: time="2025-12-12T17:39:37.745940574Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:37.751616 containerd[1898]: time="2025-12-12T17:39:37.751571202Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951622"
Dec 12 17:39:37.754724 containerd[1898]: time="2025-12-12T17:39:37.754507455Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:37.758497 containerd[1898]: time="2025-12-12T17:39:37.758468686Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:37.759035 containerd[1898]: time="2025-12-12T17:39:37.759008471Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 2.150200751s"
Dec 12 17:39:37.759035 containerd[1898]: time="2025-12-12T17:39:37.759035912Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Dec 12 17:39:37.759921 containerd[1898]: time="2025-12-12T17:39:37.759896820Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Dec 12 17:39:38.282734 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount333462496.mount: Deactivated successfully.
Dec 12 17:39:38.305145 containerd[1898]: time="2025-12-12T17:39:38.305098253Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 12 17:39:38.308888 containerd[1898]: time="2025-12-12T17:39:38.308855757Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
Dec 12 17:39:38.311981 containerd[1898]: time="2025-12-12T17:39:38.311948288Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 12 17:39:38.356503 containerd[1898]: time="2025-12-12T17:39:38.356432276Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 12 17:39:38.357178 containerd[1898]: time="2025-12-12T17:39:38.357023158Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 597.098666ms"
Dec 12 17:39:38.357178 containerd[1898]: time="2025-12-12T17:39:38.357052959Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Dec 12 17:39:38.357876 containerd[1898]: time="2025-12-12T17:39:38.357678923Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Dec 12 17:39:39.033095 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1577199030.mount: Deactivated successfully.
Dec 12 17:39:41.253165 containerd[1898]: time="2025-12-12T17:39:41.253119933Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:41.255852 containerd[1898]: time="2025-12-12T17:39:41.255821739Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67943165"
Dec 12 17:39:41.259916 containerd[1898]: time="2025-12-12T17:39:41.259874476Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:41.295547 containerd[1898]: time="2025-12-12T17:39:41.295497533Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:39:41.296490 containerd[1898]: time="2025-12-12T17:39:41.296133586Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.938425749s"
Dec 12 17:39:41.296490 containerd[1898]: time="2025-12-12T17:39:41.296492309Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\""
Dec 12 17:39:43.669533 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:39:43.669644 systemd[1]: kubelet.service: Consumed 306ms CPU time, 105.3M memory peak.
Dec 12 17:39:43.673399 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:39:43.694952 systemd[1]: Reload requested from client PID 2828 ('systemctl') (unit session-9.scope)...
Dec 12 17:39:43.694964 systemd[1]: Reloading...
Dec 12 17:39:43.781260 zram_generator::config[2881]: No configuration found.
Dec 12 17:39:43.932247 systemd[1]: Reloading finished in 237 ms.
Dec 12 17:39:43.978556 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:39:43.980475 systemd[1]: kubelet.service: Deactivated successfully.
Dec 12 17:39:43.980647 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:39:43.980682 systemd[1]: kubelet.service: Consumed 80ms CPU time, 95M memory peak.
Dec 12 17:39:43.984415 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 17:39:44.272182 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 17:39:44.275787 (kubelet)[2944]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Dec 12 17:39:44.425253 kubelet[2944]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 12 17:39:44.425253 kubelet[2944]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Dec 12 17:39:44.425253 kubelet[2944]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 12 17:39:44.425253 kubelet[2944]: I1212 17:39:44.424300 2944 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 12 17:39:44.640688 kubelet[2944]: I1212 17:39:44.640310 2944 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Dec 12 17:39:44.640830 kubelet[2944]: I1212 17:39:44.640815 2944 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 12 17:39:44.641126 kubelet[2944]: I1212 17:39:44.641107 2944 server.go:954] "Client rotation is on, will bootstrap in background"
Dec 12 17:39:44.664527 kubelet[2944]: E1212 17:39:44.664497 2944 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.20.10:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.10:6443: connect: connection refused" logger="UnhandledError"
Dec 12 17:39:44.666329 kubelet[2944]: I1212 17:39:44.666310 2944 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 12 17:39:44.669821 kubelet[2944]: I1212 17:39:44.669738 2944 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 12 17:39:44.672290 kubelet[2944]: I1212 17:39:44.672266 2944 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Dec 12 17:39:44.672564 kubelet[2944]: I1212 17:39:44.672539 2944 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 12 17:39:44.673078 kubelet[2944]: I1212 17:39:44.672625 2944 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.2.2-a-260bc0236d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 12 17:39:44.673540 kubelet[2944]: I1212 17:39:44.673523 2944 topology_manager.go:138] "Creating topology manager with none policy"
Dec 12 17:39:44.673614 kubelet[2944]: I1212 17:39:44.673605 2944 container_manager_linux.go:304] "Creating device plugin manager"
Dec 12 17:39:44.673777 kubelet[2944]: I1212 17:39:44.673765 2944 state_mem.go:36] "Initialized new in-memory state store"
Dec 12 17:39:44.676470 kubelet[2944]: I1212 17:39:44.676455 2944 kubelet.go:446] "Attempting to sync node with API server"
Dec 12 17:39:44.676645 kubelet[2944]: I1212 17:39:44.676632 2944 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 12 17:39:44.676727 kubelet[2944]: I1212 17:39:44.676719 2944 kubelet.go:352] "Adding apiserver pod source"
Dec 12 17:39:44.676781 kubelet[2944]: I1212 17:39:44.676772 2944 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 12 17:39:44.681563 kubelet[2944]: W1212 17:39:44.681521 2944 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.10:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.2.2-a-260bc0236d&limit=500&resourceVersion=0": dial tcp 10.200.20.10:6443: connect: connection refused
Dec 12 17:39:44.681634 kubelet[2944]: E1212 17:39:44.681578 2944 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.20.10:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.2.2-a-260bc0236d&limit=500&resourceVersion=0\": dial tcp 10.200.20.10:6443: connect: connection refused" logger="UnhandledError"
Dec 12 17:39:44.681655 kubelet[2944]: I1212 17:39:44.681634 2944 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Dec 12 17:39:44.681940 kubelet[2944]: I1212 17:39:44.681919 2944 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 12 17:39:44.681973 kubelet[2944]: W1212 17:39:44.681966 2944 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Dec 12 17:39:44.682425 kubelet[2944]: I1212 17:39:44.682403 2944 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Dec 12 17:39:44.682470 kubelet[2944]: I1212 17:39:44.682438 2944 server.go:1287] "Started kubelet"
Dec 12 17:39:44.686511 kubelet[2944]: W1212 17:39:44.686484 2944 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.20.10:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.20.10:6443: connect: connection refused
Dec 12 17:39:44.686651 kubelet[2944]: E1212 17:39:44.686638 2944 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.20.10:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.10:6443: connect: connection refused" logger="UnhandledError"
Dec 12 17:39:44.686826 kubelet[2944]: E1212 17:39:44.686742 2944 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.10:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.10:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459.2.2-a-260bc0236d.1880888b0a36a441 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459.2.2-a-260bc0236d,UID:ci-4459.2.2-a-260bc0236d,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459.2.2-a-260bc0236d,},FirstTimestamp:2025-12-12 17:39:44.682419265 +0000 UTC m=+0.403890935,LastTimestamp:2025-12-12 17:39:44.682419265 +0000 UTC m=+0.403890935,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459.2.2-a-260bc0236d,}"
Dec 12 17:39:44.688344 kubelet[2944]: I1212 17:39:44.688307 2944 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Dec 12 17:39:44.689187 kubelet[2944]: I1212 17:39:44.689155 2944 server.go:479] "Adding debug handlers to kubelet server"
Dec 12 17:39:44.689597 kubelet[2944]: I1212 17:39:44.689579 2944 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 12 17:39:44.690741 kubelet[2944]: I1212 17:39:44.688084 2944 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 12 17:39:44.690900 kubelet[2944]: I1212 17:39:44.690882 2944 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 12 17:39:44.691186 kubelet[2944]: I1212 17:39:44.691160 2944 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Dec 12 17:39:44.692635 kubelet[2944]: E1212 17:39:44.692617 2944 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459.2.2-a-260bc0236d\" not found"
Dec 12 17:39:44.693876 kubelet[2944]: I1212 17:39:44.693853 2944 volume_manager.go:297] "Starting Kubelet Volume Manager"
Dec 12 17:39:44.693937 kubelet[2944]: I1212 17:39:44.693517 2944 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Dec 12 17:39:44.694029 kubelet[2944]: I1212 17:39:44.694014 2944 reconciler.go:26] "Reconciler: start to sync state"
Dec 12 17:39:44.694645 kubelet[2944]: E1212 17:39:44.694547 2944 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.10:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.2-a-260bc0236d?timeout=10s\": dial tcp 10.200.20.10:6443: connect: connection refused" interval="200ms"
Dec 12 17:39:44.694645 kubelet[2944]: W1212 17:39:44.694603 2944 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.20.10:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.10:6443: connect: connection refused
Dec 12 17:39:44.694645 kubelet[2944]: E1212 17:39:44.694629 2944 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.20.10:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.10:6443: connect: connection refused" logger="UnhandledError"
Dec 12 17:39:44.696120 kubelet[2944]: I1212 17:39:44.696061 2944 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Dec 12 17:39:44.697214 kubelet[2944]: E1212 17:39:44.697192 2944 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Dec 12 17:39:44.697320 kubelet[2944]: I1212 17:39:44.697304 2944 factory.go:221] Registration of the containerd container factory successfully
Dec 12 17:39:44.697320 kubelet[2944]: I1212 17:39:44.697316 2944 factory.go:221] Registration of the systemd container factory successfully
Dec 12 17:39:44.721347 kubelet[2944]: I1212 17:39:44.721324 2944 cpu_manager.go:221] "Starting CPU manager" policy="none"
Dec 12 17:39:44.721433 kubelet[2944]: I1212 17:39:44.721363 2944 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Dec 12 17:39:44.721433 kubelet[2944]: I1212 17:39:44.721380 2944 state_mem.go:36] "Initialized new in-memory state store"
Dec 12 17:39:44.792872 kubelet[2944]: E1212 17:39:44.792825 2944 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459.2.2-a-260bc0236d\" not found"
Dec 12 17:39:44.860551 kubelet[2944]: I1212 17:39:44.860515 2944 policy_none.go:49] "None policy: Start"
Dec 12 17:39:44.860551 kubelet[2944]: I1212 17:39:44.860554 2944 memory_manager.go:186] "Starting memorymanager" policy="None"
Dec 12 17:39:44.860683 kubelet[2944]: I1212 17:39:44.860574 2944 state_mem.go:35] "Initializing new in-memory state store"
Dec 12 17:39:44.871294 kubelet[2944]: I1212 17:39:44.871258 2944 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 12 17:39:44.872920 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Dec 12 17:39:44.874090 kubelet[2944]: I1212 17:39:44.873984 2944 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 12 17:39:44.874248 kubelet[2944]: I1212 17:39:44.874174 2944 status_manager.go:227] "Starting to sync pod status with apiserver"
Dec 12 17:39:44.874248 kubelet[2944]: I1212 17:39:44.874197 2944 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Dec 12 17:39:44.874248 kubelet[2944]: I1212 17:39:44.874202 2944 kubelet.go:2382] "Starting kubelet main sync loop" Dec 12 17:39:44.874557 kubelet[2944]: E1212 17:39:44.874505 2944 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 17:39:44.876442 kubelet[2944]: W1212 17:39:44.876341 2944 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.10:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.10:6443: connect: connection refused Dec 12 17:39:44.876442 kubelet[2944]: E1212 17:39:44.876373 2944 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.20.10:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.10:6443: connect: connection refused" logger="UnhandledError" Dec 12 17:39:44.882510 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 12 17:39:44.884898 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Dec 12 17:39:44.893574 kubelet[2944]: E1212 17:39:44.892928 2944 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459.2.2-a-260bc0236d\" not found" Dec 12 17:39:44.895751 kubelet[2944]: E1212 17:39:44.895724 2944 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.10:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.2-a-260bc0236d?timeout=10s\": dial tcp 10.200.20.10:6443: connect: connection refused" interval="400ms" Dec 12 17:39:44.899659 kubelet[2944]: I1212 17:39:44.899068 2944 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 12 17:39:44.899659 kubelet[2944]: I1212 17:39:44.899272 2944 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 17:39:44.899659 kubelet[2944]: I1212 17:39:44.899284 2944 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 17:39:44.899659 kubelet[2944]: I1212 17:39:44.899564 2944 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 17:39:44.900695 kubelet[2944]: E1212 17:39:44.900677 2944 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 12 17:39:44.900824 kubelet[2944]: E1212 17:39:44.900812 2944 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459.2.2-a-260bc0236d\" not found" Dec 12 17:39:44.984104 systemd[1]: Created slice kubepods-burstable-podb0e6831dab289b6ec222f243e5c8be1f.slice - libcontainer container kubepods-burstable-podb0e6831dab289b6ec222f243e5c8be1f.slice. 
Dec 12 17:39:44.994992 kubelet[2944]: E1212 17:39:44.994853 2944 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-260bc0236d\" not found" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:44.995526 kubelet[2944]: I1212 17:39:44.995508 2944 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b0e6831dab289b6ec222f243e5c8be1f-ca-certs\") pod \"kube-apiserver-ci-4459.2.2-a-260bc0236d\" (UID: \"b0e6831dab289b6ec222f243e5c8be1f\") " pod="kube-system/kube-apiserver-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:44.995967 kubelet[2944]: I1212 17:39:44.995723 2944 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f0b5f09531fd673cc66127691e1e08b8-ca-certs\") pod \"kube-controller-manager-ci-4459.2.2-a-260bc0236d\" (UID: \"f0b5f09531fd673cc66127691e1e08b8\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:44.995967 kubelet[2944]: I1212 17:39:44.995926 2944 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f0b5f09531fd673cc66127691e1e08b8-k8s-certs\") pod \"kube-controller-manager-ci-4459.2.2-a-260bc0236d\" (UID: \"f0b5f09531fd673cc66127691e1e08b8\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:44.995967 kubelet[2944]: I1212 17:39:44.995946 2944 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f0b5f09531fd673cc66127691e1e08b8-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.2.2-a-260bc0236d\" (UID: \"f0b5f09531fd673cc66127691e1e08b8\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:44.996189 
kubelet[2944]: I1212 17:39:44.996124 2944 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b0e6831dab289b6ec222f243e5c8be1f-k8s-certs\") pod \"kube-apiserver-ci-4459.2.2-a-260bc0236d\" (UID: \"b0e6831dab289b6ec222f243e5c8be1f\") " pod="kube-system/kube-apiserver-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:44.996189 kubelet[2944]: I1212 17:39:44.996155 2944 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b0e6831dab289b6ec222f243e5c8be1f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.2-a-260bc0236d\" (UID: \"b0e6831dab289b6ec222f243e5c8be1f\") " pod="kube-system/kube-apiserver-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:44.996189 kubelet[2944]: I1212 17:39:44.996176 2944 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f0b5f09531fd673cc66127691e1e08b8-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.2.2-a-260bc0236d\" (UID: \"f0b5f09531fd673cc66127691e1e08b8\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:44.996362 kubelet[2944]: I1212 17:39:44.996302 2944 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f0b5f09531fd673cc66127691e1e08b8-kubeconfig\") pod \"kube-controller-manager-ci-4459.2.2-a-260bc0236d\" (UID: \"f0b5f09531fd673cc66127691e1e08b8\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:44.996362 kubelet[2944]: I1212 17:39:44.996328 2944 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f7d54f61c7ce63e35b0b8d7b6da0e59d-kubeconfig\") pod 
\"kube-scheduler-ci-4459.2.2-a-260bc0236d\" (UID: \"f7d54f61c7ce63e35b0b8d7b6da0e59d\") " pod="kube-system/kube-scheduler-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:44.997874 systemd[1]: Created slice kubepods-burstable-podf0b5f09531fd673cc66127691e1e08b8.slice - libcontainer container kubepods-burstable-podf0b5f09531fd673cc66127691e1e08b8.slice. Dec 12 17:39:45.000921 kubelet[2944]: E1212 17:39:45.000865 2944 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-260bc0236d\" not found" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:45.003149 systemd[1]: Created slice kubepods-burstable-podf7d54f61c7ce63e35b0b8d7b6da0e59d.slice - libcontainer container kubepods-burstable-podf7d54f61c7ce63e35b0b8d7b6da0e59d.slice. Dec 12 17:39:45.003329 kubelet[2944]: I1212 17:39:45.003272 2944 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:45.003627 kubelet[2944]: E1212 17:39:45.003587 2944 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.10:6443/api/v1/nodes\": dial tcp 10.200.20.10:6443: connect: connection refused" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:45.008133 kubelet[2944]: E1212 17:39:45.007975 2944 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-260bc0236d\" not found" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:45.008275 kernel: hv_balloon: Max. 
dynamic memory size: 4096 MB Dec 12 17:39:45.205406 kubelet[2944]: I1212 17:39:45.205212 2944 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:45.205817 kubelet[2944]: E1212 17:39:45.205623 2944 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.10:6443/api/v1/nodes\": dial tcp 10.200.20.10:6443: connect: connection refused" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:45.296303 kubelet[2944]: E1212 17:39:45.296249 2944 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.10:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.2.2-a-260bc0236d?timeout=10s\": dial tcp 10.200.20.10:6443: connect: connection refused" interval="800ms" Dec 12 17:39:45.296781 containerd[1898]: time="2025-12-12T17:39:45.296744188Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.2-a-260bc0236d,Uid:b0e6831dab289b6ec222f243e5c8be1f,Namespace:kube-system,Attempt:0,}" Dec 12 17:39:45.302385 containerd[1898]: time="2025-12-12T17:39:45.302231585Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.2-a-260bc0236d,Uid:f0b5f09531fd673cc66127691e1e08b8,Namespace:kube-system,Attempt:0,}" Dec 12 17:39:45.309017 containerd[1898]: time="2025-12-12T17:39:45.308993605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.2-a-260bc0236d,Uid:f7d54f61c7ce63e35b0b8d7b6da0e59d,Namespace:kube-system,Attempt:0,}" Dec 12 17:39:45.351893 containerd[1898]: time="2025-12-12T17:39:45.351707397Z" level=info msg="connecting to shim 1b21a3829fdf98f75459ee80da68d2d9dddbbd311db9e323d685e5478f0cb772" address="unix:///run/containerd/s/930ae51ed36cd3f7501e4b120c6c5084fd0fda30168bcfd70549c2e8e8126ca5" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:39:45.374810 containerd[1898]: time="2025-12-12T17:39:45.374766594Z" level=info msg="connecting to shim 
9be02451df9e5fde6185c67c80349b6478ec9ed6993233816b3060aa5e512ef9" address="unix:///run/containerd/s/38bcaa89be45d93f3e62b5dd338410e2b2c8251eba404420f97b628cb3562669" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:39:45.378375 systemd[1]: Started cri-containerd-1b21a3829fdf98f75459ee80da68d2d9dddbbd311db9e323d685e5478f0cb772.scope - libcontainer container 1b21a3829fdf98f75459ee80da68d2d9dddbbd311db9e323d685e5478f0cb772. Dec 12 17:39:45.402357 systemd[1]: Started cri-containerd-9be02451df9e5fde6185c67c80349b6478ec9ed6993233816b3060aa5e512ef9.scope - libcontainer container 9be02451df9e5fde6185c67c80349b6478ec9ed6993233816b3060aa5e512ef9. Dec 12 17:39:45.404147 containerd[1898]: time="2025-12-12T17:39:45.404090431Z" level=info msg="connecting to shim 66514cd221fad20a5058d28b13b1550695be2aeeaa2e9d60824243d2b2ac81dd" address="unix:///run/containerd/s/74509aa387f3ab2fdbd0e3c3c3d411f31523e13a2a14bfb8827d0d3faad166cb" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:39:45.429453 systemd[1]: Started cri-containerd-66514cd221fad20a5058d28b13b1550695be2aeeaa2e9d60824243d2b2ac81dd.scope - libcontainer container 66514cd221fad20a5058d28b13b1550695be2aeeaa2e9d60824243d2b2ac81dd. 
Dec 12 17:39:45.485664 containerd[1898]: time="2025-12-12T17:39:45.485456238Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.2.2-a-260bc0236d,Uid:b0e6831dab289b6ec222f243e5c8be1f,Namespace:kube-system,Attempt:0,} returns sandbox id \"1b21a3829fdf98f75459ee80da68d2d9dddbbd311db9e323d685e5478f0cb772\"" Dec 12 17:39:45.488980 containerd[1898]: time="2025-12-12T17:39:45.488416062Z" level=info msg="CreateContainer within sandbox \"1b21a3829fdf98f75459ee80da68d2d9dddbbd311db9e323d685e5478f0cb772\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 12 17:39:45.489747 containerd[1898]: time="2025-12-12T17:39:45.489723782Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.2.2-a-260bc0236d,Uid:f0b5f09531fd673cc66127691e1e08b8,Namespace:kube-system,Attempt:0,} returns sandbox id \"9be02451df9e5fde6185c67c80349b6478ec9ed6993233816b3060aa5e512ef9\"" Dec 12 17:39:45.491624 containerd[1898]: time="2025-12-12T17:39:45.491592583Z" level=info msg="CreateContainer within sandbox \"9be02451df9e5fde6185c67c80349b6478ec9ed6993233816b3060aa5e512ef9\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 12 17:39:45.493240 containerd[1898]: time="2025-12-12T17:39:45.493210709Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.2.2-a-260bc0236d,Uid:f7d54f61c7ce63e35b0b8d7b6da0e59d,Namespace:kube-system,Attempt:0,} returns sandbox id \"66514cd221fad20a5058d28b13b1550695be2aeeaa2e9d60824243d2b2ac81dd\"" Dec 12 17:39:45.495020 containerd[1898]: time="2025-12-12T17:39:45.494994770Z" level=info msg="CreateContainer within sandbox \"66514cd221fad20a5058d28b13b1550695be2aeeaa2e9d60824243d2b2ac81dd\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 12 17:39:45.517006 kubelet[2944]: W1212 17:39:45.516955 2944 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://10.200.20.10:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.20.10:6443: connect: connection refused Dec 12 17:39:45.535058 kubelet[2944]: E1212 17:39:45.517014 2944 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.20.10:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.10:6443: connect: connection refused" logger="UnhandledError" Dec 12 17:39:45.554517 containerd[1898]: time="2025-12-12T17:39:45.554447134Z" level=info msg="Container 54813a7dee9a921b08256f1598c6b60937c2ffd679fbfdab4ead21893c8e5c26: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:39:45.561569 containerd[1898]: time="2025-12-12T17:39:45.561504463Z" level=info msg="Container b10a5736bc578a822b5e9571164f791573c5d8aa46d326446efaf5ccfe01d856: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:39:45.567494 containerd[1898]: time="2025-12-12T17:39:45.567465241Z" level=info msg="Container 97961617c42cb020c235efe6617f5770ec1525f8731b80a267db254d3b20587c: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:39:45.593377 containerd[1898]: time="2025-12-12T17:39:45.593342864Z" level=info msg="CreateContainer within sandbox \"1b21a3829fdf98f75459ee80da68d2d9dddbbd311db9e323d685e5478f0cb772\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"54813a7dee9a921b08256f1598c6b60937c2ffd679fbfdab4ead21893c8e5c26\"" Dec 12 17:39:45.597246 containerd[1898]: time="2025-12-12T17:39:45.596375883Z" level=info msg="CreateContainer within sandbox \"66514cd221fad20a5058d28b13b1550695be2aeeaa2e9d60824243d2b2ac81dd\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"b10a5736bc578a822b5e9571164f791573c5d8aa46d326446efaf5ccfe01d856\"" Dec 12 17:39:45.597246 containerd[1898]: time="2025-12-12T17:39:45.596551563Z" level=info 
msg="StartContainer for \"54813a7dee9a921b08256f1598c6b60937c2ffd679fbfdab4ead21893c8e5c26\"" Dec 12 17:39:45.597475 containerd[1898]: time="2025-12-12T17:39:45.597453114Z" level=info msg="connecting to shim 54813a7dee9a921b08256f1598c6b60937c2ffd679fbfdab4ead21893c8e5c26" address="unix:///run/containerd/s/930ae51ed36cd3f7501e4b120c6c5084fd0fda30168bcfd70549c2e8e8126ca5" protocol=ttrpc version=3 Dec 12 17:39:45.598358 containerd[1898]: time="2025-12-12T17:39:45.598340168Z" level=info msg="StartContainer for \"b10a5736bc578a822b5e9571164f791573c5d8aa46d326446efaf5ccfe01d856\"" Dec 12 17:39:45.599817 containerd[1898]: time="2025-12-12T17:39:45.599789391Z" level=info msg="connecting to shim b10a5736bc578a822b5e9571164f791573c5d8aa46d326446efaf5ccfe01d856" address="unix:///run/containerd/s/74509aa387f3ab2fdbd0e3c3c3d411f31523e13a2a14bfb8827d0d3faad166cb" protocol=ttrpc version=3 Dec 12 17:39:45.603554 containerd[1898]: time="2025-12-12T17:39:45.603526297Z" level=info msg="CreateContainer within sandbox \"9be02451df9e5fde6185c67c80349b6478ec9ed6993233816b3060aa5e512ef9\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"97961617c42cb020c235efe6617f5770ec1525f8731b80a267db254d3b20587c\"" Dec 12 17:39:45.604586 containerd[1898]: time="2025-12-12T17:39:45.604562693Z" level=info msg="StartContainer for \"97961617c42cb020c235efe6617f5770ec1525f8731b80a267db254d3b20587c\"" Dec 12 17:39:45.605335 containerd[1898]: time="2025-12-12T17:39:45.605313366Z" level=info msg="connecting to shim 97961617c42cb020c235efe6617f5770ec1525f8731b80a267db254d3b20587c" address="unix:///run/containerd/s/38bcaa89be45d93f3e62b5dd338410e2b2c8251eba404420f97b628cb3562669" protocol=ttrpc version=3 Dec 12 17:39:45.608546 kubelet[2944]: I1212 17:39:45.608519 2944 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:45.608937 kubelet[2944]: E1212 17:39:45.608915 2944 kubelet_node_status.go:107] "Unable to register node 
with API server" err="Post \"https://10.200.20.10:6443/api/v1/nodes\": dial tcp 10.200.20.10:6443: connect: connection refused" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:45.623386 systemd[1]: Started cri-containerd-b10a5736bc578a822b5e9571164f791573c5d8aa46d326446efaf5ccfe01d856.scope - libcontainer container b10a5736bc578a822b5e9571164f791573c5d8aa46d326446efaf5ccfe01d856. Dec 12 17:39:45.633438 systemd[1]: Started cri-containerd-54813a7dee9a921b08256f1598c6b60937c2ffd679fbfdab4ead21893c8e5c26.scope - libcontainer container 54813a7dee9a921b08256f1598c6b60937c2ffd679fbfdab4ead21893c8e5c26. Dec 12 17:39:45.634384 systemd[1]: Started cri-containerd-97961617c42cb020c235efe6617f5770ec1525f8731b80a267db254d3b20587c.scope - libcontainer container 97961617c42cb020c235efe6617f5770ec1525f8731b80a267db254d3b20587c. Dec 12 17:39:45.696973 kubelet[2944]: W1212 17:39:45.696910 2944 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.10:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.10:6443: connect: connection refused Dec 12 17:39:45.697094 kubelet[2944]: E1212 17:39:45.697069 2944 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.20.10:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.10:6443: connect: connection refused" logger="UnhandledError" Dec 12 17:39:45.701407 containerd[1898]: time="2025-12-12T17:39:45.701331391Z" level=info msg="StartContainer for \"b10a5736bc578a822b5e9571164f791573c5d8aa46d326446efaf5ccfe01d856\" returns successfully" Dec 12 17:39:45.701820 containerd[1898]: time="2025-12-12T17:39:45.701774090Z" level=info msg="StartContainer for \"54813a7dee9a921b08256f1598c6b60937c2ffd679fbfdab4ead21893c8e5c26\" returns successfully" Dec 12 17:39:45.702655 containerd[1898]: 
time="2025-12-12T17:39:45.702615286Z" level=info msg="StartContainer for \"97961617c42cb020c235efe6617f5770ec1525f8731b80a267db254d3b20587c\" returns successfully" Dec 12 17:39:45.884231 kubelet[2944]: E1212 17:39:45.882775 2944 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-260bc0236d\" not found" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:45.885725 kubelet[2944]: E1212 17:39:45.885700 2944 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-260bc0236d\" not found" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:45.888565 kubelet[2944]: E1212 17:39:45.888543 2944 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-260bc0236d\" not found" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:46.411892 kubelet[2944]: I1212 17:39:46.411385 2944 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:46.877959 kubelet[2944]: E1212 17:39:46.877919 2944 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459.2.2-a-260bc0236d\" not found" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:46.891082 kubelet[2944]: E1212 17:39:46.891052 2944 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-260bc0236d\" not found" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:46.891338 kubelet[2944]: E1212 17:39:46.891322 2944 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.2.2-a-260bc0236d\" not found" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:46.979341 kubelet[2944]: I1212 17:39:46.979304 2944 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:46.995233 kubelet[2944]: I1212 17:39:46.995189 2944 
kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:47.006453 kubelet[2944]: E1212 17:39:47.006420 2944 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.2-a-260bc0236d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:47.006453 kubelet[2944]: I1212 17:39:47.006447 2944 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:47.007606 kubelet[2944]: E1212 17:39:47.007579 2944 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.2.2-a-260bc0236d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:47.007606 kubelet[2944]: I1212 17:39:47.007600 2944 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:47.008722 kubelet[2944]: E1212 17:39:47.008690 2944 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.2-a-260bc0236d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:47.440469 update_engine[1879]: I20251212 17:39:47.440403 1879 update_attempter.cc:509] Updating boot flags... 
Dec 12 17:39:47.687967 kubelet[2944]: I1212 17:39:47.687926 2944 apiserver.go:52] "Watching apiserver" Dec 12 17:39:47.695044 kubelet[2944]: I1212 17:39:47.694953 2944 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 12 17:39:47.889527 kubelet[2944]: I1212 17:39:47.889421 2944 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:47.901994 kubelet[2944]: W1212 17:39:47.901969 2944 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Dec 12 17:39:49.257146 kubelet[2944]: I1212 17:39:49.257108 2944 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:49.267262 kubelet[2944]: W1212 17:39:49.266692 2944 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Dec 12 17:39:49.342984 systemd[1]: Reload requested from client PID 3279 ('systemctl') (unit session-9.scope)... Dec 12 17:39:49.343002 systemd[1]: Reloading... Dec 12 17:39:49.419251 zram_generator::config[3342]: No configuration found. Dec 12 17:39:49.565513 systemd[1]: Reloading finished in 222 ms. Dec 12 17:39:49.597977 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:39:49.617037 systemd[1]: kubelet.service: Deactivated successfully. Dec 12 17:39:49.617287 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 17:39:49.617344 systemd[1]: kubelet.service: Consumed 564ms CPU time, 128.1M memory peak. Dec 12 17:39:49.618830 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 17:39:49.719356 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 12 17:39:49.722141 (kubelet)[3390]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 17:39:49.752256 kubelet[3390]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:39:49.752256 kubelet[3390]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 17:39:49.752256 kubelet[3390]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 17:39:49.752256 kubelet[3390]: I1212 17:39:49.751684 3390 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 17:39:49.756245 kubelet[3390]: I1212 17:39:49.755846 3390 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Dec 12 17:39:49.756245 kubelet[3390]: I1212 17:39:49.755870 3390 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 17:39:49.756245 kubelet[3390]: I1212 17:39:49.756040 3390 server.go:954] "Client rotation is on, will bootstrap in background" Dec 12 17:39:49.757120 kubelet[3390]: I1212 17:39:49.757097 3390 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 12 17:39:49.758801 kubelet[3390]: I1212 17:39:49.758769 3390 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 17:39:49.762631 kubelet[3390]: I1212 17:39:49.762603 3390 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 17:39:49.767463 kubelet[3390]: I1212 17:39:49.767435 3390 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Dec 12 17:39:49.767608 kubelet[3390]: I1212 17:39:49.767579 3390 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 17:39:49.767725 kubelet[3390]: I1212 17:39:49.767603 3390 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.2.2-a-260bc0236d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"
none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 17:39:49.767809 kubelet[3390]: I1212 17:39:49.767728 3390 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 17:39:49.767809 kubelet[3390]: I1212 17:39:49.767735 3390 container_manager_linux.go:304] "Creating device plugin manager" Dec 12 17:39:49.767809 kubelet[3390]: I1212 17:39:49.767768 3390 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:39:49.767890 kubelet[3390]: I1212 17:39:49.767875 3390 kubelet.go:446] "Attempting to sync node with API server" Dec 12 17:39:49.767890 kubelet[3390]: I1212 17:39:49.767888 3390 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 17:39:49.767933 kubelet[3390]: I1212 17:39:49.767905 3390 kubelet.go:352] "Adding apiserver pod source" Dec 12 17:39:49.767933 kubelet[3390]: I1212 17:39:49.767913 3390 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 17:39:49.772437 kubelet[3390]: I1212 17:39:49.772391 3390 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 17:39:49.773179 kubelet[3390]: I1212 17:39:49.773156 3390 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 12 17:39:49.773677 kubelet[3390]: I1212 17:39:49.773657 3390 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 12 17:39:49.773760 kubelet[3390]: I1212 17:39:49.773752 3390 server.go:1287] "Started kubelet" Dec 12 17:39:49.776376 kubelet[3390]: I1212 17:39:49.776355 3390 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 17:39:49.785275 kubelet[3390]: I1212 
17:39:49.784612 3390 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Dec 12 17:39:49.785275 kubelet[3390]: I1212 17:39:49.785184 3390 server.go:479] "Adding debug handlers to kubelet server" Dec 12 17:39:49.786075 kubelet[3390]: I1212 17:39:49.786033 3390 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 17:39:49.786358 kubelet[3390]: I1212 17:39:49.786344 3390 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 17:39:49.786649 kubelet[3390]: I1212 17:39:49.786634 3390 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 17:39:49.788009 kubelet[3390]: I1212 17:39:49.787993 3390 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 12 17:39:49.788337 kubelet[3390]: E1212 17:39:49.788320 3390 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459.2.2-a-260bc0236d\" not found" Dec 12 17:39:49.789807 kubelet[3390]: I1212 17:39:49.789793 3390 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 12 17:39:49.790025 kubelet[3390]: I1212 17:39:49.790015 3390 reconciler.go:26] "Reconciler: start to sync state" Dec 12 17:39:49.791953 kubelet[3390]: I1212 17:39:49.791919 3390 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 12 17:39:49.793271 kubelet[3390]: I1212 17:39:49.793253 3390 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 12 17:39:49.793385 kubelet[3390]: I1212 17:39:49.793377 3390 status_manager.go:227] "Starting to sync pod status with apiserver" Dec 12 17:39:49.793459 kubelet[3390]: I1212 17:39:49.793450 3390 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Dec 12 17:39:49.793510 kubelet[3390]: I1212 17:39:49.793502 3390 kubelet.go:2382] "Starting kubelet main sync loop" Dec 12 17:39:49.793624 kubelet[3390]: E1212 17:39:49.793589 3390 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 17:39:49.796260 kubelet[3390]: I1212 17:39:49.795724 3390 factory.go:221] Registration of the systemd container factory successfully Dec 12 17:39:49.796260 kubelet[3390]: I1212 17:39:49.795816 3390 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 17:39:49.801575 kubelet[3390]: I1212 17:39:49.801545 3390 factory.go:221] Registration of the containerd container factory successfully Dec 12 17:39:49.807424 kubelet[3390]: E1212 17:39:49.807399 3390 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 17:39:49.838259 kubelet[3390]: I1212 17:39:49.837965 3390 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 17:39:49.838395 kubelet[3390]: I1212 17:39:49.838379 3390 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 17:39:49.838613 kubelet[3390]: I1212 17:39:49.838559 3390 state_mem.go:36] "Initialized new in-memory state store" Dec 12 17:39:49.839081 kubelet[3390]: I1212 17:39:49.838856 3390 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 12 17:39:49.839332 kubelet[3390]: I1212 17:39:49.839233 3390 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 12 17:39:49.839530 kubelet[3390]: I1212 17:39:49.839518 3390 policy_none.go:49] "None policy: Start" Dec 12 17:39:49.839707 kubelet[3390]: I1212 17:39:49.839642 3390 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 12 17:39:49.839827 kubelet[3390]: I1212 17:39:49.839778 3390 state_mem.go:35] "Initializing new in-memory state store" Dec 12 17:39:49.840164 kubelet[3390]: I1212 17:39:49.840029 3390 state_mem.go:75] "Updated machine memory state" Dec 12 17:39:49.843667 kubelet[3390]: I1212 17:39:49.843650 3390 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 12 17:39:49.843947 kubelet[3390]: I1212 17:39:49.843920 3390 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 17:39:49.844096 kubelet[3390]: I1212 17:39:49.844019 3390 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 17:39:49.844364 kubelet[3390]: I1212 17:39:49.844352 3390 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 17:39:49.846913 kubelet[3390]: E1212 17:39:49.846897 3390 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Dec 12 17:39:49.895190 kubelet[3390]: I1212 17:39:49.895119 3390 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:49.895190 kubelet[3390]: I1212 17:39:49.895118 3390 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:49.895593 kubelet[3390]: I1212 17:39:49.895561 3390 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:49.904364 kubelet[3390]: W1212 17:39:49.904289 3390 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Dec 12 17:39:49.909000 kubelet[3390]: W1212 17:39:49.908975 3390 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Dec 12 17:39:49.909087 kubelet[3390]: E1212 17:39:49.909042 3390 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.2.2-a-260bc0236d\" already exists" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:49.909240 kubelet[3390]: W1212 17:39:49.909128 3390 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Dec 12 17:39:49.909240 kubelet[3390]: E1212 17:39:49.909167 3390 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.2.2-a-260bc0236d\" already exists" pod="kube-system/kube-scheduler-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:49.947082 kubelet[3390]: I1212 17:39:49.947049 3390 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:49.962849 kubelet[3390]: I1212 17:39:49.962769 3390 
kubelet_node_status.go:124] "Node was previously registered" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:49.964150 kubelet[3390]: I1212 17:39:49.962869 3390 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.091646 kubelet[3390]: I1212 17:39:50.091541 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f0b5f09531fd673cc66127691e1e08b8-k8s-certs\") pod \"kube-controller-manager-ci-4459.2.2-a-260bc0236d\" (UID: \"f0b5f09531fd673cc66127691e1e08b8\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.092133 kubelet[3390]: I1212 17:39:50.092077 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f0b5f09531fd673cc66127691e1e08b8-kubeconfig\") pod \"kube-controller-manager-ci-4459.2.2-a-260bc0236d\" (UID: \"f0b5f09531fd673cc66127691e1e08b8\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.092406 kubelet[3390]: I1212 17:39:50.092387 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b0e6831dab289b6ec222f243e5c8be1f-ca-certs\") pod \"kube-apiserver-ci-4459.2.2-a-260bc0236d\" (UID: \"b0e6831dab289b6ec222f243e5c8be1f\") " pod="kube-system/kube-apiserver-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.092572 kubelet[3390]: I1212 17:39:50.092518 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f7d54f61c7ce63e35b0b8d7b6da0e59d-kubeconfig\") pod \"kube-scheduler-ci-4459.2.2-a-260bc0236d\" (UID: \"f7d54f61c7ce63e35b0b8d7b6da0e59d\") " pod="kube-system/kube-scheduler-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.092572 kubelet[3390]: I1212 17:39:50.092536 3390 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b0e6831dab289b6ec222f243e5c8be1f-k8s-certs\") pod \"kube-apiserver-ci-4459.2.2-a-260bc0236d\" (UID: \"b0e6831dab289b6ec222f243e5c8be1f\") " pod="kube-system/kube-apiserver-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.092572 kubelet[3390]: I1212 17:39:50.092550 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b0e6831dab289b6ec222f243e5c8be1f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.2.2-a-260bc0236d\" (UID: \"b0e6831dab289b6ec222f243e5c8be1f\") " pod="kube-system/kube-apiserver-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.092734 kubelet[3390]: I1212 17:39:50.092680 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f0b5f09531fd673cc66127691e1e08b8-ca-certs\") pod \"kube-controller-manager-ci-4459.2.2-a-260bc0236d\" (UID: \"f0b5f09531fd673cc66127691e1e08b8\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.092734 kubelet[3390]: I1212 17:39:50.092701 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f0b5f09531fd673cc66127691e1e08b8-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.2.2-a-260bc0236d\" (UID: \"f0b5f09531fd673cc66127691e1e08b8\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.092734 kubelet[3390]: I1212 17:39:50.092713 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f0b5f09531fd673cc66127691e1e08b8-usr-share-ca-certificates\") pod 
\"kube-controller-manager-ci-4459.2.2-a-260bc0236d\" (UID: \"f0b5f09531fd673cc66127691e1e08b8\") " pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.770821 kubelet[3390]: I1212 17:39:50.770782 3390 apiserver.go:52] "Watching apiserver" Dec 12 17:39:50.790856 kubelet[3390]: I1212 17:39:50.790807 3390 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 12 17:39:50.823604 kubelet[3390]: I1212 17:39:50.823574 3390 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.824443 kubelet[3390]: I1212 17:39:50.824346 3390 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.852994 kubelet[3390]: I1212 17:39:50.852924 3390 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" podStartSLOduration=1.852776643 podStartE2EDuration="1.852776643s" podCreationTimestamp="2025-12-12 17:39:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:39:50.824958215 +0000 UTC m=+1.099606079" watchObservedRunningTime="2025-12-12 17:39:50.852776643 +0000 UTC m=+1.127424459" Dec 12 17:39:50.853620 kubelet[3390]: W1212 17:39:50.853574 3390 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Dec 12 17:39:50.853789 kubelet[3390]: E1212 17:39:50.853706 3390 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.2.2-a-260bc0236d\" already exists" pod="kube-system/kube-apiserver-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.854784 kubelet[3390]: W1212 17:39:50.854731 3390 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in 
surprising behavior; a DNS label is recommended: [must not contain dots] Dec 12 17:39:50.855046 kubelet[3390]: E1212 17:39:50.854763 3390 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.2.2-a-260bc0236d\" already exists" pod="kube-system/kube-controller-manager-ci-4459.2.2-a-260bc0236d" Dec 12 17:39:50.866496 kubelet[3390]: I1212 17:39:50.866373 3390 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459.2.2-a-260bc0236d" podStartSLOduration=1.8663621959999999 podStartE2EDuration="1.866362196s" podCreationTimestamp="2025-12-12 17:39:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:39:50.854424388 +0000 UTC m=+1.129072204" watchObservedRunningTime="2025-12-12 17:39:50.866362196 +0000 UTC m=+1.141010020" Dec 12 17:39:50.879104 kubelet[3390]: I1212 17:39:50.878731 3390 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459.2.2-a-260bc0236d" podStartSLOduration=3.878719941 podStartE2EDuration="3.878719941s" podCreationTimestamp="2025-12-12 17:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:39:50.867388517 +0000 UTC m=+1.142036333" watchObservedRunningTime="2025-12-12 17:39:50.878719941 +0000 UTC m=+1.153367757" Dec 12 17:39:55.230008 kubelet[3390]: I1212 17:39:55.229972 3390 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 12 17:39:55.230462 containerd[1898]: time="2025-12-12T17:39:55.230347096Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Dec 12 17:39:55.230677 kubelet[3390]: I1212 17:39:55.230532 3390 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 12 17:39:56.022340 systemd[1]: Created slice kubepods-besteffort-podb2cf8e95_b16f_4337_a18d_11fcf06f6cd7.slice - libcontainer container kubepods-besteffort-podb2cf8e95_b16f_4337_a18d_11fcf06f6cd7.slice. Dec 12 17:39:56.025753 kubelet[3390]: I1212 17:39:56.025726 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2cf8e95-b16f-4337-a18d-11fcf06f6cd7-lib-modules\") pod \"kube-proxy-v84h9\" (UID: \"b2cf8e95-b16f-4337-a18d-11fcf06f6cd7\") " pod="kube-system/kube-proxy-v84h9" Dec 12 17:39:56.025753 kubelet[3390]: I1212 17:39:56.025755 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qwmr\" (UniqueName: \"kubernetes.io/projected/b2cf8e95-b16f-4337-a18d-11fcf06f6cd7-kube-api-access-4qwmr\") pod \"kube-proxy-v84h9\" (UID: \"b2cf8e95-b16f-4337-a18d-11fcf06f6cd7\") " pod="kube-system/kube-proxy-v84h9" Dec 12 17:39:56.025753 kubelet[3390]: I1212 17:39:56.025772 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/b2cf8e95-b16f-4337-a18d-11fcf06f6cd7-kube-proxy\") pod \"kube-proxy-v84h9\" (UID: \"b2cf8e95-b16f-4337-a18d-11fcf06f6cd7\") " pod="kube-system/kube-proxy-v84h9" Dec 12 17:39:56.025753 kubelet[3390]: I1212 17:39:56.025806 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b2cf8e95-b16f-4337-a18d-11fcf06f6cd7-xtables-lock\") pod \"kube-proxy-v84h9\" (UID: \"b2cf8e95-b16f-4337-a18d-11fcf06f6cd7\") " pod="kube-system/kube-proxy-v84h9" Dec 12 17:39:56.332255 containerd[1898]: time="2025-12-12T17:39:56.331935236Z" level=info msg="RunPodSandbox 
for &PodSandboxMetadata{Name:kube-proxy-v84h9,Uid:b2cf8e95-b16f-4337-a18d-11fcf06f6cd7,Namespace:kube-system,Attempt:0,}" Dec 12 17:39:56.374268 containerd[1898]: time="2025-12-12T17:39:56.374207516Z" level=info msg="connecting to shim 741be5b44bde60238840d364be959965ba057560002638550e5e07c438ae4c18" address="unix:///run/containerd/s/d8790914da5725dda9cacf6b92e277b50e9cf342f7a38a4d6bc9717688a5fa7c" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:39:56.392349 systemd[1]: Started cri-containerd-741be5b44bde60238840d364be959965ba057560002638550e5e07c438ae4c18.scope - libcontainer container 741be5b44bde60238840d364be959965ba057560002638550e5e07c438ae4c18. Dec 12 17:39:56.423516 containerd[1898]: time="2025-12-12T17:39:56.423474321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-v84h9,Uid:b2cf8e95-b16f-4337-a18d-11fcf06f6cd7,Namespace:kube-system,Attempt:0,} returns sandbox id \"741be5b44bde60238840d364be959965ba057560002638550e5e07c438ae4c18\"" Dec 12 17:39:56.426371 containerd[1898]: time="2025-12-12T17:39:56.426343394Z" level=info msg="CreateContainer within sandbox \"741be5b44bde60238840d364be959965ba057560002638550e5e07c438ae4c18\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 12 17:39:56.449839 containerd[1898]: time="2025-12-12T17:39:56.448486694Z" level=info msg="Container 62ab09c10efbbc2437a9e941e926f8ff772871745e15e00cb882a47baf7ba4b3: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:39:56.452072 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2474536273.mount: Deactivated successfully. 
Dec 12 17:39:56.468277 containerd[1898]: time="2025-12-12T17:39:56.468240227Z" level=info msg="CreateContainer within sandbox \"741be5b44bde60238840d364be959965ba057560002638550e5e07c438ae4c18\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"62ab09c10efbbc2437a9e941e926f8ff772871745e15e00cb882a47baf7ba4b3\"" Dec 12 17:39:56.468807 containerd[1898]: time="2025-12-12T17:39:56.468780912Z" level=info msg="StartContainer for \"62ab09c10efbbc2437a9e941e926f8ff772871745e15e00cb882a47baf7ba4b3\"" Dec 12 17:39:56.471716 containerd[1898]: time="2025-12-12T17:39:56.471681227Z" level=info msg="connecting to shim 62ab09c10efbbc2437a9e941e926f8ff772871745e15e00cb882a47baf7ba4b3" address="unix:///run/containerd/s/d8790914da5725dda9cacf6b92e277b50e9cf342f7a38a4d6bc9717688a5fa7c" protocol=ttrpc version=3 Dec 12 17:39:56.487259 systemd[1]: Created slice kubepods-besteffort-pod270f0974_5328_4d44_9721_32fc1aca132b.slice - libcontainer container kubepods-besteffort-pod270f0974_5328_4d44_9721_32fc1aca132b.slice. Dec 12 17:39:56.499480 systemd[1]: Started cri-containerd-62ab09c10efbbc2437a9e941e926f8ff772871745e15e00cb882a47baf7ba4b3.scope - libcontainer container 62ab09c10efbbc2437a9e941e926f8ff772871745e15e00cb882a47baf7ba4b3. 
Dec 12 17:39:56.529093 kubelet[3390]: I1212 17:39:56.529061 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/270f0974-5328-4d44-9721-32fc1aca132b-var-lib-calico\") pod \"tigera-operator-7dcd859c48-sfc8c\" (UID: \"270f0974-5328-4d44-9721-32fc1aca132b\") " pod="tigera-operator/tigera-operator-7dcd859c48-sfc8c" Dec 12 17:39:56.529371 kubelet[3390]: I1212 17:39:56.529105 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz5lx\" (UniqueName: \"kubernetes.io/projected/270f0974-5328-4d44-9721-32fc1aca132b-kube-api-access-nz5lx\") pod \"tigera-operator-7dcd859c48-sfc8c\" (UID: \"270f0974-5328-4d44-9721-32fc1aca132b\") " pod="tigera-operator/tigera-operator-7dcd859c48-sfc8c" Dec 12 17:39:56.563821 containerd[1898]: time="2025-12-12T17:39:56.563775941Z" level=info msg="StartContainer for \"62ab09c10efbbc2437a9e941e926f8ff772871745e15e00cb882a47baf7ba4b3\" returns successfully" Dec 12 17:39:56.792498 containerd[1898]: time="2025-12-12T17:39:56.792430712Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-sfc8c,Uid:270f0974-5328-4d44-9721-32fc1aca132b,Namespace:tigera-operator,Attempt:0,}" Dec 12 17:39:56.829266 containerd[1898]: time="2025-12-12T17:39:56.828522579Z" level=info msg="connecting to shim 998faeb88c5d9c25ffa5d295b793d324b823912021321af064efe6a4fd3d2a11" address="unix:///run/containerd/s/0c3f226253ea880c008f7293e266bd5be710c63d7883cde881a0adfa538a76e8" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:39:56.851685 systemd[1]: Started cri-containerd-998faeb88c5d9c25ffa5d295b793d324b823912021321af064efe6a4fd3d2a11.scope - libcontainer container 998faeb88c5d9c25ffa5d295b793d324b823912021321af064efe6a4fd3d2a11. 
Dec 12 17:39:56.866302 kubelet[3390]: I1212 17:39:56.866189 3390 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-v84h9" podStartSLOduration=0.866171028 podStartE2EDuration="866.171028ms" podCreationTimestamp="2025-12-12 17:39:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:39:56.84948448 +0000 UTC m=+7.124132360" watchObservedRunningTime="2025-12-12 17:39:56.866171028 +0000 UTC m=+7.140818844" Dec 12 17:39:56.886448 containerd[1898]: time="2025-12-12T17:39:56.886411180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-sfc8c,Uid:270f0974-5328-4d44-9721-32fc1aca132b,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"998faeb88c5d9c25ffa5d295b793d324b823912021321af064efe6a4fd3d2a11\"" Dec 12 17:39:56.888429 containerd[1898]: time="2025-12-12T17:39:56.888398931Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 12 17:39:58.319573 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3053417769.mount: Deactivated successfully. 
Dec 12 17:39:59.469061 containerd[1898]: time="2025-12-12T17:39:59.468554023Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:39:59.471359 containerd[1898]: time="2025-12-12T17:39:59.471328898Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=22152004" Dec 12 17:39:59.474752 containerd[1898]: time="2025-12-12T17:39:59.474728675Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:39:59.482010 containerd[1898]: time="2025-12-12T17:39:59.481961036Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:39:59.482492 containerd[1898]: time="2025-12-12T17:39:59.482316593Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.593890253s" Dec 12 17:39:59.482492 containerd[1898]: time="2025-12-12T17:39:59.482346666Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Dec 12 17:39:59.484816 containerd[1898]: time="2025-12-12T17:39:59.484790705Z" level=info msg="CreateContainer within sandbox \"998faeb88c5d9c25ffa5d295b793d324b823912021321af064efe6a4fd3d2a11\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 12 17:39:59.505766 containerd[1898]: time="2025-12-12T17:39:59.505340133Z" level=info msg="Container 
ffd171318f358a04644a319cd34faff5c0792a98303ada241b03414fec67bcde: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:39:59.518468 containerd[1898]: time="2025-12-12T17:39:59.518433543Z" level=info msg="CreateContainer within sandbox \"998faeb88c5d9c25ffa5d295b793d324b823912021321af064efe6a4fd3d2a11\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"ffd171318f358a04644a319cd34faff5c0792a98303ada241b03414fec67bcde\"" Dec 12 17:39:59.519270 containerd[1898]: time="2025-12-12T17:39:59.519251508Z" level=info msg="StartContainer for \"ffd171318f358a04644a319cd34faff5c0792a98303ada241b03414fec67bcde\"" Dec 12 17:39:59.520925 containerd[1898]: time="2025-12-12T17:39:59.520731448Z" level=info msg="connecting to shim ffd171318f358a04644a319cd34faff5c0792a98303ada241b03414fec67bcde" address="unix:///run/containerd/s/0c3f226253ea880c008f7293e266bd5be710c63d7883cde881a0adfa538a76e8" protocol=ttrpc version=3 Dec 12 17:39:59.544340 systemd[1]: Started cri-containerd-ffd171318f358a04644a319cd34faff5c0792a98303ada241b03414fec67bcde.scope - libcontainer container ffd171318f358a04644a319cd34faff5c0792a98303ada241b03414fec67bcde. 
Dec 12 17:39:59.570782 containerd[1898]: time="2025-12-12T17:39:59.570748517Z" level=info msg="StartContainer for \"ffd171318f358a04644a319cd34faff5c0792a98303ada241b03414fec67bcde\" returns successfully" Dec 12 17:39:59.855959 kubelet[3390]: I1212 17:39:59.855892 3390 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-sfc8c" podStartSLOduration=1.26022665 podStartE2EDuration="3.855874306s" podCreationTimestamp="2025-12-12 17:39:56 +0000 UTC" firstStartedPulling="2025-12-12 17:39:56.887498919 +0000 UTC m=+7.162146735" lastFinishedPulling="2025-12-12 17:39:59.483146575 +0000 UTC m=+9.757794391" observedRunningTime="2025-12-12 17:39:59.855541294 +0000 UTC m=+10.130189214" watchObservedRunningTime="2025-12-12 17:39:59.855874306 +0000 UTC m=+10.130522122" Dec 12 17:40:04.621845 sudo[2378]: pam_unix(sudo:session): session closed for user root Dec 12 17:40:04.696866 sshd[2377]: Connection closed by 10.200.16.10 port 49836 Dec 12 17:40:04.697605 sshd-session[2374]: pam_unix(sshd:session): session closed for user core Dec 12 17:40:04.703096 systemd[1]: sshd@6-10.200.20.10:22-10.200.16.10:49836.service: Deactivated successfully. Dec 12 17:40:04.707555 systemd[1]: session-9.scope: Deactivated successfully. Dec 12 17:40:04.709227 systemd[1]: session-9.scope: Consumed 2.649s CPU time, 219.7M memory peak. Dec 12 17:40:04.713851 systemd-logind[1875]: Session 9 logged out. Waiting for processes to exit. Dec 12 17:40:04.716892 systemd-logind[1875]: Removed session 9. Dec 12 17:40:11.292979 systemd[1]: Created slice kubepods-besteffort-podd941e87e_d44c_4761_a1c0_2721aafc58b3.slice - libcontainer container kubepods-besteffort-podd941e87e_d44c_4761_a1c0_2721aafc58b3.slice. 
Dec 12 17:40:11.315299 kubelet[3390]: I1212 17:40:11.315204 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d941e87e-d44c-4761-a1c0-2721aafc58b3-typha-certs\") pod \"calico-typha-78b4df6dbf-87skb\" (UID: \"d941e87e-d44c-4761-a1c0-2721aafc58b3\") " pod="calico-system/calico-typha-78b4df6dbf-87skb" Dec 12 17:40:11.315299 kubelet[3390]: I1212 17:40:11.315258 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d941e87e-d44c-4761-a1c0-2721aafc58b3-tigera-ca-bundle\") pod \"calico-typha-78b4df6dbf-87skb\" (UID: \"d941e87e-d44c-4761-a1c0-2721aafc58b3\") " pod="calico-system/calico-typha-78b4df6dbf-87skb" Dec 12 17:40:11.315299 kubelet[3390]: I1212 17:40:11.315277 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vg82\" (UniqueName: \"kubernetes.io/projected/d941e87e-d44c-4761-a1c0-2721aafc58b3-kube-api-access-8vg82\") pod \"calico-typha-78b4df6dbf-87skb\" (UID: \"d941e87e-d44c-4761-a1c0-2721aafc58b3\") " pod="calico-system/calico-typha-78b4df6dbf-87skb" Dec 12 17:40:11.473505 systemd[1]: Created slice kubepods-besteffort-poda2cde55e_efae_46b1_b518_57466ec89f65.slice - libcontainer container kubepods-besteffort-poda2cde55e_efae_46b1_b518_57466ec89f65.slice. 
Dec 12 17:40:11.516064 kubelet[3390]: I1212 17:40:11.515980 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a2cde55e-efae-46b1-b518-57466ec89f65-lib-modules\") pod \"calico-node-5v4v9\" (UID: \"a2cde55e-efae-46b1-b518-57466ec89f65\") " pod="calico-system/calico-node-5v4v9" Dec 12 17:40:11.516064 kubelet[3390]: I1212 17:40:11.516018 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a2cde55e-efae-46b1-b518-57466ec89f65-node-certs\") pod \"calico-node-5v4v9\" (UID: \"a2cde55e-efae-46b1-b518-57466ec89f65\") " pod="calico-system/calico-node-5v4v9" Dec 12 17:40:11.516064 kubelet[3390]: I1212 17:40:11.516041 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a2cde55e-efae-46b1-b518-57466ec89f65-cni-log-dir\") pod \"calico-node-5v4v9\" (UID: \"a2cde55e-efae-46b1-b518-57466ec89f65\") " pod="calico-system/calico-node-5v4v9" Dec 12 17:40:11.516064 kubelet[3390]: I1212 17:40:11.516079 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a2cde55e-efae-46b1-b518-57466ec89f65-cni-bin-dir\") pod \"calico-node-5v4v9\" (UID: \"a2cde55e-efae-46b1-b518-57466ec89f65\") " pod="calico-system/calico-node-5v4v9" Dec 12 17:40:11.516408 kubelet[3390]: I1212 17:40:11.516088 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a2cde55e-efae-46b1-b518-57466ec89f65-cni-net-dir\") pod \"calico-node-5v4v9\" (UID: \"a2cde55e-efae-46b1-b518-57466ec89f65\") " pod="calico-system/calico-node-5v4v9" Dec 12 17:40:11.516408 kubelet[3390]: I1212 17:40:11.516099 3390 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2cde55e-efae-46b1-b518-57466ec89f65-tigera-ca-bundle\") pod \"calico-node-5v4v9\" (UID: \"a2cde55e-efae-46b1-b518-57466ec89f65\") " pod="calico-system/calico-node-5v4v9" Dec 12 17:40:11.516408 kubelet[3390]: I1212 17:40:11.516108 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kmf2\" (UniqueName: \"kubernetes.io/projected/a2cde55e-efae-46b1-b518-57466ec89f65-kube-api-access-6kmf2\") pod \"calico-node-5v4v9\" (UID: \"a2cde55e-efae-46b1-b518-57466ec89f65\") " pod="calico-system/calico-node-5v4v9" Dec 12 17:40:11.516408 kubelet[3390]: I1212 17:40:11.516147 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a2cde55e-efae-46b1-b518-57466ec89f65-policysync\") pod \"calico-node-5v4v9\" (UID: \"a2cde55e-efae-46b1-b518-57466ec89f65\") " pod="calico-system/calico-node-5v4v9" Dec 12 17:40:11.516681 kubelet[3390]: I1212 17:40:11.516188 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a2cde55e-efae-46b1-b518-57466ec89f65-var-lib-calico\") pod \"calico-node-5v4v9\" (UID: \"a2cde55e-efae-46b1-b518-57466ec89f65\") " pod="calico-system/calico-node-5v4v9" Dec 12 17:40:11.516681 kubelet[3390]: I1212 17:40:11.516570 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a2cde55e-efae-46b1-b518-57466ec89f65-flexvol-driver-host\") pod \"calico-node-5v4v9\" (UID: \"a2cde55e-efae-46b1-b518-57466ec89f65\") " pod="calico-system/calico-node-5v4v9" Dec 12 17:40:11.516681 kubelet[3390]: I1212 17:40:11.516602 3390 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a2cde55e-efae-46b1-b518-57466ec89f65-var-run-calico\") pod \"calico-node-5v4v9\" (UID: \"a2cde55e-efae-46b1-b518-57466ec89f65\") " pod="calico-system/calico-node-5v4v9" Dec 12 17:40:11.516681 kubelet[3390]: I1212 17:40:11.516616 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a2cde55e-efae-46b1-b518-57466ec89f65-xtables-lock\") pod \"calico-node-5v4v9\" (UID: \"a2cde55e-efae-46b1-b518-57466ec89f65\") " pod="calico-system/calico-node-5v4v9" Dec 12 17:40:11.600233 containerd[1898]: time="2025-12-12T17:40:11.599848172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-78b4df6dbf-87skb,Uid:d941e87e-d44c-4761-a1c0-2721aafc58b3,Namespace:calico-system,Attempt:0,}" Dec 12 17:40:11.618524 kubelet[3390]: E1212 17:40:11.618473 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.618524 kubelet[3390]: W1212 17:40:11.618504 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.618743 kubelet[3390]: E1212 17:40:11.618580 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.618934 kubelet[3390]: E1212 17:40:11.618863 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.618934 kubelet[3390]: W1212 17:40:11.618881 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.618934 kubelet[3390]: E1212 17:40:11.618894 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.619047 kubelet[3390]: E1212 17:40:11.619033 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.619047 kubelet[3390]: W1212 17:40:11.619045 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.619189 kubelet[3390]: E1212 17:40:11.619053 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.619189 kubelet[3390]: E1212 17:40:11.619141 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.619189 kubelet[3390]: W1212 17:40:11.619146 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.619266 kubelet[3390]: E1212 17:40:11.619154 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.619480 kubelet[3390]: E1212 17:40:11.619454 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.619480 kubelet[3390]: W1212 17:40:11.619466 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.619663 kubelet[3390]: E1212 17:40:11.619651 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.620527 kubelet[3390]: E1212 17:40:11.620473 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.620969 kubelet[3390]: W1212 17:40:11.620861 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.622121 kubelet[3390]: E1212 17:40:11.622079 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.622121 kubelet[3390]: W1212 17:40:11.622095 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.622448 kubelet[3390]: E1212 17:40:11.622390 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.623002 kubelet[3390]: E1212 17:40:11.622966 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.623310 kubelet[3390]: W1212 17:40:11.623162 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.624722 kubelet[3390]: E1212 17:40:11.623081 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.624722 kubelet[3390]: E1212 17:40:11.623366 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.624722 kubelet[3390]: E1212 17:40:11.623693 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.624722 kubelet[3390]: W1212 17:40:11.623704 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.624722 kubelet[3390]: E1212 17:40:11.623886 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.624722 kubelet[3390]: E1212 17:40:11.624367 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.624722 kubelet[3390]: W1212 17:40:11.624377 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.625180 kubelet[3390]: E1212 17:40:11.624866 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.625180 kubelet[3390]: W1212 17:40:11.624881 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.625180 kubelet[3390]: E1212 17:40:11.624986 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: 
unexpected end of JSON input Dec 12 17:40:11.625180 kubelet[3390]: W1212 17:40:11.624991 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.625180 kubelet[3390]: E1212 17:40:11.625030 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.625180 kubelet[3390]: E1212 17:40:11.625053 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.625180 kubelet[3390]: E1212 17:40:11.625063 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.625180 kubelet[3390]: W1212 17:40:11.625081 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.625180 kubelet[3390]: E1212 17:40:11.625151 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.625180 kubelet[3390]: E1212 17:40:11.625064 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.625775 kubelet[3390]: E1212 17:40:11.625161 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.625775 kubelet[3390]: W1212 17:40:11.625155 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.625775 kubelet[3390]: E1212 17:40:11.625286 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.625775 kubelet[3390]: W1212 17:40:11.625292 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.625775 kubelet[3390]: E1212 17:40:11.625298 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.625775 kubelet[3390]: E1212 17:40:11.625315 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.625775 kubelet[3390]: E1212 17:40:11.625405 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.625775 kubelet[3390]: W1212 17:40:11.625411 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.625775 kubelet[3390]: E1212 17:40:11.625467 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.625775 kubelet[3390]: E1212 17:40:11.625492 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.625928 kubelet[3390]: W1212 17:40:11.625497 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.625928 kubelet[3390]: E1212 17:40:11.625560 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.625928 kubelet[3390]: W1212 17:40:11.625564 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.625928 kubelet[3390]: E1212 17:40:11.625571 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.625928 kubelet[3390]: E1212 17:40:11.625636 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.625928 kubelet[3390]: W1212 17:40:11.625639 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.625928 kubelet[3390]: E1212 17:40:11.625644 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.625928 kubelet[3390]: E1212 17:40:11.625742 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.625928 kubelet[3390]: W1212 17:40:11.625746 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.625928 kubelet[3390]: E1212 17:40:11.625752 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.626063 kubelet[3390]: E1212 17:40:11.625829 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.626063 kubelet[3390]: W1212 17:40:11.625833 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.626063 kubelet[3390]: E1212 17:40:11.625838 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.626063 kubelet[3390]: E1212 17:40:11.625921 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.626063 kubelet[3390]: W1212 17:40:11.625926 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.626063 kubelet[3390]: E1212 17:40:11.625931 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.626063 kubelet[3390]: E1212 17:40:11.626022 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.626063 kubelet[3390]: W1212 17:40:11.626026 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.626063 kubelet[3390]: E1212 17:40:11.626031 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.626773 kubelet[3390]: E1212 17:40:11.626205 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.627009 kubelet[3390]: E1212 17:40:11.626993 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.627009 kubelet[3390]: W1212 17:40:11.627005 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.627394 kubelet[3390]: E1212 17:40:11.627020 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.627976 kubelet[3390]: E1212 17:40:11.627803 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.628191 kubelet[3390]: W1212 17:40:11.628172 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.628376 kubelet[3390]: E1212 17:40:11.628362 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.629035 kubelet[3390]: E1212 17:40:11.628941 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.629176 kubelet[3390]: W1212 17:40:11.629111 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.629628 kubelet[3390]: E1212 17:40:11.629491 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.630072 kubelet[3390]: E1212 17:40:11.630015 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.630553 kubelet[3390]: W1212 17:40:11.630418 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.632999 kubelet[3390]: E1212 17:40:11.630608 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.632999 kubelet[3390]: E1212 17:40:11.631031 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.632999 kubelet[3390]: W1212 17:40:11.631042 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.632999 kubelet[3390]: E1212 17:40:11.631072 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.632999 kubelet[3390]: E1212 17:40:11.631257 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.632999 kubelet[3390]: W1212 17:40:11.631267 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.632999 kubelet[3390]: E1212 17:40:11.631443 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.632999 kubelet[3390]: E1212 17:40:11.632304 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.632999 kubelet[3390]: W1212 17:40:11.632317 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.632999 kubelet[3390]: E1212 17:40:11.632344 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.633167 kubelet[3390]: E1212 17:40:11.632523 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.633167 kubelet[3390]: W1212 17:40:11.632532 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.633167 kubelet[3390]: E1212 17:40:11.632556 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.633167 kubelet[3390]: E1212 17:40:11.632697 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.633167 kubelet[3390]: W1212 17:40:11.632706 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.633167 kubelet[3390]: E1212 17:40:11.632815 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.633167 kubelet[3390]: E1212 17:40:11.632866 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.633167 kubelet[3390]: W1212 17:40:11.632877 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.633971 kubelet[3390]: E1212 17:40:11.633551 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.633971 kubelet[3390]: W1212 17:40:11.633562 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.633971 kubelet[3390]: E1212 17:40:11.633664 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.634391 kubelet[3390]: E1212 17:40:11.634292 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.634916 kubelet[3390]: E1212 17:40:11.634857 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.634916 kubelet[3390]: W1212 17:40:11.634870 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.635237 kubelet[3390]: E1212 17:40:11.635142 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.635681 kubelet[3390]: E1212 17:40:11.635668 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.636060 kubelet[3390]: W1212 17:40:11.635994 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.636113 kubelet[3390]: E1212 17:40:11.636102 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.637480 kubelet[3390]: E1212 17:40:11.637466 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.638940 kubelet[3390]: W1212 17:40:11.638293 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.638940 kubelet[3390]: E1212 17:40:11.638319 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.640816 kubelet[3390]: E1212 17:40:11.640588 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.640816 kubelet[3390]: W1212 17:40:11.640602 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.640816 kubelet[3390]: E1212 17:40:11.640617 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.644053 kubelet[3390]: E1212 17:40:11.644003 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.644053 kubelet[3390]: W1212 17:40:11.644017 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.644053 kubelet[3390]: E1212 17:40:11.644029 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.652662 containerd[1898]: time="2025-12-12T17:40:11.651815487Z" level=info msg="connecting to shim dbba162dd19cfe64d6ee114a3010c6753801472599478493acf2fda8c63196b6" address="unix:///run/containerd/s/86b01956cd3e04a73b0a933db6da2565a16b8d628610c86c239f79a0c10ef199" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:11.664021 kubelet[3390]: E1212 17:40:11.663995 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.664021 kubelet[3390]: W1212 17:40:11.664014 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.664117 kubelet[3390]: E1212 17:40:11.664031 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.670387 kubelet[3390]: E1212 17:40:11.669434 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 12 17:40:11.695379 systemd[1]: Started cri-containerd-dbba162dd19cfe64d6ee114a3010c6753801472599478493acf2fda8c63196b6.scope - libcontainer container dbba162dd19cfe64d6ee114a3010c6753801472599478493acf2fda8c63196b6. Dec 12 17:40:11.699993 kubelet[3390]: E1212 17:40:11.699971 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.700117 kubelet[3390]: W1212 17:40:11.700103 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.700195 kubelet[3390]: E1212 17:40:11.700182 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.700576 kubelet[3390]: E1212 17:40:11.700560 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.700916 kubelet[3390]: W1212 17:40:11.700646 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.700916 kubelet[3390]: E1212 17:40:11.700692 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.701909 kubelet[3390]: E1212 17:40:11.701822 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.702077 kubelet[3390]: W1212 17:40:11.702053 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.703435 kubelet[3390]: E1212 17:40:11.703333 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.719023 kubelet[3390]: I1212 17:40:11.718951 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8t2j\" (UniqueName: \"kubernetes.io/projected/c62a98c7-a503-42c2-845c-ea1022fbba96-kube-api-access-d8t2j\") pod \"csi-node-driver-ccgmd\" (UID: \"c62a98c7-a503-42c2-845c-ea1022fbba96\") " pod="calico-system/csi-node-driver-ccgmd" Dec 12 17:40:11.719396 kubelet[3390]: I1212 17:40:11.719314 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/c62a98c7-a503-42c2-845c-ea1022fbba96-varrun\") pod \"csi-node-driver-ccgmd\" (UID: \"c62a98c7-a503-42c2-845c-ea1022fbba96\") " pod="calico-system/csi-node-driver-ccgmd" Dec 12 17:40:11.719759 kubelet[3390]: I1212 17:40:11.719749 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c62a98c7-a503-42c2-845c-ea1022fbba96-kubelet-dir\") pod \"csi-node-driver-ccgmd\" (UID: \"c62a98c7-a503-42c2-845c-ea1022fbba96\") " pod="calico-system/csi-node-driver-ccgmd" Dec 12 17:40:11.722359 kubelet[3390]: I1212 17:40:11.722291 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c62a98c7-a503-42c2-845c-ea1022fbba96-registration-dir\") pod \"csi-node-driver-ccgmd\" (UID: \"c62a98c7-a503-42c2-845c-ea1022fbba96\") " pod="calico-system/csi-node-driver-ccgmd" Dec 12 17:40:11.724009 kubelet[3390]: I1212 17:40:11.722687 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c62a98c7-a503-42c2-845c-ea1022fbba96-socket-dir\") pod \"csi-node-driver-ccgmd\" (UID: \"c62a98c7-a503-42c2-845c-ea1022fbba96\") " pod="calico-system/csi-node-driver-ccgmd" Dec 12 17:40:11.769655 containerd[1898]: time="2025-12-12T17:40:11.769599999Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-78b4df6dbf-87skb,Uid:d941e87e-d44c-4761-a1c0-2721aafc58b3,Namespace:calico-system,Attempt:0,} returns sandbox id \"dbba162dd19cfe64d6ee114a3010c6753801472599478493acf2fda8c63196b6\"" Dec 12 17:40:11.772210 containerd[1898]: time="2025-12-12T17:40:11.772181507Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 12 17:40:11.777500 containerd[1898]: time="2025-12-12T17:40:11.777466557Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5v4v9,Uid:a2cde55e-efae-46b1-b518-57466ec89f65,Namespace:calico-system,Attempt:0,}" Dec 12 17:40:11.821389 containerd[1898]: time="2025-12-12T17:40:11.821352923Z" level=info msg="connecting to shim 32638453933c3bd1e80ef5aa1252ecaae34c5a0949dda0e8f0a1898f780f6ef9" address="unix:///run/containerd/s/b09b4e47554929bd945bb552ada442e7e69a9e38e5fa9b65c988b66d115429f1" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:11.847054 kubelet[3390]: E1212 17:40:11.846338 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON 
input Dec 12 17:40:11.847054 kubelet[3390]: W1212 17:40:11.846353 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.847054 kubelet[3390]: E1212 17:40:11.846386 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.847054 kubelet[3390]: E1212 17:40:11.846411 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.847054 kubelet[3390]: E1212 17:40:11.846578 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.847054 kubelet[3390]: E1212 17:40:11.846623 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.847054 kubelet[3390]: E1212 17:40:11.846645 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.847054 kubelet[3390]: W1212 17:40:11.846652 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.847054 kubelet[3390]: E1212 17:40:11.846661 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.847054 kubelet[3390]: E1212 17:40:11.846771 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.847267 kubelet[3390]: W1212 17:40:11.846776 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.847267 kubelet[3390]: E1212 17:40:11.846784 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.847267 kubelet[3390]: E1212 17:40:11.846876 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.847267 kubelet[3390]: W1212 17:40:11.846880 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.847267 kubelet[3390]: E1212 17:40:11.846885 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.847267 kubelet[3390]: E1212 17:40:11.847055 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.847267 kubelet[3390]: E1212 17:40:11.847209 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.847267 kubelet[3390]: W1212 17:40:11.847226 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.847267 kubelet[3390]: E1212 17:40:11.847236 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:11.852372 systemd[1]: Started cri-containerd-32638453933c3bd1e80ef5aa1252ecaae34c5a0949dda0e8f0a1898f780f6ef9.scope - libcontainer container 32638453933c3bd1e80ef5aa1252ecaae34c5a0949dda0e8f0a1898f780f6ef9. Dec 12 17:40:11.867046 kubelet[3390]: E1212 17:40:11.867022 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:11.867046 kubelet[3390]: W1212 17:40:11.867041 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:11.867354 kubelet[3390]: E1212 17:40:11.867057 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:11.904405 containerd[1898]: time="2025-12-12T17:40:11.904371239Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5v4v9,Uid:a2cde55e-efae-46b1-b518-57466ec89f65,Namespace:calico-system,Attempt:0,} returns sandbox id \"32638453933c3bd1e80ef5aa1252ecaae34c5a0949dda0e8f0a1898f780f6ef9\"" Dec 12 17:40:13.186362 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3270524131.mount: Deactivated successfully. Dec 12 17:40:13.794526 kubelet[3390]: E1212 17:40:13.794473 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 12 17:40:13.946301 containerd[1898]: time="2025-12-12T17:40:13.946236690Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:13.949122 containerd[1898]: time="2025-12-12T17:40:13.948998908Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33090687" Dec 12 17:40:13.951990 containerd[1898]: time="2025-12-12T17:40:13.951951861Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:13.956262 containerd[1898]: time="2025-12-12T17:40:13.956227349Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:13.956640 containerd[1898]: time="2025-12-12T17:40:13.956614259Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id 
\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.184403103s" Dec 12 17:40:13.956732 containerd[1898]: time="2025-12-12T17:40:13.956719678Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Dec 12 17:40:13.957972 containerd[1898]: time="2025-12-12T17:40:13.957943122Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 12 17:40:13.967005 containerd[1898]: time="2025-12-12T17:40:13.966980323Z" level=info msg="CreateContainer within sandbox \"dbba162dd19cfe64d6ee114a3010c6753801472599478493acf2fda8c63196b6\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 12 17:40:13.987793 containerd[1898]: time="2025-12-12T17:40:13.987652546Z" level=info msg="Container 655dc58c4731eb2e6bc8d72b74509f7e57b34b0ed0b49fa54e3741f85c188d51: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:40:14.006686 containerd[1898]: time="2025-12-12T17:40:14.006650541Z" level=info msg="CreateContainer within sandbox \"dbba162dd19cfe64d6ee114a3010c6753801472599478493acf2fda8c63196b6\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"655dc58c4731eb2e6bc8d72b74509f7e57b34b0ed0b49fa54e3741f85c188d51\"" Dec 12 17:40:14.007160 containerd[1898]: time="2025-12-12T17:40:14.007147319Z" level=info msg="StartContainer for \"655dc58c4731eb2e6bc8d72b74509f7e57b34b0ed0b49fa54e3741f85c188d51\"" Dec 12 17:40:14.009419 containerd[1898]: time="2025-12-12T17:40:14.009379990Z" level=info msg="connecting to shim 655dc58c4731eb2e6bc8d72b74509f7e57b34b0ed0b49fa54e3741f85c188d51" address="unix:///run/containerd/s/86b01956cd3e04a73b0a933db6da2565a16b8d628610c86c239f79a0c10ef199" protocol=ttrpc version=3 Dec 12 
17:40:14.028368 systemd[1]: Started cri-containerd-655dc58c4731eb2e6bc8d72b74509f7e57b34b0ed0b49fa54e3741f85c188d51.scope - libcontainer container 655dc58c4731eb2e6bc8d72b74509f7e57b34b0ed0b49fa54e3741f85c188d51. Dec 12 17:40:14.063905 containerd[1898]: time="2025-12-12T17:40:14.063542115Z" level=info msg="StartContainer for \"655dc58c4731eb2e6bc8d72b74509f7e57b34b0ed0b49fa54e3741f85c188d51\" returns successfully" Dec 12 17:40:14.906861 kubelet[3390]: I1212 17:40:14.906434 3390 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-78b4df6dbf-87skb" podStartSLOduration=1.719589349 podStartE2EDuration="3.906237523s" podCreationTimestamp="2025-12-12 17:40:11 +0000 UTC" firstStartedPulling="2025-12-12 17:40:11.771023114 +0000 UTC m=+22.045670930" lastFinishedPulling="2025-12-12 17:40:13.957671184 +0000 UTC m=+24.232319104" observedRunningTime="2025-12-12 17:40:14.891518368 +0000 UTC m=+25.166166184" watchObservedRunningTime="2025-12-12 17:40:14.906237523 +0000 UTC m=+25.180885339" Dec 12 17:40:14.933892 kubelet[3390]: E1212 17:40:14.933565 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.933892 kubelet[3390]: W1212 17:40:14.933587 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.933892 kubelet[3390]: E1212 17:40:14.933658 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:14.934328 kubelet[3390]: E1212 17:40:14.934299 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.934379 kubelet[3390]: W1212 17:40:14.934311 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.934379 kubelet[3390]: E1212 17:40:14.934367 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:14.934545 kubelet[3390]: E1212 17:40:14.934520 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.934545 kubelet[3390]: W1212 17:40:14.934531 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.934545 kubelet[3390]: E1212 17:40:14.934539 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:14.934751 kubelet[3390]: E1212 17:40:14.934705 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.934955 kubelet[3390]: W1212 17:40:14.934720 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.934955 kubelet[3390]: E1212 17:40:14.934934 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:14.935393 kubelet[3390]: E1212 17:40:14.935370 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.935530 kubelet[3390]: W1212 17:40:14.935404 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.935574 kubelet[3390]: E1212 17:40:14.935531 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:14.935706 kubelet[3390]: E1212 17:40:14.935692 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.935706 kubelet[3390]: W1212 17:40:14.935702 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.935824 kubelet[3390]: E1212 17:40:14.935711 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:14.936093 kubelet[3390]: E1212 17:40:14.936077 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.936093 kubelet[3390]: W1212 17:40:14.936088 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.936163 kubelet[3390]: E1212 17:40:14.936099 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:14.936483 kubelet[3390]: E1212 17:40:14.936466 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.936483 kubelet[3390]: W1212 17:40:14.936480 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.936711 kubelet[3390]: E1212 17:40:14.936490 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:14.936921 kubelet[3390]: E1212 17:40:14.936872 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.936921 kubelet[3390]: W1212 17:40:14.936883 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.936921 kubelet[3390]: E1212 17:40:14.936894 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:14.937326 kubelet[3390]: E1212 17:40:14.937302 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.937326 kubelet[3390]: W1212 17:40:14.937321 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.937403 kubelet[3390]: E1212 17:40:14.937332 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:14.937710 kubelet[3390]: E1212 17:40:14.937696 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.937710 kubelet[3390]: W1212 17:40:14.937709 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.937775 kubelet[3390]: E1212 17:40:14.937719 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:14.937945 kubelet[3390]: E1212 17:40:14.937933 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.937945 kubelet[3390]: W1212 17:40:14.937945 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.938023 kubelet[3390]: E1212 17:40:14.937954 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:14.938267 kubelet[3390]: E1212 17:40:14.938250 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.938267 kubelet[3390]: W1212 17:40:14.938261 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.938323 kubelet[3390]: E1212 17:40:14.938270 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:14.938497 kubelet[3390]: E1212 17:40:14.938484 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.938497 kubelet[3390]: W1212 17:40:14.938494 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.938714 kubelet[3390]: E1212 17:40:14.938504 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:14.938875 kubelet[3390]: E1212 17:40:14.938857 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.938875 kubelet[3390]: W1212 17:40:14.938873 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.938931 kubelet[3390]: E1212 17:40:14.938884 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:14.947327 kubelet[3390]: E1212 17:40:14.947301 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.947444 kubelet[3390]: W1212 17:40:14.947319 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.947479 kubelet[3390]: E1212 17:40:14.947447 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:14.947789 kubelet[3390]: E1212 17:40:14.947652 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.947789 kubelet[3390]: W1212 17:40:14.947663 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.947789 kubelet[3390]: E1212 17:40:14.947674 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:14.948142 kubelet[3390]: E1212 17:40:14.947909 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.948360 kubelet[3390]: W1212 17:40:14.948234 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.948360 kubelet[3390]: E1212 17:40:14.948267 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:14.948486 kubelet[3390]: E1212 17:40:14.948474 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.948944 kubelet[3390]: W1212 17:40:14.948829 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.948944 kubelet[3390]: E1212 17:40:14.948857 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:14.949066 kubelet[3390]: E1212 17:40:14.949055 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.949146 kubelet[3390]: W1212 17:40:14.949129 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.949279 kubelet[3390]: E1212 17:40:14.949259 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:14.949638 kubelet[3390]: E1212 17:40:14.949614 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.949744 kubelet[3390]: W1212 17:40:14.949721 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.949808 kubelet[3390]: E1212 17:40:14.949795 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:14.949994 kubelet[3390]: E1212 17:40:14.949981 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.950144 kubelet[3390]: W1212 17:40:14.950068 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.950144 kubelet[3390]: E1212 17:40:14.950099 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:14.950400 kubelet[3390]: E1212 17:40:14.950362 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.950400 kubelet[3390]: W1212 17:40:14.950376 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.950776 kubelet[3390]: E1212 17:40:14.950493 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:14.950776 kubelet[3390]: E1212 17:40:14.950553 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.950776 kubelet[3390]: W1212 17:40:14.950563 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.950776 kubelet[3390]: E1212 17:40:14.950572 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:14.950995 kubelet[3390]: E1212 17:40:14.950980 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.950995 kubelet[3390]: W1212 17:40:14.950989 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.951045 kubelet[3390]: E1212 17:40:14.951003 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 17:40:14.951114 kubelet[3390]: E1212 17:40:14.951102 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.951114 kubelet[3390]: W1212 17:40:14.951110 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.951162 kubelet[3390]: E1212 17:40:14.951118 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 17:40:14.951305 kubelet[3390]: E1212 17:40:14.951292 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 17:40:14.951305 kubelet[3390]: W1212 17:40:14.951302 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 17:40:14.951357 kubelet[3390]: E1212 17:40:14.951312 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Dec 12 17:40:15.794778 kubelet[3390]: E1212 17:40:15.794405 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96"
Dec 12 17:40:15.934942 containerd[1898]: time="2025-12-12T17:40:15.934888324Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:40:15.938589 containerd[1898]: time="2025-12-12T17:40:15.938554694Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4266741"
Dec 12 17:40:15.941878 containerd[1898]: time="2025-12-12T17:40:15.941830739Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:40:15.945535 kubelet[3390]: E1212 17:40:15.945491 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:40:15.946085 containerd[1898]: time="2025-12-12T17:40:15.945912444Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:40:15.946492 containerd[1898]: time="2025-12-12T17:40:15.946144684Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.988178457s"
Dec 12 17:40:15.946492 containerd[1898]: time="2025-12-12T17:40:15.946171493Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\""
Dec 12 17:40:15.946710 kubelet[3390]: W1212 17:40:15.945515 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:40:15.946710 kubelet[3390]: E1212 17:40:15.946609 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:40:15.946874 kubelet[3390]: E1212 17:40:15.946864 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:40:15.946990 kubelet[3390]: W1212 17:40:15.946921 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:40:15.946990 kubelet[3390]: E1212 17:40:15.946936 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Dec 12 17:40:15.950373 kubelet[3390]: E1212 17:40:15.950338 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:40:15.950373 kubelet[3390]: W1212 17:40:15.950348 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:40:15.950373 kubelet[3390]: E1212 17:40:15.950357 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:40:15.950919 containerd[1898]: time="2025-12-12T17:40:15.950893573Z" level=info msg="CreateContainer within sandbox \"32638453933c3bd1e80ef5aa1252ecaae34c5a0949dda0e8f0a1898f780f6ef9\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Dec 12 17:40:15.956993 kubelet[3390]: E1212 17:40:15.956494 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:40:15.956993 kubelet[3390]: W1212 17:40:15.956507 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:40:15.956993 kubelet[3390]: E1212 17:40:15.956516 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Dec 12 17:40:15.961475 kubelet[3390]: E1212 17:40:15.961457 3390 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 17:40:15.961475 kubelet[3390]: W1212 17:40:15.961472 3390 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 17:40:15.961528 kubelet[3390]: E1212 17:40:15.961482 3390 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 17:40:15.975534 containerd[1898]: time="2025-12-12T17:40:15.975493279Z" level=info msg="Container 5167ed67a8b0f0903e71c547734a62ddf79c6195e62ccd5a1471768958964e05: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:40:15.994728 containerd[1898]: time="2025-12-12T17:40:15.994679425Z" level=info msg="CreateContainer within sandbox \"32638453933c3bd1e80ef5aa1252ecaae34c5a0949dda0e8f0a1898f780f6ef9\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"5167ed67a8b0f0903e71c547734a62ddf79c6195e62ccd5a1471768958964e05\""
Dec 12 17:40:15.995930 containerd[1898]: time="2025-12-12T17:40:15.995519471Z" level=info msg="StartContainer for \"5167ed67a8b0f0903e71c547734a62ddf79c6195e62ccd5a1471768958964e05\""
Dec 12 17:40:15.997379 containerd[1898]: time="2025-12-12T17:40:15.997355648Z" level=info msg="connecting to shim 5167ed67a8b0f0903e71c547734a62ddf79c6195e62ccd5a1471768958964e05" address="unix:///run/containerd/s/b09b4e47554929bd945bb552ada442e7e69a9e38e5fa9b65c988b66d115429f1" protocol=ttrpc version=3
Dec 12 17:40:16.019361 systemd[1]: Started cri-containerd-5167ed67a8b0f0903e71c547734a62ddf79c6195e62ccd5a1471768958964e05.scope - libcontainer container 5167ed67a8b0f0903e71c547734a62ddf79c6195e62ccd5a1471768958964e05.
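The repeated kubelet errors above come from the FlexVolume call convention: kubelet execs the driver binary (here /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds) with the operation as its first argument and unmarshals stdout as JSON. Because the executable is missing, stdout is empty, hence "unexpected end of JSON input". The following is a minimal, hypothetical stand-in driver sketching the JSON contract — not the real uds binary:

```python
import json
import sys

def flexvolume_init() -> str:
    # A FlexVolume driver answers "init" with a JSON status object on stdout.
    # An absent or silent driver yields empty output, which is exactly what
    # produces kubelet's "unexpected end of JSON input" unmarshal errors.
    return json.dumps({"status": "Success", "capabilities": {"attach": False}})

if __name__ == "__main__":
    op = sys.argv[1] if len(sys.argv) > 1 else ""
    if op == "init":
        print(flexvolume_init())
    else:
        # Drivers report unimplemented operations rather than staying silent.
        print(json.dumps({"status": "Not supported", "message": f"op {op!r}"}))
```

In this cluster the errors stop once the flexvol-driver container (started below) installs the real uds binary into the plugin directory.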
Dec 12 17:40:16.072881 containerd[1898]: time="2025-12-12T17:40:16.072716471Z" level=info msg="StartContainer for \"5167ed67a8b0f0903e71c547734a62ddf79c6195e62ccd5a1471768958964e05\" returns successfully"
Dec 12 17:40:16.081838 systemd[1]: cri-containerd-5167ed67a8b0f0903e71c547734a62ddf79c6195e62ccd5a1471768958964e05.scope: Deactivated successfully.
Dec 12 17:40:16.085122 containerd[1898]: time="2025-12-12T17:40:16.085072334Z" level=info msg="received container exit event container_id:\"5167ed67a8b0f0903e71c547734a62ddf79c6195e62ccd5a1471768958964e05\" id:\"5167ed67a8b0f0903e71c547734a62ddf79c6195e62ccd5a1471768958964e05\" pid:4129 exited_at:{seconds:1765561216 nanos:84770211}"
Dec 12 17:40:16.103074 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5167ed67a8b0f0903e71c547734a62ddf79c6195e62ccd5a1471768958964e05-rootfs.mount: Deactivated successfully.
Dec 12 17:40:17.794239 kubelet[3390]: E1212 17:40:17.793828 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96"
Dec 12 17:40:17.889639 containerd[1898]: time="2025-12-12T17:40:17.889601249Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\""
Dec 12 17:40:19.795036 kubelet[3390]: E1212 17:40:19.794944 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96"
Dec 12 17:40:20.980712 containerd[1898]: time="2025-12-12T17:40:20.980663562Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:40:20.983904 containerd[1898]: time="2025-12-12T17:40:20.983868556Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65925816"
Dec 12 17:40:20.987510 containerd[1898]: time="2025-12-12T17:40:20.987478925Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:40:20.991398 containerd[1898]: time="2025-12-12T17:40:20.991334094Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 17:40:20.991882 containerd[1898]: time="2025-12-12T17:40:20.991673010Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.102036295s"
Dec 12 17:40:20.991882 containerd[1898]: time="2025-12-12T17:40:20.991701667Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\""
Dec 12 17:40:20.994293 containerd[1898]: time="2025-12-12T17:40:20.993972651Z" level=info msg="CreateContainer within sandbox \"32638453933c3bd1e80ef5aa1252ecaae34c5a0949dda0e8f0a1898f780f6ef9\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Dec 12 17:40:21.015714 containerd[1898]: time="2025-12-12T17:40:21.015425662Z" level=info msg="Container 919696dfef21c4568daeea21752971481b61d8597dd7e87f33394261581909dd: CDI devices from CRI Config.CDIDevices: []"
Dec 12 17:40:21.017014 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount714511685.mount: Deactivated successfully.
Dec 12 17:40:21.034401 containerd[1898]: time="2025-12-12T17:40:21.034350310Z" level=info msg="CreateContainer within sandbox \"32638453933c3bd1e80ef5aa1252ecaae34c5a0949dda0e8f0a1898f780f6ef9\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"919696dfef21c4568daeea21752971481b61d8597dd7e87f33394261581909dd\""
Dec 12 17:40:21.036123 containerd[1898]: time="2025-12-12T17:40:21.035050399Z" level=info msg="StartContainer for \"919696dfef21c4568daeea21752971481b61d8597dd7e87f33394261581909dd\""
Dec 12 17:40:21.037362 containerd[1898]: time="2025-12-12T17:40:21.037332721Z" level=info msg="connecting to shim 919696dfef21c4568daeea21752971481b61d8597dd7e87f33394261581909dd" address="unix:///run/containerd/s/b09b4e47554929bd945bb552ada442e7e69a9e38e5fa9b65c988b66d115429f1" protocol=ttrpc version=3
Dec 12 17:40:21.055359 systemd[1]: Started cri-containerd-919696dfef21c4568daeea21752971481b61d8597dd7e87f33394261581909dd.scope - libcontainer container 919696dfef21c4568daeea21752971481b61d8597dd7e87f33394261581909dd.
Dec 12 17:40:21.116244 containerd[1898]: time="2025-12-12T17:40:21.116154322Z" level=info msg="StartContainer for \"919696dfef21c4568daeea21752971481b61d8597dd7e87f33394261581909dd\" returns successfully" Dec 12 17:40:21.794252 kubelet[3390]: E1212 17:40:21.793966 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 12 17:40:22.296512 containerd[1898]: time="2025-12-12T17:40:22.296189045Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 12 17:40:22.299067 systemd[1]: cri-containerd-919696dfef21c4568daeea21752971481b61d8597dd7e87f33394261581909dd.scope: Deactivated successfully. Dec 12 17:40:22.299703 systemd[1]: cri-containerd-919696dfef21c4568daeea21752971481b61d8597dd7e87f33394261581909dd.scope: Consumed 319ms CPU time, 189.5M memory peak, 165.9M written to disk. Dec 12 17:40:22.301665 containerd[1898]: time="2025-12-12T17:40:22.301631746Z" level=info msg="received container exit event container_id:\"919696dfef21c4568daeea21752971481b61d8597dd7e87f33394261581909dd\" id:\"919696dfef21c4568daeea21752971481b61d8597dd7e87f33394261581909dd\" pid:4187 exited_at:{seconds:1765561222 nanos:301436899}" Dec 12 17:40:22.320133 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-919696dfef21c4568daeea21752971481b61d8597dd7e87f33394261581909dd-rootfs.mount: Deactivated successfully. 
Dec 12 17:40:22.353555 kubelet[3390]: I1212 17:40:22.353527 3390 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Dec 12 17:40:22.699919 kubelet[3390]: W1212 17:40:22.420795 3390 reflector.go:569] object-"calico-system"/"goldmane-ca-bundle": failed to list *v1.ConfigMap: configmaps "goldmane-ca-bundle" is forbidden: User "system:node:ci-4459.2.2-a-260bc0236d" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4459.2.2-a-260bc0236d' and this object Dec 12 17:40:22.699919 kubelet[3390]: E1212 17:40:22.420829 3390 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"goldmane-ca-bundle\" is forbidden: User \"system:node:ci-4459.2.2-a-260bc0236d\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4459.2.2-a-260bc0236d' and this object" logger="UnhandledError" Dec 12 17:40:22.699919 kubelet[3390]: W1212 17:40:22.420795 3390 reflector.go:569] object-"calico-system"/"goldmane": failed to list *v1.ConfigMap: configmaps "goldmane" is forbidden: User "system:node:ci-4459.2.2-a-260bc0236d" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4459.2.2-a-260bc0236d' and this object Dec 12 17:40:22.699919 kubelet[3390]: E1212 17:40:22.420856 3390 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"goldmane\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"goldmane\" is forbidden: User \"system:node:ci-4459.2.2-a-260bc0236d\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4459.2.2-a-260bc0236d' and this object" logger="UnhandledError" Dec 12 17:40:22.699919 kubelet[3390]: W1212 17:40:22.420844 3390 
reflector.go:569] object-"calico-system"/"goldmane-key-pair": failed to list *v1.Secret: secrets "goldmane-key-pair" is forbidden: User "system:node:ci-4459.2.2-a-260bc0236d" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4459.2.2-a-260bc0236d' and this object Dec 12 17:40:22.410569 systemd[1]: Created slice kubepods-besteffort-pod5d1301d2_72a1_41e5_ae8f_db3bb6f52314.slice - libcontainer container kubepods-besteffort-pod5d1301d2_72a1_41e5_ae8f_db3bb6f52314.slice. Dec 12 17:40:22.700165 kubelet[3390]: E1212 17:40:22.420868 3390 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"goldmane-key-pair\" is forbidden: User \"system:node:ci-4459.2.2-a-260bc0236d\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4459.2.2-a-260bc0236d' and this object" logger="UnhandledError" Dec 12 17:40:22.700165 kubelet[3390]: W1212 17:40:22.426767 3390 reflector.go:569] object-"calico-system"/"whisker-backend-key-pair": failed to list *v1.Secret: secrets "whisker-backend-key-pair" is forbidden: User "system:node:ci-4459.2.2-a-260bc0236d" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4459.2.2-a-260bc0236d' and this object Dec 12 17:40:22.700165 kubelet[3390]: E1212 17:40:22.426797 3390 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"whisker-backend-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"whisker-backend-key-pair\" is forbidden: User \"system:node:ci-4459.2.2-a-260bc0236d\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4459.2.2-a-260bc0236d' and this object" logger="UnhandledError" Dec 12 17:40:22.700165 kubelet[3390]: I1212 
17:40:22.499181 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sgzd\" (UniqueName: \"kubernetes.io/projected/5d1301d2-72a1-41e5-ae8f-db3bb6f52314-kube-api-access-2sgzd\") pod \"calico-apiserver-7984dd694b-tk5md\" (UID: \"5d1301d2-72a1-41e5-ae8f-db3bb6f52314\") " pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" Dec 12 17:40:22.423860 systemd[1]: Created slice kubepods-besteffort-pod674dddc6_fb43_422e_85c0_76f7f4bb018f.slice - libcontainer container kubepods-besteffort-pod674dddc6_fb43_422e_85c0_76f7f4bb018f.slice. Dec 12 17:40:22.701059 kubelet[3390]: I1212 17:40:22.499232 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f485e4eb-196e-4ab6-a695-2d4c1db5d278-config-volume\") pod \"coredns-668d6bf9bc-6wqk8\" (UID: \"f485e4eb-196e-4ab6-a695-2d4c1db5d278\") " pod="kube-system/coredns-668d6bf9bc-6wqk8" Dec 12 17:40:22.701059 kubelet[3390]: I1212 17:40:22.499259 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5d1301d2-72a1-41e5-ae8f-db3bb6f52314-calico-apiserver-certs\") pod \"calico-apiserver-7984dd694b-tk5md\" (UID: \"5d1301d2-72a1-41e5-ae8f-db3bb6f52314\") " pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" Dec 12 17:40:22.701059 kubelet[3390]: I1212 17:40:22.499276 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs55d\" (UniqueName: \"kubernetes.io/projected/1960dd80-6c2f-452c-bbce-0ac57ff9095f-kube-api-access-gs55d\") pod \"whisker-76fc8fc45c-j6pkt\" (UID: \"1960dd80-6c2f-452c-bbce-0ac57ff9095f\") " pod="calico-system/whisker-76fc8fc45c-j6pkt" Dec 12 17:40:22.701059 kubelet[3390]: I1212 17:40:22.499288 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-b6td9\" (UniqueName: \"kubernetes.io/projected/674dddc6-fb43-422e-85c0-76f7f4bb018f-kube-api-access-b6td9\") pod \"calico-kube-controllers-5fdd79f47b-hk8h4\" (UID: \"674dddc6-fb43-422e-85c0-76f7f4bb018f\") " pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" Dec 12 17:40:22.701059 kubelet[3390]: I1212 17:40:22.499299 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpgx5\" (UniqueName: \"kubernetes.io/projected/f485e4eb-196e-4ab6-a695-2d4c1db5d278-kube-api-access-cpgx5\") pod \"coredns-668d6bf9bc-6wqk8\" (UID: \"f485e4eb-196e-4ab6-a695-2d4c1db5d278\") " pod="kube-system/coredns-668d6bf9bc-6wqk8" Dec 12 17:40:22.432404 systemd[1]: Created slice kubepods-burstable-podae0ae75f_5f13_4f15_8f02_a60871a329ef.slice - libcontainer container kubepods-burstable-podae0ae75f_5f13_4f15_8f02_a60871a329ef.slice. Dec 12 17:40:22.701232 kubelet[3390]: I1212 17:40:22.499311 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a665283-3e69-4f2e-9c9b-8aae93e17ef9-config\") pod \"goldmane-666569f655-lwtn5\" (UID: \"1a665283-3e69-4f2e-9c9b-8aae93e17ef9\") " pod="calico-system/goldmane-666569f655-lwtn5" Dec 12 17:40:22.701232 kubelet[3390]: I1212 17:40:22.499324 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a665283-3e69-4f2e-9c9b-8aae93e17ef9-goldmane-ca-bundle\") pod \"goldmane-666569f655-lwtn5\" (UID: \"1a665283-3e69-4f2e-9c9b-8aae93e17ef9\") " pod="calico-system/goldmane-666569f655-lwtn5" Dec 12 17:40:22.701232 kubelet[3390]: I1212 17:40:22.499334 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1960dd80-6c2f-452c-bbce-0ac57ff9095f-whisker-backend-key-pair\") 
pod \"whisker-76fc8fc45c-j6pkt\" (UID: \"1960dd80-6c2f-452c-bbce-0ac57ff9095f\") " pod="calico-system/whisker-76fc8fc45c-j6pkt" Dec 12 17:40:22.701232 kubelet[3390]: I1212 17:40:22.499347 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sc8n\" (UniqueName: \"kubernetes.io/projected/1a665283-3e69-4f2e-9c9b-8aae93e17ef9-kube-api-access-8sc8n\") pod \"goldmane-666569f655-lwtn5\" (UID: \"1a665283-3e69-4f2e-9c9b-8aae93e17ef9\") " pod="calico-system/goldmane-666569f655-lwtn5" Dec 12 17:40:22.701232 kubelet[3390]: I1212 17:40:22.499361 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0083021a-0d5d-42a9-b6e2-679318c1ae2e-calico-apiserver-certs\") pod \"calico-apiserver-7984dd694b-72thk\" (UID: \"0083021a-0d5d-42a9-b6e2-679318c1ae2e\") " pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" Dec 12 17:40:22.437061 systemd[1]: Created slice kubepods-besteffort-pod1960dd80_6c2f_452c_bbce_0ac57ff9095f.slice - libcontainer container kubepods-besteffort-pod1960dd80_6c2f_452c_bbce_0ac57ff9095f.slice. 
Dec 12 17:40:22.701390 kubelet[3390]: I1212 17:40:22.499374 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnjm4\" (UniqueName: \"kubernetes.io/projected/ae0ae75f-5f13-4f15-8f02-a60871a329ef-kube-api-access-wnjm4\") pod \"coredns-668d6bf9bc-qxwtm\" (UID: \"ae0ae75f-5f13-4f15-8f02-a60871a329ef\") " pod="kube-system/coredns-668d6bf9bc-qxwtm" Dec 12 17:40:22.701390 kubelet[3390]: I1212 17:40:22.499386 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4xqg\" (UniqueName: \"kubernetes.io/projected/0083021a-0d5d-42a9-b6e2-679318c1ae2e-kube-api-access-s4xqg\") pod \"calico-apiserver-7984dd694b-72thk\" (UID: \"0083021a-0d5d-42a9-b6e2-679318c1ae2e\") " pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" Dec 12 17:40:22.701390 kubelet[3390]: I1212 17:40:22.499396 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/674dddc6-fb43-422e-85c0-76f7f4bb018f-tigera-ca-bundle\") pod \"calico-kube-controllers-5fdd79f47b-hk8h4\" (UID: \"674dddc6-fb43-422e-85c0-76f7f4bb018f\") " pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" Dec 12 17:40:22.701390 kubelet[3390]: I1212 17:40:22.499405 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/1a665283-3e69-4f2e-9c9b-8aae93e17ef9-goldmane-key-pair\") pod \"goldmane-666569f655-lwtn5\" (UID: \"1a665283-3e69-4f2e-9c9b-8aae93e17ef9\") " pod="calico-system/goldmane-666569f655-lwtn5" Dec 12 17:40:22.701390 kubelet[3390]: I1212 17:40:22.499416 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1960dd80-6c2f-452c-bbce-0ac57ff9095f-whisker-ca-bundle\") pod 
\"whisker-76fc8fc45c-j6pkt\" (UID: \"1960dd80-6c2f-452c-bbce-0ac57ff9095f\") " pod="calico-system/whisker-76fc8fc45c-j6pkt" Dec 12 17:40:22.445134 systemd[1]: Created slice kubepods-burstable-podf485e4eb_196e_4ab6_a695_2d4c1db5d278.slice - libcontainer container kubepods-burstable-podf485e4eb_196e_4ab6_a695_2d4c1db5d278.slice. Dec 12 17:40:22.701576 kubelet[3390]: I1212 17:40:22.499425 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae0ae75f-5f13-4f15-8f02-a60871a329ef-config-volume\") pod \"coredns-668d6bf9bc-qxwtm\" (UID: \"ae0ae75f-5f13-4f15-8f02-a60871a329ef\") " pod="kube-system/coredns-668d6bf9bc-qxwtm" Dec 12 17:40:22.451838 systemd[1]: Created slice kubepods-besteffort-pod1a665283_3e69_4f2e_9c9b_8aae93e17ef9.slice - libcontainer container kubepods-besteffort-pod1a665283_3e69_4f2e_9c9b_8aae93e17ef9.slice. Dec 12 17:40:22.457331 systemd[1]: Created slice kubepods-besteffort-pod0083021a_0d5d_42a9_b6e2_679318c1ae2e.slice - libcontainer container kubepods-besteffort-pod0083021a_0d5d_42a9_b6e2_679318c1ae2e.slice. 
Dec 12 17:40:23.003722 containerd[1898]: time="2025-12-12T17:40:23.003532636Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7984dd694b-tk5md,Uid:5d1301d2-72a1-41e5-ae8f-db3bb6f52314,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:40:23.009961 containerd[1898]: time="2025-12-12T17:40:23.009661729Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6wqk8,Uid:f485e4eb-196e-4ab6-a695-2d4c1db5d278,Namespace:kube-system,Attempt:0,}" Dec 12 17:40:23.009961 containerd[1898]: time="2025-12-12T17:40:23.009845759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5fdd79f47b-hk8h4,Uid:674dddc6-fb43-422e-85c0-76f7f4bb018f,Namespace:calico-system,Attempt:0,}" Dec 12 17:40:23.018914 containerd[1898]: time="2025-12-12T17:40:23.018874249Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7984dd694b-72thk,Uid:0083021a-0d5d-42a9-b6e2-679318c1ae2e,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:40:23.019020 containerd[1898]: time="2025-12-12T17:40:23.018880737Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qxwtm,Uid:ae0ae75f-5f13-4f15-8f02-a60871a329ef,Namespace:kube-system,Attempt:0,}" Dec 12 17:40:23.353305 containerd[1898]: time="2025-12-12T17:40:23.351701521Z" level=error msg="Failed to destroy network for sandbox \"5469268f76c7038fac5aefcd66e262d9f0cdeade39e31f9670e450f501f97d3b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.353764 containerd[1898]: time="2025-12-12T17:40:23.353652789Z" level=error msg="Failed to destroy network for sandbox \"e406963f6b21c4d2f6e547d6b4403e79ddca90eb27e7f10a996b7fd63b6ab5a0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Dec 12 17:40:23.355059 systemd[1]: run-netns-cni\x2d948f62f0\x2da524\x2de5fc\x2d0bc4\x2d7eeb58b981b2.mount: Deactivated successfully. Dec 12 17:40:23.358332 systemd[1]: run-netns-cni\x2dbaaede8a\x2dd1f5\x2d1bf2\x2d7b75\x2d51ea270775fd.mount: Deactivated successfully. Dec 12 17:40:23.358731 containerd[1898]: time="2025-12-12T17:40:23.358695236Z" level=error msg="Failed to destroy network for sandbox \"95609545219140f2c7387ee5bdabcf84b73d204c1007a7c84bd849ea20ab6dce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.359899 containerd[1898]: time="2025-12-12T17:40:23.359836547Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6wqk8,Uid:f485e4eb-196e-4ab6-a695-2d4c1db5d278,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5469268f76c7038fac5aefcd66e262d9f0cdeade39e31f9670e450f501f97d3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.360648 systemd[1]: run-netns-cni\x2d67625d97\x2d536b\x2d5387\x2da19e\x2d04cdcc451e59.mount: Deactivated successfully. 
Dec 12 17:40:23.361443 kubelet[3390]: E1212 17:40:23.361234 3390 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5469268f76c7038fac5aefcd66e262d9f0cdeade39e31f9670e450f501f97d3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.361443 kubelet[3390]: E1212 17:40:23.361313 3390 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5469268f76c7038fac5aefcd66e262d9f0cdeade39e31f9670e450f501f97d3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6wqk8" Dec 12 17:40:23.361443 kubelet[3390]: E1212 17:40:23.361329 3390 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5469268f76c7038fac5aefcd66e262d9f0cdeade39e31f9670e450f501f97d3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6wqk8" Dec 12 17:40:23.362135 kubelet[3390]: E1212 17:40:23.361764 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-6wqk8_kube-system(f485e4eb-196e-4ab6-a695-2d4c1db5d278)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-6wqk8_kube-system(f485e4eb-196e-4ab6-a695-2d4c1db5d278)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5469268f76c7038fac5aefcd66e262d9f0cdeade39e31f9670e450f501f97d3b\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-6wqk8" podUID="f485e4eb-196e-4ab6-a695-2d4c1db5d278" Dec 12 17:40:23.364074 containerd[1898]: time="2025-12-12T17:40:23.362305041Z" level=error msg="Failed to destroy network for sandbox \"bf2c3d144373ee256897d38593d699fb3e36ecf08903c8ddb357a30e23a4b46d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.363540 systemd[1]: run-netns-cni\x2dae5bea1e\x2dae3a\x2db11c\x2dd8ed\x2dc4027db50e84.mount: Deactivated successfully. Dec 12 17:40:23.365110 containerd[1898]: time="2025-12-12T17:40:23.365071529Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5fdd79f47b-hk8h4,Uid:674dddc6-fb43-422e-85c0-76f7f4bb018f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e406963f6b21c4d2f6e547d6b4403e79ddca90eb27e7f10a996b7fd63b6ab5a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.365474 kubelet[3390]: E1212 17:40:23.365397 3390 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e406963f6b21c4d2f6e547d6b4403e79ddca90eb27e7f10a996b7fd63b6ab5a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.365474 kubelet[3390]: E1212 17:40:23.365438 3390 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"e406963f6b21c4d2f6e547d6b4403e79ddca90eb27e7f10a996b7fd63b6ab5a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" Dec 12 17:40:23.365474 kubelet[3390]: E1212 17:40:23.365451 3390 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e406963f6b21c4d2f6e547d6b4403e79ddca90eb27e7f10a996b7fd63b6ab5a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" Dec 12 17:40:23.365577 kubelet[3390]: E1212 17:40:23.365479 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5fdd79f47b-hk8h4_calico-system(674dddc6-fb43-422e-85c0-76f7f4bb018f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5fdd79f47b-hk8h4_calico-system(674dddc6-fb43-422e-85c0-76f7f4bb018f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e406963f6b21c4d2f6e547d6b4403e79ddca90eb27e7f10a996b7fd63b6ab5a0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f" Dec 12 17:40:23.368009 containerd[1898]: time="2025-12-12T17:40:23.367984719Z" level=error msg="Failed to destroy network for sandbox \"f6ff4d83bf3aacddc5e9d790bf46a16f498ebe26cba36b11d0111773acc1d464\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.370353 containerd[1898]: time="2025-12-12T17:40:23.370276294Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7984dd694b-72thk,Uid:0083021a-0d5d-42a9-b6e2-679318c1ae2e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"95609545219140f2c7387ee5bdabcf84b73d204c1007a7c84bd849ea20ab6dce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.370450 kubelet[3390]: E1212 17:40:23.370421 3390 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95609545219140f2c7387ee5bdabcf84b73d204c1007a7c84bd849ea20ab6dce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.370522 kubelet[3390]: E1212 17:40:23.370453 3390 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95609545219140f2c7387ee5bdabcf84b73d204c1007a7c84bd849ea20ab6dce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" Dec 12 17:40:23.370522 kubelet[3390]: E1212 17:40:23.370470 3390 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95609545219140f2c7387ee5bdabcf84b73d204c1007a7c84bd849ea20ab6dce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" Dec 12 17:40:23.370522 kubelet[3390]: E1212 17:40:23.370506 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7984dd694b-72thk_calico-apiserver(0083021a-0d5d-42a9-b6e2-679318c1ae2e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7984dd694b-72thk_calico-apiserver(0083021a-0d5d-42a9-b6e2-679318c1ae2e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"95609545219140f2c7387ee5bdabcf84b73d204c1007a7c84bd849ea20ab6dce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e" Dec 12 17:40:23.374420 containerd[1898]: time="2025-12-12T17:40:23.374336603Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qxwtm,Uid:ae0ae75f-5f13-4f15-8f02-a60871a329ef,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf2c3d144373ee256897d38593d699fb3e36ecf08903c8ddb357a30e23a4b46d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.374534 kubelet[3390]: E1212 17:40:23.374483 3390 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf2c3d144373ee256897d38593d699fb3e36ecf08903c8ddb357a30e23a4b46d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.374534 
kubelet[3390]: E1212 17:40:23.374517 3390 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf2c3d144373ee256897d38593d699fb3e36ecf08903c8ddb357a30e23a4b46d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-qxwtm" Dec 12 17:40:23.374622 kubelet[3390]: E1212 17:40:23.374529 3390 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf2c3d144373ee256897d38593d699fb3e36ecf08903c8ddb357a30e23a4b46d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-qxwtm" Dec 12 17:40:23.374622 kubelet[3390]: E1212 17:40:23.374578 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-qxwtm_kube-system(ae0ae75f-5f13-4f15-8f02-a60871a329ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-qxwtm_kube-system(ae0ae75f-5f13-4f15-8f02-a60871a329ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bf2c3d144373ee256897d38593d699fb3e36ecf08903c8ddb357a30e23a4b46d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-qxwtm" podUID="ae0ae75f-5f13-4f15-8f02-a60871a329ef" Dec 12 17:40:23.377310 containerd[1898]: time="2025-12-12T17:40:23.377271609Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-7984dd694b-tk5md,Uid:5d1301d2-72a1-41e5-ae8f-db3bb6f52314,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6ff4d83bf3aacddc5e9d790bf46a16f498ebe26cba36b11d0111773acc1d464\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.377636 kubelet[3390]: E1212 17:40:23.377442 3390 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6ff4d83bf3aacddc5e9d790bf46a16f498ebe26cba36b11d0111773acc1d464\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.377636 kubelet[3390]: E1212 17:40:23.377476 3390 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6ff4d83bf3aacddc5e9d790bf46a16f498ebe26cba36b11d0111773acc1d464\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" Dec 12 17:40:23.377636 kubelet[3390]: E1212 17:40:23.377489 3390 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6ff4d83bf3aacddc5e9d790bf46a16f498ebe26cba36b11d0111773acc1d464\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" Dec 12 17:40:23.377702 kubelet[3390]: E1212 17:40:23.377513 3390 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7984dd694b-tk5md_calico-apiserver(5d1301d2-72a1-41e5-ae8f-db3bb6f52314)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7984dd694b-tk5md_calico-apiserver(5d1301d2-72a1-41e5-ae8f-db3bb6f52314)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f6ff4d83bf3aacddc5e9d790bf46a16f498ebe26cba36b11d0111773acc1d464\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314" Dec 12 17:40:23.611133 containerd[1898]: time="2025-12-12T17:40:23.610991772Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76fc8fc45c-j6pkt,Uid:1960dd80-6c2f-452c-bbce-0ac57ff9095f,Namespace:calico-system,Attempt:0,}" Dec 12 17:40:23.616967 kubelet[3390]: E1212 17:40:23.616874 3390 secret.go:189] Couldn't get secret calico-system/goldmane-key-pair: failed to sync secret cache: timed out waiting for the condition Dec 12 17:40:23.618259 kubelet[3390]: E1212 17:40:23.617126 3390 configmap.go:193] Couldn't get configMap calico-system/goldmane-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 12 17:40:23.618259 kubelet[3390]: E1212 17:40:23.617829 3390 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a665283-3e69-4f2e-9c9b-8aae93e17ef9-goldmane-key-pair podName:1a665283-3e69-4f2e-9c9b-8aae93e17ef9 nodeName:}" failed. No retries permitted until 2025-12-12 17:40:24.116939667 +0000 UTC m=+34.391587491 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "goldmane-key-pair" (UniqueName: "kubernetes.io/secret/1a665283-3e69-4f2e-9c9b-8aae93e17ef9-goldmane-key-pair") pod "goldmane-666569f655-lwtn5" (UID: "1a665283-3e69-4f2e-9c9b-8aae93e17ef9") : failed to sync secret cache: timed out waiting for the condition Dec 12 17:40:23.618259 kubelet[3390]: E1212 17:40:23.617856 3390 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1a665283-3e69-4f2e-9c9b-8aae93e17ef9-goldmane-ca-bundle podName:1a665283-3e69-4f2e-9c9b-8aae93e17ef9 nodeName:}" failed. No retries permitted until 2025-12-12 17:40:24.11784957 +0000 UTC m=+34.392497394 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "goldmane-ca-bundle" (UniqueName: "kubernetes.io/configmap/1a665283-3e69-4f2e-9c9b-8aae93e17ef9-goldmane-ca-bundle") pod "goldmane-666569f655-lwtn5" (UID: "1a665283-3e69-4f2e-9c9b-8aae93e17ef9") : failed to sync configmap cache: timed out waiting for the condition Dec 12 17:40:23.653317 containerd[1898]: time="2025-12-12T17:40:23.653253377Z" level=error msg="Failed to destroy network for sandbox \"aee79928d3ee3ab387d9877ac8697d3a7b833730d8c1e75d5200ba3a158444c2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.656905 containerd[1898]: time="2025-12-12T17:40:23.656802492Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76fc8fc45c-j6pkt,Uid:1960dd80-6c2f-452c-bbce-0ac57ff9095f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aee79928d3ee3ab387d9877ac8697d3a7b833730d8c1e75d5200ba3a158444c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.657058 kubelet[3390]: 
E1212 17:40:23.657006 3390 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aee79928d3ee3ab387d9877ac8697d3a7b833730d8c1e75d5200ba3a158444c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.657108 kubelet[3390]: E1212 17:40:23.657057 3390 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aee79928d3ee3ab387d9877ac8697d3a7b833730d8c1e75d5200ba3a158444c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76fc8fc45c-j6pkt" Dec 12 17:40:23.657108 kubelet[3390]: E1212 17:40:23.657074 3390 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aee79928d3ee3ab387d9877ac8697d3a7b833730d8c1e75d5200ba3a158444c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76fc8fc45c-j6pkt" Dec 12 17:40:23.657174 kubelet[3390]: E1212 17:40:23.657105 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-76fc8fc45c-j6pkt_calico-system(1960dd80-6c2f-452c-bbce-0ac57ff9095f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-76fc8fc45c-j6pkt_calico-system(1960dd80-6c2f-452c-bbce-0ac57ff9095f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aee79928d3ee3ab387d9877ac8697d3a7b833730d8c1e75d5200ba3a158444c2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-76fc8fc45c-j6pkt" podUID="1960dd80-6c2f-452c-bbce-0ac57ff9095f" Dec 12 17:40:23.799780 systemd[1]: Created slice kubepods-besteffort-podc62a98c7_a503_42c2_845c_ea1022fbba96.slice - libcontainer container kubepods-besteffort-podc62a98c7_a503_42c2_845c_ea1022fbba96.slice. Dec 12 17:40:23.801623 containerd[1898]: time="2025-12-12T17:40:23.801588564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ccgmd,Uid:c62a98c7-a503-42c2-845c-ea1022fbba96,Namespace:calico-system,Attempt:0,}" Dec 12 17:40:23.841706 containerd[1898]: time="2025-12-12T17:40:23.841655549Z" level=error msg="Failed to destroy network for sandbox \"5cf08d26e081b40f1e27274a8aa1b5799e939a7d0e4d3414b603ce1acb0f8f2a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.846649 containerd[1898]: time="2025-12-12T17:40:23.846589040Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ccgmd,Uid:c62a98c7-a503-42c2-845c-ea1022fbba96,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cf08d26e081b40f1e27274a8aa1b5799e939a7d0e4d3414b603ce1acb0f8f2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.846822 kubelet[3390]: E1212 17:40:23.846788 3390 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cf08d26e081b40f1e27274a8aa1b5799e939a7d0e4d3414b603ce1acb0f8f2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Dec 12 17:40:23.846888 kubelet[3390]: E1212 17:40:23.846838 3390 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cf08d26e081b40f1e27274a8aa1b5799e939a7d0e4d3414b603ce1acb0f8f2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ccgmd" Dec 12 17:40:23.846916 kubelet[3390]: E1212 17:40:23.846884 3390 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cf08d26e081b40f1e27274a8aa1b5799e939a7d0e4d3414b603ce1acb0f8f2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ccgmd" Dec 12 17:40:23.846951 kubelet[3390]: E1212 17:40:23.846923 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ccgmd_calico-system(c62a98c7-a503-42c2-845c-ea1022fbba96)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ccgmd_calico-system(c62a98c7-a503-42c2-845c-ea1022fbba96)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5cf08d26e081b40f1e27274a8aa1b5799e939a7d0e4d3414b603ce1acb0f8f2a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 12 17:40:23.904589 containerd[1898]: time="2025-12-12T17:40:23.904452108Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 12 17:40:24.321867 
systemd[1]: run-netns-cni\x2dfc404cbf\x2dd733\x2da50c\x2de354\x2daee231ab16e2.mount: Deactivated successfully. Dec 12 17:40:24.510009 containerd[1898]: time="2025-12-12T17:40:24.509966104Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-lwtn5,Uid:1a665283-3e69-4f2e-9c9b-8aae93e17ef9,Namespace:calico-system,Attempt:0,}" Dec 12 17:40:24.558048 containerd[1898]: time="2025-12-12T17:40:24.558000645Z" level=error msg="Failed to destroy network for sandbox \"733fc33d7c8bcbc57d1175026cf27345d8fa8c37720b49a4c10d300916780942\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:24.559684 systemd[1]: run-netns-cni\x2d118cc39d\x2d2efd\x2db678\x2d4fc1\x2d4028cc5460bc.mount: Deactivated successfully. Dec 12 17:40:24.561816 containerd[1898]: time="2025-12-12T17:40:24.561770896Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-lwtn5,Uid:1a665283-3e69-4f2e-9c9b-8aae93e17ef9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"733fc33d7c8bcbc57d1175026cf27345d8fa8c37720b49a4c10d300916780942\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:24.562167 kubelet[3390]: E1212 17:40:24.562101 3390 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"733fc33d7c8bcbc57d1175026cf27345d8fa8c37720b49a4c10d300916780942\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 17:40:24.562640 kubelet[3390]: E1212 17:40:24.562158 3390 kuberuntime_sandbox.go:72] "Failed 
to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"733fc33d7c8bcbc57d1175026cf27345d8fa8c37720b49a4c10d300916780942\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-lwtn5" Dec 12 17:40:24.562640 kubelet[3390]: E1212 17:40:24.562443 3390 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"733fc33d7c8bcbc57d1175026cf27345d8fa8c37720b49a4c10d300916780942\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-lwtn5" Dec 12 17:40:24.562640 kubelet[3390]: E1212 17:40:24.562504 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-lwtn5_calico-system(1a665283-3e69-4f2e-9c9b-8aae93e17ef9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-lwtn5_calico-system(1a665283-3e69-4f2e-9c9b-8aae93e17ef9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"733fc33d7c8bcbc57d1175026cf27345d8fa8c37720b49a4c10d300916780942\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-lwtn5" podUID="1a665283-3e69-4f2e-9c9b-8aae93e17ef9" Dec 12 17:40:29.835932 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4048702616.mount: Deactivated successfully. 
Dec 12 17:40:30.533186 containerd[1898]: time="2025-12-12T17:40:30.532727908Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:30.535368 containerd[1898]: time="2025-12-12T17:40:30.535341375Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150934562" Dec 12 17:40:30.538385 containerd[1898]: time="2025-12-12T17:40:30.538343056Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:30.542157 containerd[1898]: time="2025-12-12T17:40:30.542120236Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 17:40:30.542574 containerd[1898]: time="2025-12-12T17:40:30.542549523Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 6.638014476s" Dec 12 17:40:30.542655 containerd[1898]: time="2025-12-12T17:40:30.542643278Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 12 17:40:30.554587 containerd[1898]: time="2025-12-12T17:40:30.553624309Z" level=info msg="CreateContainer within sandbox \"32638453933c3bd1e80ef5aa1252ecaae34c5a0949dda0e8f0a1898f780f6ef9\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 12 17:40:30.577550 containerd[1898]: time="2025-12-12T17:40:30.577489590Z" level=info msg="Container 
0eb7230eba55b6b5bef67492a3b899b057e68716fa9fd6436fe4c7e75e33cdd2: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:40:30.580418 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2650497631.mount: Deactivated successfully. Dec 12 17:40:30.600379 containerd[1898]: time="2025-12-12T17:40:30.600341804Z" level=info msg="CreateContainer within sandbox \"32638453933c3bd1e80ef5aa1252ecaae34c5a0949dda0e8f0a1898f780f6ef9\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"0eb7230eba55b6b5bef67492a3b899b057e68716fa9fd6436fe4c7e75e33cdd2\"" Dec 12 17:40:30.601179 containerd[1898]: time="2025-12-12T17:40:30.601153368Z" level=info msg="StartContainer for \"0eb7230eba55b6b5bef67492a3b899b057e68716fa9fd6436fe4c7e75e33cdd2\"" Dec 12 17:40:30.602295 containerd[1898]: time="2025-12-12T17:40:30.602263055Z" level=info msg="connecting to shim 0eb7230eba55b6b5bef67492a3b899b057e68716fa9fd6436fe4c7e75e33cdd2" address="unix:///run/containerd/s/b09b4e47554929bd945bb552ada442e7e69a9e38e5fa9b65c988b66d115429f1" protocol=ttrpc version=3 Dec 12 17:40:30.623756 systemd[1]: Started cri-containerd-0eb7230eba55b6b5bef67492a3b899b057e68716fa9fd6436fe4c7e75e33cdd2.scope - libcontainer container 0eb7230eba55b6b5bef67492a3b899b057e68716fa9fd6436fe4c7e75e33cdd2. Dec 12 17:40:30.692929 containerd[1898]: time="2025-12-12T17:40:30.692883531Z" level=info msg="StartContainer for \"0eb7230eba55b6b5bef67492a3b899b057e68716fa9fd6436fe4c7e75e33cdd2\" returns successfully" Dec 12 17:40:30.880252 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 12 17:40:30.880433 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Dec 12 17:40:31.007099 kubelet[3390]: I1212 17:40:31.006886 3390 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-5v4v9" podStartSLOduration=1.369898284 podStartE2EDuration="20.006867069s" podCreationTimestamp="2025-12-12 17:40:11 +0000 UTC" firstStartedPulling="2025-12-12 17:40:11.906274338 +0000 UTC m=+22.180922154" lastFinishedPulling="2025-12-12 17:40:30.543243123 +0000 UTC m=+40.817890939" observedRunningTime="2025-12-12 17:40:30.948679621 +0000 UTC m=+41.223327445" watchObservedRunningTime="2025-12-12 17:40:31.006867069 +0000 UTC m=+41.281514885" Dec 12 17:40:31.053741 kubelet[3390]: I1212 17:40:31.053680 3390 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs55d\" (UniqueName: \"kubernetes.io/projected/1960dd80-6c2f-452c-bbce-0ac57ff9095f-kube-api-access-gs55d\") pod \"1960dd80-6c2f-452c-bbce-0ac57ff9095f\" (UID: \"1960dd80-6c2f-452c-bbce-0ac57ff9095f\") " Dec 12 17:40:31.053741 kubelet[3390]: I1212 17:40:31.053718 3390 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1960dd80-6c2f-452c-bbce-0ac57ff9095f-whisker-backend-key-pair\") pod \"1960dd80-6c2f-452c-bbce-0ac57ff9095f\" (UID: \"1960dd80-6c2f-452c-bbce-0ac57ff9095f\") " Dec 12 17:40:31.053741 kubelet[3390]: I1212 17:40:31.053734 3390 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1960dd80-6c2f-452c-bbce-0ac57ff9095f-whisker-ca-bundle\") pod \"1960dd80-6c2f-452c-bbce-0ac57ff9095f\" (UID: \"1960dd80-6c2f-452c-bbce-0ac57ff9095f\") " Dec 12 17:40:31.054294 kubelet[3390]: I1212 17:40:31.054053 3390 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1960dd80-6c2f-452c-bbce-0ac57ff9095f-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod 
"1960dd80-6c2f-452c-bbce-0ac57ff9095f" (UID: "1960dd80-6c2f-452c-bbce-0ac57ff9095f"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 12 17:40:31.060747 systemd[1]: var-lib-kubelet-pods-1960dd80\x2d6c2f\x2d452c\x2dbbce\x2d0ac57ff9095f-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 12 17:40:31.060844 systemd[1]: var-lib-kubelet-pods-1960dd80\x2d6c2f\x2d452c\x2dbbce\x2d0ac57ff9095f-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dgs55d.mount: Deactivated successfully. Dec 12 17:40:31.065792 kubelet[3390]: I1212 17:40:31.065733 3390 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1960dd80-6c2f-452c-bbce-0ac57ff9095f-kube-api-access-gs55d" (OuterVolumeSpecName: "kube-api-access-gs55d") pod "1960dd80-6c2f-452c-bbce-0ac57ff9095f" (UID: "1960dd80-6c2f-452c-bbce-0ac57ff9095f"). InnerVolumeSpecName "kube-api-access-gs55d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 12 17:40:31.065961 kubelet[3390]: I1212 17:40:31.065936 3390 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1960dd80-6c2f-452c-bbce-0ac57ff9095f-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "1960dd80-6c2f-452c-bbce-0ac57ff9095f" (UID: "1960dd80-6c2f-452c-bbce-0ac57ff9095f"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 12 17:40:31.154003 kubelet[3390]: I1212 17:40:31.153955 3390 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gs55d\" (UniqueName: \"kubernetes.io/projected/1960dd80-6c2f-452c-bbce-0ac57ff9095f-kube-api-access-gs55d\") on node \"ci-4459.2.2-a-260bc0236d\" DevicePath \"\"" Dec 12 17:40:31.154003 kubelet[3390]: I1212 17:40:31.153992 3390 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1960dd80-6c2f-452c-bbce-0ac57ff9095f-whisker-backend-key-pair\") on node \"ci-4459.2.2-a-260bc0236d\" DevicePath \"\"" Dec 12 17:40:31.154003 kubelet[3390]: I1212 17:40:31.154000 3390 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1960dd80-6c2f-452c-bbce-0ac57ff9095f-whisker-ca-bundle\") on node \"ci-4459.2.2-a-260bc0236d\" DevicePath \"\"" Dec 12 17:40:31.801136 systemd[1]: Removed slice kubepods-besteffort-pod1960dd80_6c2f_452c_bbce_0ac57ff9095f.slice - libcontainer container kubepods-besteffort-pod1960dd80_6c2f_452c_bbce_0ac57ff9095f.slice. Dec 12 17:40:32.027678 systemd[1]: Created slice kubepods-besteffort-pod9727898f_0ef8_4f55_b4ae_4e451f2586ba.slice - libcontainer container kubepods-besteffort-pod9727898f_0ef8_4f55_b4ae_4e451f2586ba.slice. 
Dec 12 17:40:32.058276 kubelet[3390]: I1212 17:40:32.058145 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q952d\" (UniqueName: \"kubernetes.io/projected/9727898f-0ef8-4f55-b4ae-4e451f2586ba-kube-api-access-q952d\") pod \"whisker-64584c4865-wbzd7\" (UID: \"9727898f-0ef8-4f55-b4ae-4e451f2586ba\") " pod="calico-system/whisker-64584c4865-wbzd7" Dec 12 17:40:32.058276 kubelet[3390]: I1212 17:40:32.058193 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9727898f-0ef8-4f55-b4ae-4e451f2586ba-whisker-backend-key-pair\") pod \"whisker-64584c4865-wbzd7\" (UID: \"9727898f-0ef8-4f55-b4ae-4e451f2586ba\") " pod="calico-system/whisker-64584c4865-wbzd7" Dec 12 17:40:32.058276 kubelet[3390]: I1212 17:40:32.058213 3390 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9727898f-0ef8-4f55-b4ae-4e451f2586ba-whisker-ca-bundle\") pod \"whisker-64584c4865-wbzd7\" (UID: \"9727898f-0ef8-4f55-b4ae-4e451f2586ba\") " pod="calico-system/whisker-64584c4865-wbzd7" Dec 12 17:40:32.331986 containerd[1898]: time="2025-12-12T17:40:32.331874559Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64584c4865-wbzd7,Uid:9727898f-0ef8-4f55-b4ae-4e451f2586ba,Namespace:calico-system,Attempt:0,}" Dec 12 17:40:32.409469 kubelet[3390]: I1212 17:40:32.409363 3390 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 17:40:32.561359 systemd-networkd[1491]: cali7486ba0d9ce: Link UP Dec 12 17:40:32.562672 systemd-networkd[1491]: cali7486ba0d9ce: Gained carrier Dec 12 17:40:32.595849 containerd[1898]: 2025-12-12 17:40:32.403 [INFO][4611] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 17:40:32.595849 containerd[1898]: 2025-12-12 17:40:32.466 [INFO][4611] 
cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-eth0 whisker-64584c4865- calico-system 9727898f-0ef8-4f55-b4ae-4e451f2586ba 914 0 2025-12-12 17:40:31 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:64584c4865 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459.2.2-a-260bc0236d whisker-64584c4865-wbzd7 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali7486ba0d9ce [] [] }} ContainerID="b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" Namespace="calico-system" Pod="whisker-64584c4865-wbzd7" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-" Dec 12 17:40:32.595849 containerd[1898]: 2025-12-12 17:40:32.466 [INFO][4611] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" Namespace="calico-system" Pod="whisker-64584c4865-wbzd7" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-eth0" Dec 12 17:40:32.595849 containerd[1898]: 2025-12-12 17:40:32.502 [INFO][4659] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" HandleID="k8s-pod-network.b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" Workload="ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-eth0" Dec 12 17:40:32.596033 containerd[1898]: 2025-12-12 17:40:32.502 [INFO][4659] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" HandleID="k8s-pod-network.b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" Workload="ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b800), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.2-a-260bc0236d", "pod":"whisker-64584c4865-wbzd7", "timestamp":"2025-12-12 17:40:32.502198578 +0000 UTC"}, Hostname:"ci-4459.2.2-a-260bc0236d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:40:32.596033 containerd[1898]: 2025-12-12 17:40:32.503 [INFO][4659] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:40:32.596033 containerd[1898]: 2025-12-12 17:40:32.503 [INFO][4659] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 17:40:32.596033 containerd[1898]: 2025-12-12 17:40:32.503 [INFO][4659] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-260bc0236d' Dec 12 17:40:32.596033 containerd[1898]: 2025-12-12 17:40:32.510 [INFO][4659] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:32.596033 containerd[1898]: 2025-12-12 17:40:32.514 [INFO][4659] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:32.596033 containerd[1898]: 2025-12-12 17:40:32.518 [INFO][4659] ipam/ipam.go 511: Trying affinity for 192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:32.596033 containerd[1898]: 2025-12-12 17:40:32.520 [INFO][4659] ipam/ipam.go 158: Attempting to load block cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:32.596033 containerd[1898]: 2025-12-12 17:40:32.522 [INFO][4659] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:32.596173 containerd[1898]: 2025-12-12 17:40:32.522 [INFO][4659] ipam/ipam.go 1219: 
Attempting to assign 1 addresses from block block=192.168.17.64/26 handle="k8s-pod-network.b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:32.596173 containerd[1898]: 2025-12-12 17:40:32.524 [INFO][4659] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa Dec 12 17:40:32.596173 containerd[1898]: 2025-12-12 17:40:32.529 [INFO][4659] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.17.64/26 handle="k8s-pod-network.b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:32.596173 containerd[1898]: 2025-12-12 17:40:32.548 [INFO][4659] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.17.65/26] block=192.168.17.64/26 handle="k8s-pod-network.b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:32.596173 containerd[1898]: 2025-12-12 17:40:32.548 [INFO][4659] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.17.65/26] handle="k8s-pod-network.b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:32.596173 containerd[1898]: 2025-12-12 17:40:32.548 [INFO][4659] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 17:40:32.596173 containerd[1898]: 2025-12-12 17:40:32.548 [INFO][4659] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.17.65/26] IPv6=[] ContainerID="b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" HandleID="k8s-pod-network.b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" Workload="ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-eth0" Dec 12 17:40:32.597080 containerd[1898]: 2025-12-12 17:40:32.552 [INFO][4611] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" Namespace="calico-system" Pod="whisker-64584c4865-wbzd7" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-eth0", GenerateName:"whisker-64584c4865-", Namespace:"calico-system", SelfLink:"", UID:"9727898f-0ef8-4f55-b4ae-4e451f2586ba", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"64584c4865", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"", Pod:"whisker-64584c4865-wbzd7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.17.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali7486ba0d9ce", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:32.597080 containerd[1898]: 2025-12-12 17:40:32.553 [INFO][4611] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.17.65/32] ContainerID="b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" Namespace="calico-system" Pod="whisker-64584c4865-wbzd7" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-eth0" Dec 12 17:40:32.597142 containerd[1898]: 2025-12-12 17:40:32.553 [INFO][4611] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7486ba0d9ce ContainerID="b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" Namespace="calico-system" Pod="whisker-64584c4865-wbzd7" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-eth0" Dec 12 17:40:32.597142 containerd[1898]: 2025-12-12 17:40:32.563 [INFO][4611] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" Namespace="calico-system" Pod="whisker-64584c4865-wbzd7" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-eth0" Dec 12 17:40:32.597173 containerd[1898]: 2025-12-12 17:40:32.565 [INFO][4611] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" Namespace="calico-system" Pod="whisker-64584c4865-wbzd7" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-eth0", GenerateName:"whisker-64584c4865-", Namespace:"calico-system", SelfLink:"", 
UID:"9727898f-0ef8-4f55-b4ae-4e451f2586ba", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"64584c4865", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa", Pod:"whisker-64584c4865-wbzd7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.17.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali7486ba0d9ce", MAC:"ae:d6:ec:a2:db:2d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:32.597206 containerd[1898]: 2025-12-12 17:40:32.590 [INFO][4611] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" Namespace="calico-system" Pod="whisker-64584c4865-wbzd7" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-whisker--64584c4865--wbzd7-eth0" Dec 12 17:40:32.650807 containerd[1898]: time="2025-12-12T17:40:32.650735027Z" level=info msg="connecting to shim b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa" address="unix:///run/containerd/s/b6c6636af22d1f966c18571c3b5da24de278e65e18b1affc035e730cb2f64e86" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:32.686380 systemd[1]: Started 
cri-containerd-b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa.scope - libcontainer container b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa. Dec 12 17:40:32.725653 containerd[1898]: time="2025-12-12T17:40:32.725594097Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64584c4865-wbzd7,Uid:9727898f-0ef8-4f55-b4ae-4e451f2586ba,Namespace:calico-system,Attempt:0,} returns sandbox id \"b6ea21f4e84b574e451ec124f6188c6f3eaeb9bb6c251fdcbd102b969e483daa\"" Dec 12 17:40:32.727308 containerd[1898]: time="2025-12-12T17:40:32.727259787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:40:32.825669 systemd-networkd[1491]: vxlan.calico: Link UP Dec 12 17:40:32.825678 systemd-networkd[1491]: vxlan.calico: Gained carrier Dec 12 17:40:33.022888 containerd[1898]: time="2025-12-12T17:40:33.022750583Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:33.027896 containerd[1898]: time="2025-12-12T17:40:33.027810720Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:40:33.028094 containerd[1898]: time="2025-12-12T17:40:33.028015143Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:40:33.028311 kubelet[3390]: E1212 17:40:33.028265 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:40:33.028402 kubelet[3390]: E1212 17:40:33.028325 3390 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:40:33.032630 kubelet[3390]: E1212 17:40:33.032579 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:bc07d3c9c9fe4f658351332e55d349eb,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q952d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
whisker-64584c4865-wbzd7_calico-system(9727898f-0ef8-4f55-b4ae-4e451f2586ba): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:33.034866 containerd[1898]: time="2025-12-12T17:40:33.034805596Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:40:33.315079 containerd[1898]: time="2025-12-12T17:40:33.314938552Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:33.318932 containerd[1898]: time="2025-12-12T17:40:33.318882562Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:40:33.319041 containerd[1898]: time="2025-12-12T17:40:33.318981605Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:40:33.319251 kubelet[3390]: E1212 17:40:33.319185 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:40:33.319947 kubelet[3390]: E1212 17:40:33.319603 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:40:33.320026 kubelet[3390]: E1212 17:40:33.319721 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q952d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevice
s:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64584c4865-wbzd7_calico-system(9727898f-0ef8-4f55-b4ae-4e451f2586ba): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:33.321329 kubelet[3390]: E1212 17:40:33.321275 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64584c4865-wbzd7" podUID="9727898f-0ef8-4f55-b4ae-4e451f2586ba" Dec 12 17:40:33.795062 containerd[1898]: time="2025-12-12T17:40:33.794778384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7984dd694b-72thk,Uid:0083021a-0d5d-42a9-b6e2-679318c1ae2e,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:40:33.796461 kubelet[3390]: I1212 17:40:33.796430 3390 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1960dd80-6c2f-452c-bbce-0ac57ff9095f" path="/var/lib/kubelet/pods/1960dd80-6c2f-452c-bbce-0ac57ff9095f/volumes" Dec 12 17:40:33.888826 systemd-networkd[1491]: calia3a9b829bee: Link UP Dec 12 17:40:33.888961 systemd-networkd[1491]: calia3a9b829bee: Gained 
carrier Dec 12 17:40:33.905018 containerd[1898]: 2025-12-12 17:40:33.829 [INFO][4842] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-eth0 calico-apiserver-7984dd694b- calico-apiserver 0083021a-0d5d-42a9-b6e2-679318c1ae2e 843 0 2025-12-12 17:40:05 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7984dd694b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.2-a-260bc0236d calico-apiserver-7984dd694b-72thk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia3a9b829bee [] [] }} ContainerID="77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-72thk" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-" Dec 12 17:40:33.905018 containerd[1898]: 2025-12-12 17:40:33.829 [INFO][4842] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-72thk" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-eth0" Dec 12 17:40:33.905018 containerd[1898]: 2025-12-12 17:40:33.849 [INFO][4853] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" HandleID="k8s-pod-network.77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" Workload="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-eth0" Dec 12 17:40:33.905209 containerd[1898]: 2025-12-12 17:40:33.849 [INFO][4853] ipam/ipam_plugin.go 275: Auto assigning IP 
ContainerID="77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" HandleID="k8s-pod-network.77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" Workload="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024afe0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459.2.2-a-260bc0236d", "pod":"calico-apiserver-7984dd694b-72thk", "timestamp":"2025-12-12 17:40:33.849508023 +0000 UTC"}, Hostname:"ci-4459.2.2-a-260bc0236d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:40:33.905209 containerd[1898]: 2025-12-12 17:40:33.849 [INFO][4853] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:40:33.905209 containerd[1898]: 2025-12-12 17:40:33.849 [INFO][4853] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:40:33.905209 containerd[1898]: 2025-12-12 17:40:33.849 [INFO][4853] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-260bc0236d' Dec 12 17:40:33.905209 containerd[1898]: 2025-12-12 17:40:33.855 [INFO][4853] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:33.905209 containerd[1898]: 2025-12-12 17:40:33.859 [INFO][4853] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:33.905209 containerd[1898]: 2025-12-12 17:40:33.863 [INFO][4853] ipam/ipam.go 511: Trying affinity for 192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:33.905209 containerd[1898]: 2025-12-12 17:40:33.865 [INFO][4853] ipam/ipam.go 158: Attempting to load block cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:33.905209 containerd[1898]: 2025-12-12 17:40:33.867 [INFO][4853] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:33.905362 containerd[1898]: 2025-12-12 17:40:33.867 [INFO][4853] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.17.64/26 handle="k8s-pod-network.77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:33.905362 containerd[1898]: 2025-12-12 17:40:33.868 [INFO][4853] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5 Dec 12 17:40:33.905362 containerd[1898]: 2025-12-12 17:40:33.874 [INFO][4853] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.17.64/26 handle="k8s-pod-network.77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:33.905362 containerd[1898]: 2025-12-12 17:40:33.883 [INFO][4853] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.17.66/26] block=192.168.17.64/26 handle="k8s-pod-network.77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:33.905362 containerd[1898]: 2025-12-12 17:40:33.884 [INFO][4853] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.17.66/26] handle="k8s-pod-network.77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:33.905362 containerd[1898]: 2025-12-12 17:40:33.884 [INFO][4853] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:40:33.905362 containerd[1898]: 2025-12-12 17:40:33.884 [INFO][4853] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.17.66/26] IPv6=[] ContainerID="77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" HandleID="k8s-pod-network.77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" Workload="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-eth0" Dec 12 17:40:33.905462 containerd[1898]: 2025-12-12 17:40:33.885 [INFO][4842] cni-plugin/k8s.go 418: Populated endpoint ContainerID="77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-72thk" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-eth0", GenerateName:"calico-apiserver-7984dd694b-", Namespace:"calico-apiserver", SelfLink:"", UID:"0083021a-0d5d-42a9-b6e2-679318c1ae2e", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"7984dd694b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"", Pod:"calico-apiserver-7984dd694b-72thk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.17.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia3a9b829bee", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:33.905496 containerd[1898]: 2025-12-12 17:40:33.885 [INFO][4842] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.17.66/32] ContainerID="77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-72thk" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-eth0" Dec 12 17:40:33.905496 containerd[1898]: 2025-12-12 17:40:33.885 [INFO][4842] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia3a9b829bee ContainerID="77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-72thk" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-eth0" Dec 12 17:40:33.905496 containerd[1898]: 2025-12-12 17:40:33.887 [INFO][4842] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-72thk" 
WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-eth0" Dec 12 17:40:33.905539 containerd[1898]: 2025-12-12 17:40:33.888 [INFO][4842] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-72thk" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-eth0", GenerateName:"calico-apiserver-7984dd694b-", Namespace:"calico-apiserver", SelfLink:"", UID:"0083021a-0d5d-42a9-b6e2-679318c1ae2e", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7984dd694b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5", Pod:"calico-apiserver-7984dd694b-72thk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.17.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia3a9b829bee", MAC:"3a:12:44:be:2c:d6", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:33.905571 containerd[1898]: 2025-12-12 17:40:33.901 [INFO][4842] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-72thk" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--72thk-eth0" Dec 12 17:40:33.938483 kubelet[3390]: E1212 17:40:33.938438 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64584c4865-wbzd7" podUID="9727898f-0ef8-4f55-b4ae-4e451f2586ba" Dec 12 17:40:33.950930 containerd[1898]: time="2025-12-12T17:40:33.950252452Z" level=info msg="connecting to shim 77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5" address="unix:///run/containerd/s/6b09b2ae81eca66377222a3ff5c0147ce2291add61018c1fd9a7f0025ce83bd4" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:33.973378 systemd[1]: Started cri-containerd-77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5.scope - 
libcontainer container 77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5. Dec 12 17:40:34.011842 containerd[1898]: time="2025-12-12T17:40:34.011804425Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7984dd694b-72thk,Uid:0083021a-0d5d-42a9-b6e2-679318c1ae2e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"77e88319d8dd0ac0e9993f13b0e8fa680a9eeffa0c73caec9bf5ab226befa6e5\"" Dec 12 17:40:34.013460 containerd[1898]: time="2025-12-12T17:40:34.013435810Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:40:34.267342 containerd[1898]: time="2025-12-12T17:40:34.267298304Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:34.270493 containerd[1898]: time="2025-12-12T17:40:34.270451094Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:40:34.270558 containerd[1898]: time="2025-12-12T17:40:34.270540914Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:40:34.270776 kubelet[3390]: E1212 17:40:34.270707 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:40:34.270776 kubelet[3390]: E1212 17:40:34.270758 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:40:34.271761 kubelet[3390]: E1212 17:40:34.271710 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s4xqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7984dd694b-72thk_calico-apiserver(0083021a-0d5d-42a9-b6e2-679318c1ae2e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:34.273028 kubelet[3390]: E1212 17:40:34.272874 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e" Dec 12 17:40:34.438393 systemd-networkd[1491]: cali7486ba0d9ce: Gained IPv6LL Dec 12 17:40:34.566396 systemd-networkd[1491]: vxlan.calico: Gained IPv6LL Dec 12 17:40:34.943348 kubelet[3390]: E1212 17:40:34.942027 
3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e" Dec 12 17:40:35.654340 systemd-networkd[1491]: calia3a9b829bee: Gained IPv6LL Dec 12 17:40:35.795150 containerd[1898]: time="2025-12-12T17:40:35.794852322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7984dd694b-tk5md,Uid:5d1301d2-72a1-41e5-ae8f-db3bb6f52314,Namespace:calico-apiserver,Attempt:0,}" Dec 12 17:40:35.795150 containerd[1898]: time="2025-12-12T17:40:35.794949229Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qxwtm,Uid:ae0ae75f-5f13-4f15-8f02-a60871a329ef,Namespace:kube-system,Attempt:0,}" Dec 12 17:40:35.795889 containerd[1898]: time="2025-12-12T17:40:35.795861125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6wqk8,Uid:f485e4eb-196e-4ab6-a695-2d4c1db5d278,Namespace:kube-system,Attempt:0,}" Dec 12 17:40:35.933597 systemd-networkd[1491]: cali07a8c759d31: Link UP Dec 12 17:40:35.934598 systemd-networkd[1491]: cali07a8c759d31: Gained carrier Dec 12 17:40:35.944999 kubelet[3390]: E1212 17:40:35.944966 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e" Dec 12 17:40:35.954945 containerd[1898]: 2025-12-12 17:40:35.851 [INFO][4924] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-eth0 calico-apiserver-7984dd694b- calico-apiserver 5d1301d2-72a1-41e5-ae8f-db3bb6f52314 833 0 2025-12-12 17:40:05 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7984dd694b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.2.2-a-260bc0236d calico-apiserver-7984dd694b-tk5md eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali07a8c759d31 [] [] }} ContainerID="767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-tk5md" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-" Dec 12 17:40:35.954945 containerd[1898]: 2025-12-12 17:40:35.852 [INFO][4924] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-tk5md" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-eth0" Dec 12 17:40:35.954945 containerd[1898]: 2025-12-12 17:40:35.880 [INFO][4961] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" HandleID="k8s-pod-network.767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" 
Workload="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-eth0" Dec 12 17:40:35.955660 containerd[1898]: 2025-12-12 17:40:35.880 [INFO][4961] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" HandleID="k8s-pod-network.767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" Workload="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3600), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459.2.2-a-260bc0236d", "pod":"calico-apiserver-7984dd694b-tk5md", "timestamp":"2025-12-12 17:40:35.880703239 +0000 UTC"}, Hostname:"ci-4459.2.2-a-260bc0236d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:40:35.955660 containerd[1898]: 2025-12-12 17:40:35.880 [INFO][4961] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:40:35.955660 containerd[1898]: 2025-12-12 17:40:35.880 [INFO][4961] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:40:35.955660 containerd[1898]: 2025-12-12 17:40:35.881 [INFO][4961] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-260bc0236d' Dec 12 17:40:35.955660 containerd[1898]: 2025-12-12 17:40:35.889 [INFO][4961] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:35.955660 containerd[1898]: 2025-12-12 17:40:35.895 [INFO][4961] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:35.955660 containerd[1898]: 2025-12-12 17:40:35.900 [INFO][4961] ipam/ipam.go 511: Trying affinity for 192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:35.955660 containerd[1898]: 2025-12-12 17:40:35.903 [INFO][4961] ipam/ipam.go 158: Attempting to load block cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:35.955660 containerd[1898]: 2025-12-12 17:40:35.905 [INFO][4961] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:35.956443 containerd[1898]: 2025-12-12 17:40:35.905 [INFO][4961] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.17.64/26 handle="k8s-pod-network.767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:35.956443 containerd[1898]: 2025-12-12 17:40:35.907 [INFO][4961] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9 Dec 12 17:40:35.956443 containerd[1898]: 2025-12-12 17:40:35.916 [INFO][4961] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.17.64/26 handle="k8s-pod-network.767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:35.956443 containerd[1898]: 2025-12-12 17:40:35.925 [INFO][4961] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.17.67/26] block=192.168.17.64/26 handle="k8s-pod-network.767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:35.956443 containerd[1898]: 2025-12-12 17:40:35.925 [INFO][4961] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.17.67/26] handle="k8s-pod-network.767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:35.956443 containerd[1898]: 2025-12-12 17:40:35.925 [INFO][4961] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:40:35.956443 containerd[1898]: 2025-12-12 17:40:35.925 [INFO][4961] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.17.67/26] IPv6=[] ContainerID="767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" HandleID="k8s-pod-network.767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" Workload="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-eth0" Dec 12 17:40:35.956550 containerd[1898]: 2025-12-12 17:40:35.928 [INFO][4924] cni-plugin/k8s.go 418: Populated endpoint ContainerID="767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-tk5md" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-eth0", GenerateName:"calico-apiserver-7984dd694b-", Namespace:"calico-apiserver", SelfLink:"", UID:"5d1301d2-72a1-41e5-ae8f-db3bb6f52314", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"7984dd694b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"", Pod:"calico-apiserver-7984dd694b-tk5md", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.17.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali07a8c759d31", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:35.956602 containerd[1898]: 2025-12-12 17:40:35.928 [INFO][4924] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.17.67/32] ContainerID="767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-tk5md" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-eth0" Dec 12 17:40:35.956602 containerd[1898]: 2025-12-12 17:40:35.928 [INFO][4924] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali07a8c759d31 ContainerID="767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-tk5md" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-eth0" Dec 12 17:40:35.956602 containerd[1898]: 2025-12-12 17:40:35.934 [INFO][4924] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-tk5md" 
WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-eth0" Dec 12 17:40:35.956655 containerd[1898]: 2025-12-12 17:40:35.935 [INFO][4924] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-tk5md" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-eth0", GenerateName:"calico-apiserver-7984dd694b-", Namespace:"calico-apiserver", SelfLink:"", UID:"5d1301d2-72a1-41e5-ae8f-db3bb6f52314", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7984dd694b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9", Pod:"calico-apiserver-7984dd694b-tk5md", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.17.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali07a8c759d31", MAC:"02:8f:a7:b0:fe:d6", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:35.956689 containerd[1898]: 2025-12-12 17:40:35.952 [INFO][4924] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" Namespace="calico-apiserver" Pod="calico-apiserver-7984dd694b-tk5md" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--apiserver--7984dd694b--tk5md-eth0" Dec 12 17:40:36.014251 containerd[1898]: time="2025-12-12T17:40:36.013861960Z" level=info msg="connecting to shim 767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9" address="unix:///run/containerd/s/208082fab756cdb20a6c704b091a72abede663c5792cc401c1cadc97f8694019" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:36.038370 systemd[1]: Started cri-containerd-767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9.scope - libcontainer container 767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9. 
Dec 12 17:40:36.052654 systemd-networkd[1491]: cali312e673309b: Link UP Dec 12 17:40:36.053573 systemd-networkd[1491]: cali312e673309b: Gained carrier Dec 12 17:40:36.073024 containerd[1898]: 2025-12-12 17:40:35.869 [INFO][4943] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-eth0 coredns-668d6bf9bc- kube-system f485e4eb-196e-4ab6-a695-2d4c1db5d278 842 0 2025-12-12 17:39:56 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.2-a-260bc0236d coredns-668d6bf9bc-6wqk8 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali312e673309b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" Namespace="kube-system" Pod="coredns-668d6bf9bc-6wqk8" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-" Dec 12 17:40:36.073024 containerd[1898]: 2025-12-12 17:40:35.869 [INFO][4943] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" Namespace="kube-system" Pod="coredns-668d6bf9bc-6wqk8" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-eth0" Dec 12 17:40:36.073024 containerd[1898]: 2025-12-12 17:40:35.902 [INFO][4969] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" HandleID="k8s-pod-network.beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" Workload="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-eth0" Dec 12 17:40:36.073204 containerd[1898]: 2025-12-12 17:40:35.903 [INFO][4969] ipam/ipam_plugin.go 275: Auto assigning IP 
ContainerID="beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" HandleID="k8s-pod-network.beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" Workload="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb8b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.2-a-260bc0236d", "pod":"coredns-668d6bf9bc-6wqk8", "timestamp":"2025-12-12 17:40:35.902924287 +0000 UTC"}, Hostname:"ci-4459.2.2-a-260bc0236d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:40:36.073204 containerd[1898]: 2025-12-12 17:40:35.903 [INFO][4969] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:40:36.073204 containerd[1898]: 2025-12-12 17:40:35.926 [INFO][4969] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:40:36.073204 containerd[1898]: 2025-12-12 17:40:35.926 [INFO][4969] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-260bc0236d' Dec 12 17:40:36.073204 containerd[1898]: 2025-12-12 17:40:35.989 [INFO][4969] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.073204 containerd[1898]: 2025-12-12 17:40:35.996 [INFO][4969] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.073204 containerd[1898]: 2025-12-12 17:40:36.002 [INFO][4969] ipam/ipam.go 511: Trying affinity for 192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.073204 containerd[1898]: 2025-12-12 17:40:36.022 [INFO][4969] ipam/ipam.go 158: Attempting to load block cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.073204 containerd[1898]: 2025-12-12 17:40:36.024 [INFO][4969] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.073359 containerd[1898]: 2025-12-12 17:40:36.024 [INFO][4969] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.17.64/26 handle="k8s-pod-network.beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.073359 containerd[1898]: 2025-12-12 17:40:36.026 [INFO][4969] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5 Dec 12 17:40:36.073359 containerd[1898]: 2025-12-12 17:40:36.036 [INFO][4969] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.17.64/26 handle="k8s-pod-network.beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.073359 containerd[1898]: 2025-12-12 17:40:36.042 [INFO][4969] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.17.68/26] block=192.168.17.64/26 handle="k8s-pod-network.beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.073359 containerd[1898]: 2025-12-12 17:40:36.042 [INFO][4969] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.17.68/26] handle="k8s-pod-network.beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.073359 containerd[1898]: 2025-12-12 17:40:36.043 [INFO][4969] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:40:36.073359 containerd[1898]: 2025-12-12 17:40:36.043 [INFO][4969] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.17.68/26] IPv6=[] ContainerID="beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" HandleID="k8s-pod-network.beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" Workload="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-eth0" Dec 12 17:40:36.073459 containerd[1898]: 2025-12-12 17:40:36.047 [INFO][4943] cni-plugin/k8s.go 418: Populated endpoint ContainerID="beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" Namespace="kube-system" Pod="coredns-668d6bf9bc-6wqk8" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f485e4eb-196e-4ab6-a695-2d4c1db5d278", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 39, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"", Pod:"coredns-668d6bf9bc-6wqk8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.17.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali312e673309b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:36.073459 containerd[1898]: 2025-12-12 17:40:36.048 [INFO][4943] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.17.68/32] ContainerID="beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" Namespace="kube-system" Pod="coredns-668d6bf9bc-6wqk8" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-eth0" Dec 12 17:40:36.073459 containerd[1898]: 2025-12-12 17:40:36.048 [INFO][4943] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali312e673309b ContainerID="beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" Namespace="kube-system" Pod="coredns-668d6bf9bc-6wqk8" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-eth0" Dec 12 17:40:36.073459 containerd[1898]: 2025-12-12 17:40:36.055 [INFO][4943] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" Namespace="kube-system" Pod="coredns-668d6bf9bc-6wqk8" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-eth0" Dec 12 17:40:36.073459 containerd[1898]: 2025-12-12 17:40:36.056 [INFO][4943] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" Namespace="kube-system" Pod="coredns-668d6bf9bc-6wqk8" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f485e4eb-196e-4ab6-a695-2d4c1db5d278", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 39, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5", Pod:"coredns-668d6bf9bc-6wqk8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.17.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali312e673309b", 
MAC:"92:77:f5:70:a7:f1", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:36.073459 containerd[1898]: 2025-12-12 17:40:36.069 [INFO][4943] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" Namespace="kube-system" Pod="coredns-668d6bf9bc-6wqk8" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--6wqk8-eth0" Dec 12 17:40:36.104647 containerd[1898]: time="2025-12-12T17:40:36.104599927Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7984dd694b-tk5md,Uid:5d1301d2-72a1-41e5-ae8f-db3bb6f52314,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"767b523cbe49a65ce44e08cf29534eebe4b1a59f44fb338cb486ba80c059fbb9\"" Dec 12 17:40:36.107459 containerd[1898]: time="2025-12-12T17:40:36.107345943Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:40:36.130369 containerd[1898]: time="2025-12-12T17:40:36.130326970Z" level=info msg="connecting to shim beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5" address="unix:///run/containerd/s/69620c6ef60e15beb2eec6a4beedea8609b45a863948a35eebe9cdda031dc3c3" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:36.146969 systemd-networkd[1491]: cali193899fde43: Link UP Dec 12 17:40:36.147092 systemd-networkd[1491]: cali193899fde43: Gained carrier Dec 12 17:40:36.160357 systemd[1]: Started 
cri-containerd-beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5.scope - libcontainer container beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5. Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:35.878 [INFO][4935] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-eth0 coredns-668d6bf9bc- kube-system ae0ae75f-5f13-4f15-8f02-a60871a329ef 844 0 2025-12-12 17:39:56 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.2.2-a-260bc0236d coredns-668d6bf9bc-qxwtm eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali193899fde43 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" Namespace="kube-system" Pod="coredns-668d6bf9bc-qxwtm" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-" Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:35.878 [INFO][4935] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" Namespace="kube-system" Pod="coredns-668d6bf9bc-qxwtm" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-eth0" Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:35.912 [INFO][4975] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" HandleID="k8s-pod-network.7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" Workload="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-eth0" Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:35.912 [INFO][4975] ipam/ipam_plugin.go 275: Auto assigning IP 
ContainerID="7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" HandleID="k8s-pod-network.7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" Workload="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002bafe0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.2.2-a-260bc0236d", "pod":"coredns-668d6bf9bc-qxwtm", "timestamp":"2025-12-12 17:40:35.912113967 +0000 UTC"}, Hostname:"ci-4459.2.2-a-260bc0236d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:35.912 [INFO][4975] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.043 [INFO][4975] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.044 [INFO][4975] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-260bc0236d' Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.091 [INFO][4975] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.106 [INFO][4975] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.111 [INFO][4975] ipam/ipam.go 511: Trying affinity for 192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.113 [INFO][4975] ipam/ipam.go 158: Attempting to load block cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.115 [INFO][4975] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.115 [INFO][4975] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.17.64/26 handle="k8s-pod-network.7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.116 [INFO][4975] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6 Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.126 [INFO][4975] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.17.64/26 handle="k8s-pod-network.7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.138 [INFO][4975] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.17.69/26] block=192.168.17.64/26 handle="k8s-pod-network.7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.138 [INFO][4975] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.17.69/26] handle="k8s-pod-network.7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.139 [INFO][4975] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:40:36.168235 containerd[1898]: 2025-12-12 17:40:36.139 [INFO][4975] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.17.69/26] IPv6=[] ContainerID="7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" HandleID="k8s-pod-network.7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" Workload="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-eth0" Dec 12 17:40:36.168645 containerd[1898]: 2025-12-12 17:40:36.143 [INFO][4935] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" Namespace="kube-system" Pod="coredns-668d6bf9bc-qxwtm" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ae0ae75f-5f13-4f15-8f02-a60871a329ef", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 39, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"", Pod:"coredns-668d6bf9bc-qxwtm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.17.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali193899fde43", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:36.168645 containerd[1898]: 2025-12-12 17:40:36.144 [INFO][4935] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.17.69/32] ContainerID="7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" Namespace="kube-system" Pod="coredns-668d6bf9bc-qxwtm" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-eth0" Dec 12 17:40:36.168645 containerd[1898]: 2025-12-12 17:40:36.144 [INFO][4935] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali193899fde43 ContainerID="7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" Namespace="kube-system" Pod="coredns-668d6bf9bc-qxwtm" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-eth0" Dec 12 17:40:36.168645 containerd[1898]: 2025-12-12 17:40:36.146 [INFO][4935] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" Namespace="kube-system" Pod="coredns-668d6bf9bc-qxwtm" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-eth0" Dec 12 17:40:36.168645 containerd[1898]: 2025-12-12 17:40:36.148 [INFO][4935] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" Namespace="kube-system" Pod="coredns-668d6bf9bc-qxwtm" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ae0ae75f-5f13-4f15-8f02-a60871a329ef", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 39, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6", Pod:"coredns-668d6bf9bc-qxwtm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.17.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali193899fde43", 
MAC:"8a:05:90:6d:41:94", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:36.168645 containerd[1898]: 2025-12-12 17:40:36.163 [INFO][4935] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" Namespace="kube-system" Pod="coredns-668d6bf9bc-qxwtm" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-coredns--668d6bf9bc--qxwtm-eth0" Dec 12 17:40:36.209790 containerd[1898]: time="2025-12-12T17:40:36.208774468Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6wqk8,Uid:f485e4eb-196e-4ab6-a695-2d4c1db5d278,Namespace:kube-system,Attempt:0,} returns sandbox id \"beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5\"" Dec 12 17:40:36.213141 containerd[1898]: time="2025-12-12T17:40:36.213020465Z" level=info msg="CreateContainer within sandbox \"beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 17:40:36.225778 containerd[1898]: time="2025-12-12T17:40:36.225726876Z" level=info msg="connecting to shim 7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6" address="unix:///run/containerd/s/e469adcb2e70327af4918189908958962b517b2ea184d07e20343122f6814d70" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:36.234569 containerd[1898]: time="2025-12-12T17:40:36.234532688Z" level=info msg="Container d2efae125cbb3bc71116ddb89568d42555e2fd46864988b8dfe8ac429c5c75b4: CDI 
devices from CRI Config.CDIDevices: []" Dec 12 17:40:36.245556 systemd[1]: Started cri-containerd-7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6.scope - libcontainer container 7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6. Dec 12 17:40:36.250741 containerd[1898]: time="2025-12-12T17:40:36.250663683Z" level=info msg="CreateContainer within sandbox \"beb3fe6e0e2815ac222e8495fd83cc4a0dc059dc77aa67ad1e44874fb012d2f5\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d2efae125cbb3bc71116ddb89568d42555e2fd46864988b8dfe8ac429c5c75b4\"" Dec 12 17:40:36.252372 containerd[1898]: time="2025-12-12T17:40:36.252340405Z" level=info msg="StartContainer for \"d2efae125cbb3bc71116ddb89568d42555e2fd46864988b8dfe8ac429c5c75b4\"" Dec 12 17:40:36.252978 containerd[1898]: time="2025-12-12T17:40:36.252949427Z" level=info msg="connecting to shim d2efae125cbb3bc71116ddb89568d42555e2fd46864988b8dfe8ac429c5c75b4" address="unix:///run/containerd/s/69620c6ef60e15beb2eec6a4beedea8609b45a863948a35eebe9cdda031dc3c3" protocol=ttrpc version=3 Dec 12 17:40:36.272458 systemd[1]: Started cri-containerd-d2efae125cbb3bc71116ddb89568d42555e2fd46864988b8dfe8ac429c5c75b4.scope - libcontainer container d2efae125cbb3bc71116ddb89568d42555e2fd46864988b8dfe8ac429c5c75b4. 
Dec 12 17:40:36.285637 containerd[1898]: time="2025-12-12T17:40:36.285600759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-qxwtm,Uid:ae0ae75f-5f13-4f15-8f02-a60871a329ef,Namespace:kube-system,Attempt:0,} returns sandbox id \"7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6\"" Dec 12 17:40:36.290198 containerd[1898]: time="2025-12-12T17:40:36.290154613Z" level=info msg="CreateContainer within sandbox \"7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 17:40:36.306580 containerd[1898]: time="2025-12-12T17:40:36.306551714Z" level=info msg="StartContainer for \"d2efae125cbb3bc71116ddb89568d42555e2fd46864988b8dfe8ac429c5c75b4\" returns successfully" Dec 12 17:40:36.312010 containerd[1898]: time="2025-12-12T17:40:36.311925205Z" level=info msg="Container c06bc4a48c8dd5a5eb8250b32fad3bb457b31fb57dffd7572bf3b2727dd836bc: CDI devices from CRI Config.CDIDevices: []" Dec 12 17:40:36.326392 containerd[1898]: time="2025-12-12T17:40:36.326358277Z" level=info msg="CreateContainer within sandbox \"7908175f964383eb8b1759a8e6c84a2946b6dc1178b29842aff7c4a8527168e6\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c06bc4a48c8dd5a5eb8250b32fad3bb457b31fb57dffd7572bf3b2727dd836bc\"" Dec 12 17:40:36.327741 containerd[1898]: time="2025-12-12T17:40:36.326944578Z" level=info msg="StartContainer for \"c06bc4a48c8dd5a5eb8250b32fad3bb457b31fb57dffd7572bf3b2727dd836bc\"" Dec 12 17:40:36.328276 containerd[1898]: time="2025-12-12T17:40:36.328253232Z" level=info msg="connecting to shim c06bc4a48c8dd5a5eb8250b32fad3bb457b31fb57dffd7572bf3b2727dd836bc" address="unix:///run/containerd/s/e469adcb2e70327af4918189908958962b517b2ea184d07e20343122f6814d70" protocol=ttrpc version=3 Dec 12 17:40:36.346881 containerd[1898]: time="2025-12-12T17:40:36.346851313Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:36.350812 
containerd[1898]: time="2025-12-12T17:40:36.350075193Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:40:36.350812 containerd[1898]: time="2025-12-12T17:40:36.350168525Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:40:36.351714 kubelet[3390]: E1212 17:40:36.350967 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:40:36.351714 kubelet[3390]: E1212 17:40:36.351019 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:40:36.351714 kubelet[3390]: E1212 17:40:36.351125 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2sgzd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7984dd694b-tk5md_calico-apiserver(5d1301d2-72a1-41e5-ae8f-db3bb6f52314): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:36.353622 kubelet[3390]: E1212 17:40:36.353594 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314" Dec 12 17:40:36.358374 systemd[1]: Started cri-containerd-c06bc4a48c8dd5a5eb8250b32fad3bb457b31fb57dffd7572bf3b2727dd836bc.scope - libcontainer container c06bc4a48c8dd5a5eb8250b32fad3bb457b31fb57dffd7572bf3b2727dd836bc. Dec 12 17:40:36.399023 containerd[1898]: time="2025-12-12T17:40:36.398903082Z" level=info msg="StartContainer for \"c06bc4a48c8dd5a5eb8250b32fad3bb457b31fb57dffd7572bf3b2727dd836bc\" returns successfully" Dec 12 17:40:36.951401 kubelet[3390]: E1212 17:40:36.950708 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314" Dec 12 17:40:36.983298 kubelet[3390]: I1212 17:40:36.983243 3390 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/coredns-668d6bf9bc-6wqk8" podStartSLOduration=40.983214249 podStartE2EDuration="40.983214249s" podCreationTimestamp="2025-12-12 17:39:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:40:36.98236738 +0000 UTC m=+47.257015196" watchObservedRunningTime="2025-12-12 17:40:36.983214249 +0000 UTC m=+47.257862065" Dec 12 17:40:36.998397 systemd-networkd[1491]: cali07a8c759d31: Gained IPv6LL Dec 12 17:40:37.702414 systemd-networkd[1491]: cali193899fde43: Gained IPv6LL Dec 12 17:40:37.766467 systemd-networkd[1491]: cali312e673309b: Gained IPv6LL Dec 12 17:40:37.960450 kubelet[3390]: E1212 17:40:37.960289 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314" Dec 12 17:40:37.984706 kubelet[3390]: I1212 17:40:37.984485 3390 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-qxwtm" podStartSLOduration=41.984470277 podStartE2EDuration="41.984470277s" podCreationTimestamp="2025-12-12 17:39:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 17:40:37.003072567 +0000 UTC m=+47.277720391" watchObservedRunningTime="2025-12-12 17:40:37.984470277 +0000 UTC m=+48.259118101" Dec 12 17:40:38.795152 containerd[1898]: time="2025-12-12T17:40:38.795064722Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-5fdd79f47b-hk8h4,Uid:674dddc6-fb43-422e-85c0-76f7f4bb018f,Namespace:calico-system,Attempt:0,}" Dec 12 17:40:38.795994 containerd[1898]: time="2025-12-12T17:40:38.795065554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ccgmd,Uid:c62a98c7-a503-42c2-845c-ea1022fbba96,Namespace:calico-system,Attempt:0,}" Dec 12 17:40:38.919671 systemd-networkd[1491]: cali04a682fa476: Link UP Dec 12 17:40:38.921027 systemd-networkd[1491]: cali04a682fa476: Gained carrier Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.846 [INFO][5236] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-eth0 csi-node-driver- calico-system c62a98c7-a503-42c2-845c-ea1022fbba96 723 0 2025-12-12 17:40:11 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459.2.2-a-260bc0236d csi-node-driver-ccgmd eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali04a682fa476 [] [] }} ContainerID="b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" Namespace="calico-system" Pod="csi-node-driver-ccgmd" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-" Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.846 [INFO][5236] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" Namespace="calico-system" Pod="csi-node-driver-ccgmd" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-eth0" Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.868 [INFO][5249] ipam/ipam_plugin.go 227: 
Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" HandleID="k8s-pod-network.b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" Workload="ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-eth0" Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.868 [INFO][5249] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" HandleID="k8s-pod-network.b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" Workload="ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d36d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.2-a-260bc0236d", "pod":"csi-node-driver-ccgmd", "timestamp":"2025-12-12 17:40:38.868373042 +0000 UTC"}, Hostname:"ci-4459.2.2-a-260bc0236d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.868 [INFO][5249] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.868 [INFO][5249] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.868 [INFO][5249] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-260bc0236d' Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.877 [INFO][5249] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.882 [INFO][5249] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.886 [INFO][5249] ipam/ipam.go 511: Trying affinity for 192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.887 [INFO][5249] ipam/ipam.go 158: Attempting to load block cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.889 [INFO][5249] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.889 [INFO][5249] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.17.64/26 handle="k8s-pod-network.b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.890 [INFO][5249] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.896 [INFO][5249] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.17.64/26 handle="k8s-pod-network.b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.910 [INFO][5249] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.17.70/26] block=192.168.17.64/26 handle="k8s-pod-network.b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.910 [INFO][5249] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.17.70/26] handle="k8s-pod-network.b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.910 [INFO][5249] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:40:38.940987 containerd[1898]: 2025-12-12 17:40:38.910 [INFO][5249] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.17.70/26] IPv6=[] ContainerID="b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" HandleID="k8s-pod-network.b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" Workload="ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-eth0" Dec 12 17:40:38.941575 containerd[1898]: 2025-12-12 17:40:38.914 [INFO][5236] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" Namespace="calico-system" Pod="csi-node-driver-ccgmd" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c62a98c7-a503-42c2-845c-ea1022fbba96", ResourceVersion:"723", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"", Pod:"csi-node-driver-ccgmd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.17.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali04a682fa476", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:38.941575 containerd[1898]: 2025-12-12 17:40:38.914 [INFO][5236] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.17.70/32] ContainerID="b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" Namespace="calico-system" Pod="csi-node-driver-ccgmd" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-eth0" Dec 12 17:40:38.941575 containerd[1898]: 2025-12-12 17:40:38.914 [INFO][5236] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali04a682fa476 ContainerID="b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" Namespace="calico-system" Pod="csi-node-driver-ccgmd" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-eth0" Dec 12 17:40:38.941575 containerd[1898]: 2025-12-12 17:40:38.921 [INFO][5236] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" Namespace="calico-system" Pod="csi-node-driver-ccgmd" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-eth0" Dec 12 17:40:38.941575 
containerd[1898]: 2025-12-12 17:40:38.922 [INFO][5236] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" Namespace="calico-system" Pod="csi-node-driver-ccgmd" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c62a98c7-a503-42c2-845c-ea1022fbba96", ResourceVersion:"723", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a", Pod:"csi-node-driver-ccgmd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.17.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali04a682fa476", MAC:"32:5e:e0:74:08:5b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:38.941575 containerd[1898]: 
2025-12-12 17:40:38.939 [INFO][5236] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" Namespace="calico-system" Pod="csi-node-driver-ccgmd" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-csi--node--driver--ccgmd-eth0" Dec 12 17:40:38.997924 containerd[1898]: time="2025-12-12T17:40:38.997406348Z" level=info msg="connecting to shim b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a" address="unix:///run/containerd/s/08287eb6455f9a8fe6c5ba5734c749607ecbb76d05952eca62b9a4b4198924cb" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:39.059163 systemd-networkd[1491]: cali0a0a8c04562: Link UP Dec 12 17:40:39.060998 systemd-networkd[1491]: cali0a0a8c04562: Gained carrier Dec 12 17:40:39.062948 systemd[1]: Started cri-containerd-b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a.scope - libcontainer container b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a. 
Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:38.844 [INFO][5226] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-eth0 calico-kube-controllers-5fdd79f47b- calico-system 674dddc6-fb43-422e-85c0-76f7f4bb018f 838 0 2025-12-12 17:40:11 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5fdd79f47b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459.2.2-a-260bc0236d calico-kube-controllers-5fdd79f47b-hk8h4 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali0a0a8c04562 [] [] }} ContainerID="f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" Namespace="calico-system" Pod="calico-kube-controllers-5fdd79f47b-hk8h4" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-" Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:38.845 [INFO][5226] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" Namespace="calico-system" Pod="calico-kube-controllers-5fdd79f47b-hk8h4" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-eth0" Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:38.876 [INFO][5252] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" HandleID="k8s-pod-network.f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" Workload="ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-eth0" Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:38.876 [INFO][5252] ipam/ipam_plugin.go 275: 
Auto assigning IP ContainerID="f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" HandleID="k8s-pod-network.f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" Workload="ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b200), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.2-a-260bc0236d", "pod":"calico-kube-controllers-5fdd79f47b-hk8h4", "timestamp":"2025-12-12 17:40:38.876794801 +0000 UTC"}, Hostname:"ci-4459.2.2-a-260bc0236d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:38.877 [INFO][5252] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:38.910 [INFO][5252] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:38.910 [INFO][5252] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-260bc0236d' Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:38.979 [INFO][5252] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:38.990 [INFO][5252] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:39.000 [INFO][5252] ipam/ipam.go 511: Trying affinity for 192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:39.009 [INFO][5252] ipam/ipam.go 158: Attempting to load block cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:39.017 [INFO][5252] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:39.018 [INFO][5252] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.17.64/26 handle="k8s-pod-network.f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:39.020 [INFO][5252] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55 Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:39.041 [INFO][5252] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.17.64/26 handle="k8s-pod-network.f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:39.052 [INFO][5252] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.17.71/26] block=192.168.17.64/26 handle="k8s-pod-network.f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:39.052 [INFO][5252] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.17.71/26] handle="k8s-pod-network.f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:39.052 [INFO][5252] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:40:39.084359 containerd[1898]: 2025-12-12 17:40:39.052 [INFO][5252] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.17.71/26] IPv6=[] ContainerID="f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" HandleID="k8s-pod-network.f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" Workload="ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-eth0" Dec 12 17:40:39.086300 containerd[1898]: 2025-12-12 17:40:39.056 [INFO][5226] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" Namespace="calico-system" Pod="calico-kube-controllers-5fdd79f47b-hk8h4" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-eth0", GenerateName:"calico-kube-controllers-5fdd79f47b-", Namespace:"calico-system", SelfLink:"", UID:"674dddc6-fb43-422e-85c0-76f7f4bb018f", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5fdd79f47b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"", Pod:"calico-kube-controllers-5fdd79f47b-hk8h4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.17.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0a0a8c04562", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:39.086300 containerd[1898]: 2025-12-12 17:40:39.056 [INFO][5226] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.17.71/32] ContainerID="f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" Namespace="calico-system" Pod="calico-kube-controllers-5fdd79f47b-hk8h4" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-eth0" Dec 12 17:40:39.086300 containerd[1898]: 2025-12-12 17:40:39.056 [INFO][5226] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0a0a8c04562 ContainerID="f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" Namespace="calico-system" Pod="calico-kube-controllers-5fdd79f47b-hk8h4" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-eth0" Dec 12 17:40:39.086300 containerd[1898]: 2025-12-12 17:40:39.062 [INFO][5226] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" Namespace="calico-system" Pod="calico-kube-controllers-5fdd79f47b-hk8h4" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-eth0" Dec 12 17:40:39.086300 containerd[1898]: 2025-12-12 17:40:39.062 [INFO][5226] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" Namespace="calico-system" Pod="calico-kube-controllers-5fdd79f47b-hk8h4" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-eth0", GenerateName:"calico-kube-controllers-5fdd79f47b-", Namespace:"calico-system", SelfLink:"", UID:"674dddc6-fb43-422e-85c0-76f7f4bb018f", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5fdd79f47b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55", Pod:"calico-kube-controllers-5fdd79f47b-hk8h4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.17.71/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0a0a8c04562", MAC:"ea:c2:3d:5b:c1:84", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:39.086300 containerd[1898]: 2025-12-12 17:40:39.080 [INFO][5226] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" Namespace="calico-system" Pod="calico-kube-controllers-5fdd79f47b-hk8h4" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-calico--kube--controllers--5fdd79f47b--hk8h4-eth0" Dec 12 17:40:39.111976 containerd[1898]: time="2025-12-12T17:40:39.111936297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ccgmd,Uid:c62a98c7-a503-42c2-845c-ea1022fbba96,Namespace:calico-system,Attempt:0,} returns sandbox id \"b8c1a84320e99cd46d70cbb869b3735d02d2fe5eed7e87a79a1584e9de99ee2a\"" Dec 12 17:40:39.113244 containerd[1898]: time="2025-12-12T17:40:39.113203548Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:40:39.129137 containerd[1898]: time="2025-12-12T17:40:39.129006342Z" level=info msg="connecting to shim f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55" address="unix:///run/containerd/s/3bed8f8f0230719286dffbd72f5802b2d2aad5485361dfab8763a6dffb90f07d" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:39.151371 systemd[1]: Started cri-containerd-f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55.scope - libcontainer container f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55. 
Dec 12 17:40:39.190128 containerd[1898]: time="2025-12-12T17:40:39.190071638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5fdd79f47b-hk8h4,Uid:674dddc6-fb43-422e-85c0-76f7f4bb018f,Namespace:calico-system,Attempt:0,} returns sandbox id \"f126b1369577a304380c0ef4e190a0e4522b4f7bd78e01bf328f24d625c25a55\"" Dec 12 17:40:39.391779 containerd[1898]: time="2025-12-12T17:40:39.391655878Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:39.395214 containerd[1898]: time="2025-12-12T17:40:39.395128709Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:40:39.395417 containerd[1898]: time="2025-12-12T17:40:39.395202223Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:40:39.395681 kubelet[3390]: E1212 17:40:39.395582 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:40:39.395681 kubelet[3390]: E1212 17:40:39.395638 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:40:39.396275 kubelet[3390]: E1212 17:40:39.395824 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8t2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ccgmd_calico-system(c62a98c7-a503-42c2-845c-ea1022fbba96): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:39.396703 containerd[1898]: time="2025-12-12T17:40:39.396618847Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:40:39.654572 containerd[1898]: time="2025-12-12T17:40:39.654309135Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:39.657987 containerd[1898]: time="2025-12-12T17:40:39.657881377Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:40:39.657987 containerd[1898]: time="2025-12-12T17:40:39.657920698Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:40:39.658304 kubelet[3390]: E1212 17:40:39.658256 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:40:39.658553 kubelet[3390]: E1212 17:40:39.658309 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 
17:40:39.658553 kubelet[3390]: E1212 17:40:39.658499 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6td9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5fdd79f47b-hk8h4_calico-system(674dddc6-fb43-422e-85c0-76f7f4bb018f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:39.659026 containerd[1898]: time="2025-12-12T17:40:39.658988606Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:40:39.660468 kubelet[3390]: E1212 17:40:39.660428 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f" Dec 12 17:40:39.795611 
containerd[1898]: time="2025-12-12T17:40:39.795477982Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-lwtn5,Uid:1a665283-3e69-4f2e-9c9b-8aae93e17ef9,Namespace:calico-system,Attempt:0,}" Dec 12 17:40:39.895443 systemd-networkd[1491]: cali1108b48c3ea: Link UP Dec 12 17:40:39.896010 systemd-networkd[1491]: cali1108b48c3ea: Gained carrier Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.830 [INFO][5376] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-eth0 goldmane-666569f655- calico-system 1a665283-3e69-4f2e-9c9b-8aae93e17ef9 841 0 2025-12-12 17:40:09 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459.2.2-a-260bc0236d goldmane-666569f655-lwtn5 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali1108b48c3ea [] [] }} ContainerID="a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" Namespace="calico-system" Pod="goldmane-666569f655-lwtn5" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-" Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.830 [INFO][5376] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" Namespace="calico-system" Pod="goldmane-666569f655-lwtn5" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-eth0" Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.853 [INFO][5389] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" HandleID="k8s-pod-network.a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" 
Workload="ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-eth0" Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.853 [INFO][5389] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" HandleID="k8s-pod-network.a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" Workload="ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2fe0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.2.2-a-260bc0236d", "pod":"goldmane-666569f655-lwtn5", "timestamp":"2025-12-12 17:40:39.853049791 +0000 UTC"}, Hostname:"ci-4459.2.2-a-260bc0236d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.853 [INFO][5389] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.853 [INFO][5389] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.853 [INFO][5389] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.2.2-a-260bc0236d' Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.862 [INFO][5389] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.866 [INFO][5389] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.870 [INFO][5389] ipam/ipam.go 511: Trying affinity for 192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.871 [INFO][5389] ipam/ipam.go 158: Attempting to load block cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.873 [INFO][5389] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.17.64/26 host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.873 [INFO][5389] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.17.64/26 handle="k8s-pod-network.a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.875 [INFO][5389] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044 Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.880 [INFO][5389] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.17.64/26 handle="k8s-pod-network.a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.890 [INFO][5389] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.17.72/26] block=192.168.17.64/26 handle="k8s-pod-network.a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.890 [INFO][5389] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.17.72/26] handle="k8s-pod-network.a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" host="ci-4459.2.2-a-260bc0236d" Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.890 [INFO][5389] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 17:40:39.911886 containerd[1898]: 2025-12-12 17:40:39.890 [INFO][5389] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.17.72/26] IPv6=[] ContainerID="a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" HandleID="k8s-pod-network.a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" Workload="ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-eth0" Dec 12 17:40:39.912307 containerd[1898]: 2025-12-12 17:40:39.892 [INFO][5376] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" Namespace="calico-system" Pod="goldmane-666569f655-lwtn5" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"1a665283-3e69-4f2e-9c9b-8aae93e17ef9", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"", Pod:"goldmane-666569f655-lwtn5", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.17.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali1108b48c3ea", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:39.912307 containerd[1898]: 2025-12-12 17:40:39.892 [INFO][5376] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.17.72/32] ContainerID="a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" Namespace="calico-system" Pod="goldmane-666569f655-lwtn5" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-eth0" Dec 12 17:40:39.912307 containerd[1898]: 2025-12-12 17:40:39.892 [INFO][5376] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1108b48c3ea ContainerID="a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" Namespace="calico-system" Pod="goldmane-666569f655-lwtn5" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-eth0" Dec 12 17:40:39.912307 containerd[1898]: 2025-12-12 17:40:39.894 [INFO][5376] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" Namespace="calico-system" Pod="goldmane-666569f655-lwtn5" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-eth0" Dec 12 17:40:39.912307 containerd[1898]: 2025-12-12 17:40:39.895 [INFO][5376] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" Namespace="calico-system" Pod="goldmane-666569f655-lwtn5" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"1a665283-3e69-4f2e-9c9b-8aae93e17ef9", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 17, 40, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.2.2-a-260bc0236d", ContainerID:"a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044", Pod:"goldmane-666569f655-lwtn5", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.17.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali1108b48c3ea", MAC:"7a:3d:9c:90:8b:b6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 17:40:39.912307 containerd[1898]: 2025-12-12 17:40:39.907 [INFO][5376] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" Namespace="calico-system" Pod="goldmane-666569f655-lwtn5" WorkloadEndpoint="ci--4459.2.2--a--260bc0236d-k8s-goldmane--666569f655--lwtn5-eth0" Dec 12 17:40:39.923822 containerd[1898]: time="2025-12-12T17:40:39.923781167Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:39.930396 containerd[1898]: time="2025-12-12T17:40:39.930355591Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:40:39.930475 containerd[1898]: time="2025-12-12T17:40:39.930421578Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:40:39.932072 kubelet[3390]: E1212 17:40:39.930576 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:40:39.932072 kubelet[3390]: E1212 17:40:39.930613 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:40:39.932072 kubelet[3390]: E1212 17:40:39.930773 3390 
kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8t2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-ccgmd_calico-system(c62a98c7-a503-42c2-845c-ea1022fbba96): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:39.932740 kubelet[3390]: E1212 17:40:39.932705 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 12 17:40:39.958658 containerd[1898]: time="2025-12-12T17:40:39.958578641Z" level=info msg="connecting to shim a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044" address="unix:///run/containerd/s/5dfe374e800aea353c64590000199e1e2e4a8a155ca41c538cc47807d4da5821" namespace=k8s.io protocol=ttrpc version=3 Dec 12 17:40:39.974487 kubelet[3390]: E1212 17:40:39.974311 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve 
reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 12 17:40:39.977613 kubelet[3390]: E1212 17:40:39.977064 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f" Dec 12 17:40:39.983358 systemd[1]: Started cri-containerd-a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044.scope - libcontainer container a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044. 
Dec 12 17:40:40.033965 containerd[1898]: time="2025-12-12T17:40:40.033905758Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-lwtn5,Uid:1a665283-3e69-4f2e-9c9b-8aae93e17ef9,Namespace:calico-system,Attempt:0,} returns sandbox id \"a3aa1a10909d22e8e6728139a879c7a81abc2051b9bc2093f7451ece5de66044\"" Dec 12 17:40:40.037834 containerd[1898]: time="2025-12-12T17:40:40.037803555Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:40:40.134371 systemd-networkd[1491]: cali0a0a8c04562: Gained IPv6LL Dec 12 17:40:40.418147 containerd[1898]: time="2025-12-12T17:40:40.418102466Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:40.422440 containerd[1898]: time="2025-12-12T17:40:40.422398164Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:40:40.422501 containerd[1898]: time="2025-12-12T17:40:40.422492695Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:40:40.422695 kubelet[3390]: E1212 17:40:40.422642 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:40:40.423061 kubelet[3390]: E1212 17:40:40.422703 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:40:40.423061 kubelet[3390]: E1212 17:40:40.422817 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sc8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-lwtn5_calico-system(1a665283-3e69-4f2e-9c9b-8aae93e17ef9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:40.424031 kubelet[3390]: E1212 17:40:40.423975 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-lwtn5" podUID="1a665283-3e69-4f2e-9c9b-8aae93e17ef9" Dec 12 
17:40:40.518345 systemd-networkd[1491]: cali04a682fa476: Gained IPv6LL Dec 12 17:40:40.981313 kubelet[3390]: E1212 17:40:40.981208 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-lwtn5" podUID="1a665283-3e69-4f2e-9c9b-8aae93e17ef9" Dec 12 17:40:40.982305 kubelet[3390]: E1212 17:40:40.982174 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f" Dec 12 17:40:40.982305 kubelet[3390]: E1212 17:40:40.982271 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 12 17:40:41.606339 systemd-networkd[1491]: cali1108b48c3ea: Gained IPv6LL Dec 12 17:40:41.981893 kubelet[3390]: E1212 17:40:41.981780 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-lwtn5" podUID="1a665283-3e69-4f2e-9c9b-8aae93e17ef9" Dec 12 17:40:46.796710 containerd[1898]: time="2025-12-12T17:40:46.796666499Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:40:47.079513 containerd[1898]: time="2025-12-12T17:40:47.079308942Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:47.082338 containerd[1898]: time="2025-12-12T17:40:47.082271553Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:40:47.082338 containerd[1898]: time="2025-12-12T17:40:47.082312170Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: 
active requests=0, bytes read=73" Dec 12 17:40:47.082541 kubelet[3390]: E1212 17:40:47.082500 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:40:47.082789 kubelet[3390]: E1212 17:40:47.082556 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:40:47.082789 kubelet[3390]: E1212 17:40:47.082651 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:bc07d3c9c9fe4f658351332e55d349eb,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q952d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*
true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64584c4865-wbzd7_calico-system(9727898f-0ef8-4f55-b4ae-4e451f2586ba): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:47.084932 containerd[1898]: time="2025-12-12T17:40:47.084904265Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:40:47.360377 containerd[1898]: time="2025-12-12T17:40:47.360254520Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:47.363450 containerd[1898]: time="2025-12-12T17:40:47.363359464Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:40:47.363450 containerd[1898]: time="2025-12-12T17:40:47.363414074Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:40:47.363609 kubelet[3390]: E1212 17:40:47.363566 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:40:47.363652 kubelet[3390]: E1212 17:40:47.363614 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:40:47.363758 kubelet[3390]: E1212 17:40:47.363726 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q952d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Ca
pabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64584c4865-wbzd7_calico-system(9727898f-0ef8-4f55-b4ae-4e451f2586ba): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:47.365616 kubelet[3390]: E1212 17:40:47.365486 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64584c4865-wbzd7" podUID="9727898f-0ef8-4f55-b4ae-4e451f2586ba" Dec 12 17:40:49.796040 containerd[1898]: time="2025-12-12T17:40:49.795801941Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:40:50.040742 containerd[1898]: 
time="2025-12-12T17:40:50.040625249Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:50.043659 containerd[1898]: time="2025-12-12T17:40:50.043621885Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:40:50.043722 containerd[1898]: time="2025-12-12T17:40:50.043705912Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:40:50.044120 kubelet[3390]: E1212 17:40:50.043878 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:40:50.044120 kubelet[3390]: E1212 17:40:50.043934 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:40:50.044120 kubelet[3390]: E1212 17:40:50.044047 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2sgzd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7984dd694b-tk5md_calico-apiserver(5d1301d2-72a1-41e5-ae8f-db3bb6f52314): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:50.045486 kubelet[3390]: E1212 17:40:50.045451 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314" Dec 12 17:40:50.796001 containerd[1898]: time="2025-12-12T17:40:50.795896788Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:40:51.085019 containerd[1898]: time="2025-12-12T17:40:51.084876387Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:51.087941 containerd[1898]: time="2025-12-12T17:40:51.087895296Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:40:51.088004 containerd[1898]: time="2025-12-12T17:40:51.087983867Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:40:51.088229 kubelet[3390]: E1212 17:40:51.088130 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:40:51.088229 kubelet[3390]: E1212 17:40:51.088187 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:40:51.088819 kubelet[3390]: E1212 17:40:51.088752 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s4xqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7984dd694b-72thk_calico-apiserver(0083021a-0d5d-42a9-b6e2-679318c1ae2e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:51.089998 kubelet[3390]: E1212 17:40:51.089925 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e" Dec 12 17:40:52.796254 containerd[1898]: time="2025-12-12T17:40:52.795662618Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:40:53.084320 containerd[1898]: 
time="2025-12-12T17:40:53.084040965Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:53.088264 containerd[1898]: time="2025-12-12T17:40:53.088205016Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:40:53.088335 containerd[1898]: time="2025-12-12T17:40:53.088313860Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:40:53.089122 kubelet[3390]: E1212 17:40:53.088632 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:40:53.089122 kubelet[3390]: E1212 17:40:53.088688 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:40:53.089122 kubelet[3390]: E1212 17:40:53.088790 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6td9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5fdd79f47b-hk8h4_calico-system(674dddc6-fb43-422e-85c0-76f7f4bb018f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:53.090271 kubelet[3390]: E1212 17:40:53.090241 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f" Dec 12 17:40:55.796344 containerd[1898]: time="2025-12-12T17:40:55.796268842Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:40:56.073735 containerd[1898]: 
time="2025-12-12T17:40:56.073476554Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:56.076821 containerd[1898]: time="2025-12-12T17:40:56.076763940Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:40:56.077018 containerd[1898]: time="2025-12-12T17:40:56.076771996Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:40:56.077048 kubelet[3390]: E1212 17:40:56.076990 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:40:56.077048 kubelet[3390]: E1212 17:40:56.077036 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:40:56.077964 kubelet[3390]: E1212 17:40:56.077304 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sc8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-lwtn5_calico-system(1a665283-3e69-4f2e-9c9b-8aae93e17ef9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:56.078082 containerd[1898]: time="2025-12-12T17:40:56.077307662Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:40:56.079324 kubelet[3390]: E1212 17:40:56.079236 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-lwtn5" podUID="1a665283-3e69-4f2e-9c9b-8aae93e17ef9" Dec 12 17:40:56.347316 containerd[1898]: time="2025-12-12T17:40:56.347180433Z" level=info msg="fetch failed after status: 404 Not 
Found" host=ghcr.io Dec 12 17:40:56.350263 containerd[1898]: time="2025-12-12T17:40:56.350196825Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:40:56.350336 containerd[1898]: time="2025-12-12T17:40:56.350298917Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:40:56.350505 kubelet[3390]: E1212 17:40:56.350463 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:40:56.350552 kubelet[3390]: E1212 17:40:56.350520 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:40:56.350922 kubelet[3390]: E1212 17:40:56.350638 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8t2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ccgmd_calico-system(c62a98c7-a503-42c2-845c-ea1022fbba96): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:56.353323 containerd[1898]: time="2025-12-12T17:40:56.353301221Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:40:56.609501 containerd[1898]: time="2025-12-12T17:40:56.609372171Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:40:56.612495 containerd[1898]: time="2025-12-12T17:40:56.612449333Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:40:56.612556 containerd[1898]: time="2025-12-12T17:40:56.612533304Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:40:56.612871 kubelet[3390]: E1212 17:40:56.612662 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:40:56.612871 kubelet[3390]: E1212 17:40:56.612722 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:40:56.612871 kubelet[3390]: E1212 
17:40:56.612829 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8t2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-ccgmd_calico-system(c62a98c7-a503-42c2-845c-ea1022fbba96): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:40:56.613995 kubelet[3390]: E1212 17:40:56.613961 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 12 17:40:58.802427 kubelet[3390]: E1212 17:40:58.802373 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64584c4865-wbzd7" podUID="9727898f-0ef8-4f55-b4ae-4e451f2586ba" Dec 12 17:41:01.796548 kubelet[3390]: E1212 17:41:01.796257 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314" Dec 12 17:41:03.797256 kubelet[3390]: E1212 17:41:03.797173 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e" Dec 12 17:41:05.795661 kubelet[3390]: E1212 17:41:05.794937 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f" Dec 12 17:41:07.798269 kubelet[3390]: E1212 17:41:07.797061 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-lwtn5" podUID="1a665283-3e69-4f2e-9c9b-8aae93e17ef9" Dec 12 17:41:08.798082 kubelet[3390]: E1212 17:41:08.798018 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 
12 17:41:10.795515 containerd[1898]: time="2025-12-12T17:41:10.795099777Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:41:11.096370 containerd[1898]: time="2025-12-12T17:41:11.096211160Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:11.099372 containerd[1898]: time="2025-12-12T17:41:11.099317084Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:41:11.099497 containerd[1898]: time="2025-12-12T17:41:11.099413887Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:41:11.099678 kubelet[3390]: E1212 17:41:11.099590 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:41:11.099678 kubelet[3390]: E1212 17:41:11.099658 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:41:11.101640 kubelet[3390]: E1212 17:41:11.099775 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:bc07d3c9c9fe4f658351332e55d349eb,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q952d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64584c4865-wbzd7_calico-system(9727898f-0ef8-4f55-b4ae-4e451f2586ba): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:11.103541 containerd[1898]: time="2025-12-12T17:41:11.103514525Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 
17:41:11.357302 containerd[1898]: time="2025-12-12T17:41:11.357044182Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:11.360307 containerd[1898]: time="2025-12-12T17:41:11.360169034Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:41:11.360307 containerd[1898]: time="2025-12-12T17:41:11.360240061Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:41:11.361360 kubelet[3390]: E1212 17:41:11.361313 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:41:11.361450 kubelet[3390]: E1212 17:41:11.361368 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:41:11.361485 kubelet[3390]: E1212 17:41:11.361456 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q952d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64584c4865-wbzd7_calico-system(9727898f-0ef8-4f55-b4ae-4e451f2586ba): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:11.362850 kubelet[3390]: E1212 17:41:11.362817 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64584c4865-wbzd7" podUID="9727898f-0ef8-4f55-b4ae-4e451f2586ba" Dec 12 17:41:13.796573 containerd[1898]: time="2025-12-12T17:41:13.795917750Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:41:14.057344 containerd[1898]: time="2025-12-12T17:41:14.057201915Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:14.061028 containerd[1898]: time="2025-12-12T17:41:14.060939373Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:41:14.061028 containerd[1898]: time="2025-12-12T17:41:14.060993910Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:41:14.061178 
kubelet[3390]: E1212 17:41:14.061144 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:41:14.061506 kubelet[3390]: E1212 17:41:14.061186 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:41:14.061506 kubelet[3390]: E1212 17:41:14.061302 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2sgzd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7984dd694b-tk5md_calico-apiserver(5d1301d2-72a1-41e5-ae8f-db3bb6f52314): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:14.062435 kubelet[3390]: E1212 17:41:14.062398 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314" Dec 12 17:41:16.796492 containerd[1898]: time="2025-12-12T17:41:16.796385990Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:41:17.057479 containerd[1898]: time="2025-12-12T17:41:17.055084034Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:17.059520 containerd[1898]: time="2025-12-12T17:41:17.059478578Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:41:17.059611 containerd[1898]: time="2025-12-12T17:41:17.059554757Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:41:17.059745 kubelet[3390]: E1212 17:41:17.059684 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:41:17.060127 kubelet[3390]: E1212 17:41:17.059744 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:41:17.060127 kubelet[3390]: E1212 17:41:17.059852 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s4xqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7984dd694b-72thk_calico-apiserver(0083021a-0d5d-42a9-b6e2-679318c1ae2e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:17.061314 kubelet[3390]: E1212 17:41:17.061280 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e" Dec 12 17:41:19.798736 containerd[1898]: time="2025-12-12T17:41:19.798650813Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:41:20.055442 containerd[1898]: 
time="2025-12-12T17:41:20.055329823Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:20.058672 containerd[1898]: time="2025-12-12T17:41:20.058637832Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:41:20.059257 containerd[1898]: time="2025-12-12T17:41:20.058745044Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:41:20.059347 kubelet[3390]: E1212 17:41:20.058918 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:41:20.059347 kubelet[3390]: E1212 17:41:20.058956 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:41:20.059347 kubelet[3390]: E1212 17:41:20.059108 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8t2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ccgmd_calico-system(c62a98c7-a503-42c2-845c-ea1022fbba96): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:20.059686 containerd[1898]: time="2025-12-12T17:41:20.059388122Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:41:20.340999 containerd[1898]: time="2025-12-12T17:41:20.340867439Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:20.345047 containerd[1898]: time="2025-12-12T17:41:20.344958235Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:41:20.345047 containerd[1898]: time="2025-12-12T17:41:20.345011485Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:41:20.345212 kubelet[3390]: E1212 17:41:20.345165 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:41:20.345272 kubelet[3390]: E1212 17:41:20.345233 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:41:20.345500 kubelet[3390]: E1212 17:41:20.345461 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sc8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-lwtn5_calico-system(1a665283-3e69-4f2e-9c9b-8aae93e17ef9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:20.346447 containerd[1898]: time="2025-12-12T17:41:20.346422573Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:41:20.346693 kubelet[3390]: E1212 17:41:20.346662 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-lwtn5" podUID="1a665283-3e69-4f2e-9c9b-8aae93e17ef9" Dec 12 17:41:20.612466 containerd[1898]: time="2025-12-12T17:41:20.612333636Z" level=info msg="fetch failed after 
status: 404 Not Found" host=ghcr.io Dec 12 17:41:20.615463 containerd[1898]: time="2025-12-12T17:41:20.615416894Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:41:20.615620 containerd[1898]: time="2025-12-12T17:41:20.615448543Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:41:20.615804 kubelet[3390]: E1212 17:41:20.615767 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:41:20.615897 kubelet[3390]: E1212 17:41:20.615884 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:41:20.616198 kubelet[3390]: E1212 17:41:20.616140 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6td9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5fdd79f47b-hk8h4_calico-system(674dddc6-fb43-422e-85c0-76f7f4bb018f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:20.616798 containerd[1898]: time="2025-12-12T17:41:20.616681873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:41:20.617648 kubelet[3390]: E1212 17:41:20.617614 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f" Dec 12 17:41:20.924504 
containerd[1898]: time="2025-12-12T17:41:20.924377017Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:20.929692 containerd[1898]: time="2025-12-12T17:41:20.929624309Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:41:20.929820 containerd[1898]: time="2025-12-12T17:41:20.929708032Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:41:20.930258 kubelet[3390]: E1212 17:41:20.929848 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:41:20.930258 kubelet[3390]: E1212 17:41:20.929907 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:41:20.930258 kubelet[3390]: E1212 17:41:20.930005 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8t2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ccgmd_calico-system(c62a98c7-a503-42c2-845c-ea1022fbba96): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:20.931446 kubelet[3390]: E1212 17:41:20.931413 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 12 17:41:24.796170 kubelet[3390]: E1212 17:41:24.795889 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" 
pod="calico-system/whisker-64584c4865-wbzd7" podUID="9727898f-0ef8-4f55-b4ae-4e451f2586ba" Dec 12 17:41:26.795425 kubelet[3390]: E1212 17:41:26.795377 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314" Dec 12 17:41:29.796920 kubelet[3390]: E1212 17:41:29.796852 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e" Dec 12 17:41:31.798017 kubelet[3390]: E1212 17:41:31.797810 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 12 17:41:32.796281 kubelet[3390]: E1212 17:41:32.795857 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f" Dec 12 17:41:32.796889 kubelet[3390]: E1212 17:41:32.796846 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-lwtn5" podUID="1a665283-3e69-4f2e-9c9b-8aae93e17ef9" Dec 12 17:41:35.843974 systemd[1]: Started sshd@7-10.200.20.10:22-10.200.16.10:40946.service - OpenSSH per-connection server daemon (10.200.16.10:40946). 
Dec 12 17:41:36.348248 sshd[5548]: Accepted publickey for core from 10.200.16.10 port 40946 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:41:36.350964 sshd-session[5548]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:41:36.358930 systemd-logind[1875]: New session 10 of user core. Dec 12 17:41:36.366499 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 12 17:41:36.785003 sshd[5551]: Connection closed by 10.200.16.10 port 40946 Dec 12 17:41:36.785642 sshd-session[5548]: pam_unix(sshd:session): session closed for user core Dec 12 17:41:36.789241 systemd[1]: sshd@7-10.200.20.10:22-10.200.16.10:40946.service: Deactivated successfully. Dec 12 17:41:36.789525 systemd-logind[1875]: Session 10 logged out. Waiting for processes to exit. Dec 12 17:41:36.791094 systemd[1]: session-10.scope: Deactivated successfully. Dec 12 17:41:36.792902 systemd-logind[1875]: Removed session 10. Dec 12 17:41:38.797738 kubelet[3390]: E1212 17:41:38.797617 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64584c4865-wbzd7" 
podUID="9727898f-0ef8-4f55-b4ae-4e451f2586ba" Dec 12 17:41:39.803427 kubelet[3390]: E1212 17:41:39.803318 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314" Dec 12 17:41:41.880363 systemd[1]: Started sshd@8-10.200.20.10:22-10.200.16.10:45870.service - OpenSSH per-connection server daemon (10.200.16.10:45870). Dec 12 17:41:42.376873 sshd[5565]: Accepted publickey for core from 10.200.16.10 port 45870 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:41:42.378015 sshd-session[5565]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:41:42.385051 systemd-logind[1875]: New session 11 of user core. Dec 12 17:41:42.391043 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 12 17:41:42.764669 sshd[5568]: Connection closed by 10.200.16.10 port 45870 Dec 12 17:41:42.765332 sshd-session[5565]: pam_unix(sshd:session): session closed for user core Dec 12 17:41:42.768758 systemd-logind[1875]: Session 11 logged out. Waiting for processes to exit. Dec 12 17:41:42.769495 systemd[1]: sshd@8-10.200.20.10:22-10.200.16.10:45870.service: Deactivated successfully. Dec 12 17:41:42.772664 systemd[1]: session-11.scope: Deactivated successfully. Dec 12 17:41:42.774610 systemd-logind[1875]: Removed session 11. 
Dec 12 17:41:42.796006 kubelet[3390]: E1212 17:41:42.795913 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e" Dec 12 17:41:43.797329 kubelet[3390]: E1212 17:41:43.797214 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 12 17:41:43.798378 kubelet[3390]: E1212 17:41:43.798344 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f" Dec 12 17:41:45.796543 kubelet[3390]: E1212 17:41:45.796464 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-lwtn5" podUID="1a665283-3e69-4f2e-9c9b-8aae93e17ef9" Dec 12 17:41:47.867204 systemd[1]: Started sshd@9-10.200.20.10:22-10.200.16.10:45872.service - OpenSSH per-connection server daemon (10.200.16.10:45872). Dec 12 17:41:48.360231 sshd[5581]: Accepted publickey for core from 10.200.16.10 port 45872 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:41:48.360995 sshd-session[5581]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:41:48.367899 systemd-logind[1875]: New session 12 of user core. Dec 12 17:41:48.372316 systemd[1]: Started session-12.scope - Session 12 of User core. Dec 12 17:41:48.777127 sshd[5584]: Connection closed by 10.200.16.10 port 45872 Dec 12 17:41:48.778002 sshd-session[5581]: pam_unix(sshd:session): session closed for user core Dec 12 17:41:48.782873 systemd[1]: sshd@9-10.200.20.10:22-10.200.16.10:45872.service: Deactivated successfully. Dec 12 17:41:48.785604 systemd[1]: session-12.scope: Deactivated successfully. 
Dec 12 17:41:48.789067 systemd-logind[1875]: Session 12 logged out. Waiting for processes to exit. Dec 12 17:41:48.790383 systemd-logind[1875]: Removed session 12. Dec 12 17:41:48.869020 systemd[1]: Started sshd@10-10.200.20.10:22-10.200.16.10:45874.service - OpenSSH per-connection server daemon (10.200.16.10:45874). Dec 12 17:41:49.356165 sshd[5596]: Accepted publickey for core from 10.200.16.10 port 45874 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:41:49.357299 sshd-session[5596]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:41:49.362473 systemd-logind[1875]: New session 13 of user core. Dec 12 17:41:49.367372 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 12 17:41:49.771256 sshd[5599]: Connection closed by 10.200.16.10 port 45874 Dec 12 17:41:49.771983 sshd-session[5596]: pam_unix(sshd:session): session closed for user core Dec 12 17:41:49.775861 systemd-logind[1875]: Session 13 logged out. Waiting for processes to exit. Dec 12 17:41:49.776397 systemd[1]: sshd@10-10.200.20.10:22-10.200.16.10:45874.service: Deactivated successfully. Dec 12 17:41:49.778085 systemd[1]: session-13.scope: Deactivated successfully. Dec 12 17:41:49.780200 systemd-logind[1875]: Removed session 13. Dec 12 17:41:49.862360 systemd[1]: Started sshd@11-10.200.20.10:22-10.200.16.10:45884.service - OpenSSH per-connection server daemon (10.200.16.10:45884). Dec 12 17:41:50.322524 sshd[5610]: Accepted publickey for core from 10.200.16.10 port 45884 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:41:50.324259 sshd-session[5610]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:41:50.330265 systemd-logind[1875]: New session 14 of user core. Dec 12 17:41:50.335368 systemd[1]: Started session-14.scope - Session 14 of User core. 
Dec 12 17:41:50.714293 sshd[5613]: Connection closed by 10.200.16.10 port 45884 Dec 12 17:41:50.715358 sshd-session[5610]: pam_unix(sshd:session): session closed for user core Dec 12 17:41:50.718679 systemd[1]: sshd@11-10.200.20.10:22-10.200.16.10:45884.service: Deactivated successfully. Dec 12 17:41:50.721589 systemd[1]: session-14.scope: Deactivated successfully. Dec 12 17:41:50.722441 systemd-logind[1875]: Session 14 logged out. Waiting for processes to exit. Dec 12 17:41:50.723968 systemd-logind[1875]: Removed session 14. Dec 12 17:41:51.795886 containerd[1898]: time="2025-12-12T17:41:51.795799466Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 17:41:52.103351 containerd[1898]: time="2025-12-12T17:41:52.103209169Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:52.106282 containerd[1898]: time="2025-12-12T17:41:52.106239218Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 17:41:52.106387 containerd[1898]: time="2025-12-12T17:41:52.106237634Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 17:41:52.106678 kubelet[3390]: E1212 17:41:52.106476 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:41:52.106678 kubelet[3390]: E1212 17:41:52.106528 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 17:41:52.106678 kubelet[3390]: E1212 17:41:52.106635 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:bc07d3c9c9fe4f658351332e55d349eb,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q952d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-64584c4865-wbzd7_calico-system(9727898f-0ef8-4f55-b4ae-4e451f2586ba): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:52.108779 containerd[1898]: time="2025-12-12T17:41:52.108697646Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 17:41:52.360238 containerd[1898]: time="2025-12-12T17:41:52.357668859Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:41:52.363561 containerd[1898]: time="2025-12-12T17:41:52.363439402Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 17:41:52.363561 containerd[1898]: time="2025-12-12T17:41:52.363468843Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 17:41:52.363699 kubelet[3390]: E1212 17:41:52.363665 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 17:41:52.363730 kubelet[3390]: E1212 17:41:52.363712 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 
17:41:52.363842 kubelet[3390]: E1212 17:41:52.363809 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q952d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
whisker-64584c4865-wbzd7_calico-system(9727898f-0ef8-4f55-b4ae-4e451f2586ba): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 17:41:52.365145 kubelet[3390]: E1212 17:41:52.365099 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64584c4865-wbzd7" podUID="9727898f-0ef8-4f55-b4ae-4e451f2586ba" Dec 12 17:41:52.795454 kubelet[3390]: E1212 17:41:52.795370 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314" Dec 12 17:41:55.796137 kubelet[3390]: E1212 17:41:55.795966 3390 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f" Dec 12 17:41:55.797114 kubelet[3390]: E1212 17:41:55.796750 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 12 17:41:55.803948 systemd[1]: Started sshd@12-10.200.20.10:22-10.200.16.10:45468.service - OpenSSH per-connection server daemon (10.200.16.10:45468). 
Dec 12 17:41:56.296981 sshd[5634]: Accepted publickey for core from 10.200.16.10 port 45468 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:41:56.298450 sshd-session[5634]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:41:56.302784 systemd-logind[1875]: New session 15 of user core. Dec 12 17:41:56.308513 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 12 17:41:56.692828 sshd[5637]: Connection closed by 10.200.16.10 port 45468 Dec 12 17:41:56.694590 sshd-session[5634]: pam_unix(sshd:session): session closed for user core Dec 12 17:41:56.698544 systemd[1]: session-15.scope: Deactivated successfully. Dec 12 17:41:56.699591 systemd[1]: sshd@12-10.200.20.10:22-10.200.16.10:45468.service: Deactivated successfully. Dec 12 17:41:56.706206 systemd-logind[1875]: Session 15 logged out. Waiting for processes to exit. Dec 12 17:41:56.708047 systemd-logind[1875]: Removed session 15. Dec 12 17:41:56.796332 kubelet[3390]: E1212 17:41:56.796102 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e" Dec 12 17:41:58.795237 kubelet[3390]: E1212 17:41:58.795181 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-lwtn5" podUID="1a665283-3e69-4f2e-9c9b-8aae93e17ef9" Dec 12 17:42:01.773958 systemd[1]: Started sshd@13-10.200.20.10:22-10.200.16.10:47392.service - OpenSSH per-connection server daemon (10.200.16.10:47392). Dec 12 17:42:02.225207 sshd[5654]: Accepted publickey for core from 10.200.16.10 port 47392 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:42:02.225999 sshd-session[5654]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:42:02.229893 systemd-logind[1875]: New session 16 of user core. Dec 12 17:42:02.236352 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 12 17:42:02.615003 sshd[5657]: Connection closed by 10.200.16.10 port 47392 Dec 12 17:42:02.615354 sshd-session[5654]: pam_unix(sshd:session): session closed for user core Dec 12 17:42:02.620668 systemd[1]: sshd@13-10.200.20.10:22-10.200.16.10:47392.service: Deactivated successfully. Dec 12 17:42:02.623860 systemd[1]: session-16.scope: Deactivated successfully. Dec 12 17:42:02.625043 systemd-logind[1875]: Session 16 logged out. Waiting for processes to exit. Dec 12 17:42:02.627449 systemd-logind[1875]: Removed session 16. 
Dec 12 17:42:03.795732 containerd[1898]: time="2025-12-12T17:42:03.795460636Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:42:04.077606 containerd[1898]: time="2025-12-12T17:42:04.077354462Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:04.080489 containerd[1898]: time="2025-12-12T17:42:04.080399518Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:42:04.080489 containerd[1898]: time="2025-12-12T17:42:04.080448728Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:42:04.080743 kubelet[3390]: E1212 17:42:04.080695 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:42:04.081405 kubelet[3390]: E1212 17:42:04.080748 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:42:04.081405 kubelet[3390]: E1212 17:42:04.080855 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 
--tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2sgzd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
calico-apiserver-7984dd694b-tk5md_calico-apiserver(5d1301d2-72a1-41e5-ae8f-db3bb6f52314): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:04.082645 kubelet[3390]: E1212 17:42:04.082608 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314" Dec 12 17:42:06.795675 kubelet[3390]: E1212 17:42:06.795627 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64584c4865-wbzd7" podUID="9727898f-0ef8-4f55-b4ae-4e451f2586ba" Dec 12 17:42:07.704418 systemd[1]: 
Started sshd@14-10.200.20.10:22-10.200.16.10:47402.service - OpenSSH per-connection server daemon (10.200.16.10:47402). Dec 12 17:42:07.796104 containerd[1898]: time="2025-12-12T17:42:07.796068231Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 17:42:08.075495 containerd[1898]: time="2025-12-12T17:42:08.075199877Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:08.079241 containerd[1898]: time="2025-12-12T17:42:08.078679036Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 17:42:08.079241 containerd[1898]: time="2025-12-12T17:42:08.078735502Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 17:42:08.080333 kubelet[3390]: E1212 17:42:08.080297 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:42:08.081084 kubelet[3390]: E1212 17:42:08.080766 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 17:42:08.081210 kubelet[3390]: E1212 17:42:08.081182 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8t2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ccgmd_calico-system(c62a98c7-a503-42c2-845c-ea1022fbba96): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:08.083421 containerd[1898]: time="2025-12-12T17:42:08.083394309Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 17:42:08.196035 sshd[5714]: Accepted publickey for core from 10.200.16.10 port 47402 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:42:08.215164 sshd-session[5714]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:42:08.219190 systemd-logind[1875]: New session 17 of user core. Dec 12 17:42:08.226404 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 12 17:42:08.392561 containerd[1898]: time="2025-12-12T17:42:08.392352462Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:08.396214 containerd[1898]: time="2025-12-12T17:42:08.396169873Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 17:42:08.396507 containerd[1898]: time="2025-12-12T17:42:08.396260724Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 17:42:08.396553 kubelet[3390]: E1212 17:42:08.396524 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:42:08.396633 kubelet[3390]: E1212 17:42:08.396567 3390 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 17:42:08.396764 kubelet[3390]: E1212 17:42:08.396656 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8t2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPri
vilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-ccgmd_calico-system(c62a98c7-a503-42c2-845c-ea1022fbba96): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:08.399341 kubelet[3390]: E1212 17:42:08.399307 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96" Dec 12 17:42:08.599274 sshd[5717]: Connection closed by 10.200.16.10 port 47402 Dec 12 17:42:08.599892 sshd-session[5714]: pam_unix(sshd:session): session closed for user core Dec 12 17:42:08.603747 systemd-logind[1875]: Session 17 logged out. Waiting for processes to exit. 
Dec 12 17:42:08.604057 systemd[1]: sshd@14-10.200.20.10:22-10.200.16.10:47402.service: Deactivated successfully. Dec 12 17:42:08.606141 systemd[1]: session-17.scope: Deactivated successfully. Dec 12 17:42:08.608109 systemd-logind[1875]: Removed session 17. Dec 12 17:42:08.796077 containerd[1898]: time="2025-12-12T17:42:08.795931111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 17:42:09.062313 containerd[1898]: time="2025-12-12T17:42:09.061355961Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:09.064624 containerd[1898]: time="2025-12-12T17:42:09.064508157Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 17:42:09.064624 containerd[1898]: time="2025-12-12T17:42:09.064599064Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 17:42:09.064780 kubelet[3390]: E1212 17:42:09.064734 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:42:09.064846 kubelet[3390]: E1212 17:42:09.064790 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: 
not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 17:42:09.064921 kubelet[3390]: E1212 17:42:09.064886 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6td9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5fdd79f47b-hk8h4_calico-system(674dddc6-fb43-422e-85c0-76f7f4bb018f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:09.066348 kubelet[3390]: E1212 17:42:09.066274 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f" Dec 12 17:42:09.797753 containerd[1898]: time="2025-12-12T17:42:09.797677735Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 17:42:10.068968 containerd[1898]: 
time="2025-12-12T17:42:10.068835541Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:10.072334 containerd[1898]: time="2025-12-12T17:42:10.072279459Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 17:42:10.072433 containerd[1898]: time="2025-12-12T17:42:10.072383190Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 17:42:10.074446 kubelet[3390]: E1212 17:42:10.074399 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:42:10.074714 kubelet[3390]: E1212 17:42:10.074464 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 17:42:10.074714 kubelet[3390]: E1212 17:42:10.074659 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s4xqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7984dd694b-72thk_calico-apiserver(0083021a-0d5d-42a9-b6e2-679318c1ae2e): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:10.075310 containerd[1898]: time="2025-12-12T17:42:10.075267313Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 17:42:10.076379 kubelet[3390]: E1212 17:42:10.076344 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e" Dec 12 17:42:10.377190 containerd[1898]: time="2025-12-12T17:42:10.376927721Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 17:42:10.380592 containerd[1898]: time="2025-12-12T17:42:10.380468722Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 17:42:10.380592 containerd[1898]: time="2025-12-12T17:42:10.380560965Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 17:42:10.380741 kubelet[3390]: E1212 17:42:10.380677 3390 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:42:10.380741 kubelet[3390]: E1212 17:42:10.380728 3390 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 17:42:10.380875 kubelet[3390]: E1212 17:42:10.380829 3390 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sc8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,Su
bPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-lwtn5_calico-system(1a665283-3e69-4f2e-9c9b-8aae93e17ef9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 17:42:10.382040 kubelet[3390]: E1212 17:42:10.382003 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-lwtn5" podUID="1a665283-3e69-4f2e-9c9b-8aae93e17ef9" Dec 12 17:42:13.689258 systemd[1]: Started sshd@15-10.200.20.10:22-10.200.16.10:44586.service - OpenSSH per-connection server daemon (10.200.16.10:44586). Dec 12 17:42:14.186903 sshd[5728]: Accepted publickey for core from 10.200.16.10 port 44586 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:42:14.188514 sshd-session[5728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:42:14.194339 systemd-logind[1875]: New session 18 of user core. Dec 12 17:42:14.198382 systemd[1]: Started session-18.scope - Session 18 of User core. Dec 12 17:42:14.610441 sshd[5731]: Connection closed by 10.200.16.10 port 44586 Dec 12 17:42:14.610348 sshd-session[5728]: pam_unix(sshd:session): session closed for user core Dec 12 17:42:14.613674 systemd-logind[1875]: Session 18 logged out. Waiting for processes to exit. Dec 12 17:42:14.614281 systemd[1]: sshd@15-10.200.20.10:22-10.200.16.10:44586.service: Deactivated successfully. Dec 12 17:42:14.616876 systemd[1]: session-18.scope: Deactivated successfully. Dec 12 17:42:14.618575 systemd-logind[1875]: Removed session 18. Dec 12 17:42:14.697319 systemd[1]: Started sshd@16-10.200.20.10:22-10.200.16.10:44590.service - OpenSSH per-connection server daemon (10.200.16.10:44590). Dec 12 17:42:15.180750 sshd[5743]: Accepted publickey for core from 10.200.16.10 port 44590 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:42:15.182879 sshd-session[5743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:42:15.189429 systemd-logind[1875]: New session 19 of user core. Dec 12 17:42:15.195432 systemd[1]: Started session-19.scope - Session 19 of User core. 
Dec 12 17:42:15.693392 sshd[5746]: Connection closed by 10.200.16.10 port 44590 Dec 12 17:42:15.694456 sshd-session[5743]: pam_unix(sshd:session): session closed for user core Dec 12 17:42:15.698544 systemd[1]: sshd@16-10.200.20.10:22-10.200.16.10:44590.service: Deactivated successfully. Dec 12 17:42:15.700484 systemd[1]: session-19.scope: Deactivated successfully. Dec 12 17:42:15.701508 systemd-logind[1875]: Session 19 logged out. Waiting for processes to exit. Dec 12 17:42:15.704167 systemd-logind[1875]: Removed session 19. Dec 12 17:42:15.782413 systemd[1]: Started sshd@17-10.200.20.10:22-10.200.16.10:44592.service - OpenSSH per-connection server daemon (10.200.16.10:44592). Dec 12 17:42:16.284523 sshd[5756]: Accepted publickey for core from 10.200.16.10 port 44592 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:42:16.285821 sshd-session[5756]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:42:16.290052 systemd-logind[1875]: New session 20 of user core. Dec 12 17:42:16.303353 systemd[1]: Started session-20.scope - Session 20 of User core. 
Dec 12 17:42:16.799227 kubelet[3390]: E1212 17:42:16.796757 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314" Dec 12 17:42:17.085120 sshd[5759]: Connection closed by 10.200.16.10 port 44592 Dec 12 17:42:17.084504 sshd-session[5756]: pam_unix(sshd:session): session closed for user core Dec 12 17:42:17.087969 systemd[1]: session-20.scope: Deactivated successfully. Dec 12 17:42:17.089771 systemd[1]: sshd@17-10.200.20.10:22-10.200.16.10:44592.service: Deactivated successfully. Dec 12 17:42:17.093697 systemd-logind[1875]: Session 20 logged out. Waiting for processes to exit. Dec 12 17:42:17.095715 systemd-logind[1875]: Removed session 20. Dec 12 17:42:17.170326 systemd[1]: Started sshd@18-10.200.20.10:22-10.200.16.10:44606.service - OpenSSH per-connection server daemon (10.200.16.10:44606). Dec 12 17:42:17.670288 sshd[5781]: Accepted publickey for core from 10.200.16.10 port 44606 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:42:17.671050 sshd-session[5781]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:42:17.674827 systemd-logind[1875]: New session 21 of user core. Dec 12 17:42:17.683344 systemd[1]: Started session-21.scope - Session 21 of User core. 
Dec 12 17:42:18.170955 sshd[5784]: Connection closed by 10.200.16.10 port 44606 Dec 12 17:42:18.171548 sshd-session[5781]: pam_unix(sshd:session): session closed for user core Dec 12 17:42:18.176300 systemd[1]: sshd@18-10.200.20.10:22-10.200.16.10:44606.service: Deactivated successfully. Dec 12 17:42:18.180641 systemd[1]: session-21.scope: Deactivated successfully. Dec 12 17:42:18.181625 systemd-logind[1875]: Session 21 logged out. Waiting for processes to exit. Dec 12 17:42:18.183132 systemd-logind[1875]: Removed session 21. Dec 12 17:42:18.261145 systemd[1]: Started sshd@19-10.200.20.10:22-10.200.16.10:44622.service - OpenSSH per-connection server daemon (10.200.16.10:44622). Dec 12 17:42:18.754116 sshd[5794]: Accepted publickey for core from 10.200.16.10 port 44622 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4 Dec 12 17:42:18.755506 sshd-session[5794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 17:42:18.760590 systemd-logind[1875]: New session 22 of user core. Dec 12 17:42:18.767624 systemd[1]: Started session-22.scope - Session 22 of User core. Dec 12 17:42:19.172914 sshd[5797]: Connection closed by 10.200.16.10 port 44622 Dec 12 17:42:19.172745 sshd-session[5794]: pam_unix(sshd:session): session closed for user core Dec 12 17:42:19.176165 systemd[1]: sshd@19-10.200.20.10:22-10.200.16.10:44622.service: Deactivated successfully. Dec 12 17:42:19.180131 systemd[1]: session-22.scope: Deactivated successfully. Dec 12 17:42:19.181776 systemd-logind[1875]: Session 22 logged out. Waiting for processes to exit. Dec 12 17:42:19.182963 systemd-logind[1875]: Removed session 22. 
Dec 12 17:42:19.798211 kubelet[3390]: E1212 17:42:19.798153 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64584c4865-wbzd7" podUID="9727898f-0ef8-4f55-b4ae-4e451f2586ba"
Dec 12 17:42:20.795790 kubelet[3390]: E1212 17:42:20.795743 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e"
Dec 12 17:42:20.796402 kubelet[3390]: E1212 17:42:20.795994 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f"
Dec 12 17:42:20.796750 kubelet[3390]: E1212 17:42:20.796714 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96"
Dec 12 17:42:24.259379 systemd[1]: Started sshd@20-10.200.20.10:22-10.200.16.10:60336.service - OpenSSH per-connection server daemon (10.200.16.10:60336).
Dec 12 17:42:24.748022 sshd[5812]: Accepted publickey for core from 10.200.16.10 port 60336 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:42:24.749497 sshd-session[5812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:42:24.754790 systemd-logind[1875]: New session 23 of user core.
Dec 12 17:42:24.761339 systemd[1]: Started session-23.scope - Session 23 of User core.
Dec 12 17:42:24.796343 kubelet[3390]: E1212 17:42:24.796303 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-lwtn5" podUID="1a665283-3e69-4f2e-9c9b-8aae93e17ef9"
Dec 12 17:42:25.143553 sshd[5815]: Connection closed by 10.200.16.10 port 60336
Dec 12 17:42:25.142724 sshd-session[5812]: pam_unix(sshd:session): session closed for user core
Dec 12 17:42:25.148521 systemd[1]: sshd@20-10.200.20.10:22-10.200.16.10:60336.service: Deactivated successfully.
Dec 12 17:42:25.150072 systemd[1]: session-23.scope: Deactivated successfully.
Dec 12 17:42:25.154390 systemd-logind[1875]: Session 23 logged out. Waiting for processes to exit.
Dec 12 17:42:25.155844 systemd-logind[1875]: Removed session 23.
Dec 12 17:42:30.233934 systemd[1]: Started sshd@21-10.200.20.10:22-10.200.16.10:56676.service - OpenSSH per-connection server daemon (10.200.16.10:56676).
Dec 12 17:42:30.734313 sshd[5829]: Accepted publickey for core from 10.200.16.10 port 56676 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:42:30.735468 sshd-session[5829]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:42:30.739019 systemd-logind[1875]: New session 24 of user core.
Dec 12 17:42:30.748492 systemd[1]: Started session-24.scope - Session 24 of User core.
Dec 12 17:42:30.795071 kubelet[3390]: E1212 17:42:30.795022 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314"
Dec 12 17:42:31.124034 sshd[5832]: Connection closed by 10.200.16.10 port 56676
Dec 12 17:42:31.124664 sshd-session[5829]: pam_unix(sshd:session): session closed for user core
Dec 12 17:42:31.128309 systemd-logind[1875]: Session 24 logged out. Waiting for processes to exit.
Dec 12 17:42:31.128472 systemd[1]: sshd@21-10.200.20.10:22-10.200.16.10:56676.service: Deactivated successfully.
Dec 12 17:42:31.131322 systemd[1]: session-24.scope: Deactivated successfully.
Dec 12 17:42:31.132838 systemd-logind[1875]: Removed session 24.
Dec 12 17:42:31.798377 kubelet[3390]: E1212 17:42:31.798287 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96"
Dec 12 17:42:31.798754 kubelet[3390]: E1212 17:42:31.798635 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64584c4865-wbzd7" podUID="9727898f-0ef8-4f55-b4ae-4e451f2586ba"
Dec 12 17:42:32.795851 kubelet[3390]: E1212 17:42:32.795395 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f"
Dec 12 17:42:32.796574 kubelet[3390]: E1212 17:42:32.796546 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e"
Dec 12 17:42:36.210666 systemd[1]: Started sshd@22-10.200.20.10:22-10.200.16.10:56692.service - OpenSSH per-connection server daemon (10.200.16.10:56692).
Dec 12 17:42:36.669167 sshd[5868]: Accepted publickey for core from 10.200.16.10 port 56692 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:42:36.670342 sshd-session[5868]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:42:36.676381 systemd-logind[1875]: New session 25 of user core.
Dec 12 17:42:36.679646 systemd[1]: Started session-25.scope - Session 25 of User core.
Dec 12 17:42:37.043256 sshd[5871]: Connection closed by 10.200.16.10 port 56692
Dec 12 17:42:37.043048 sshd-session[5868]: pam_unix(sshd:session): session closed for user core
Dec 12 17:42:37.045770 systemd-logind[1875]: Session 25 logged out. Waiting for processes to exit.
Dec 12 17:42:37.046697 systemd[1]: sshd@22-10.200.20.10:22-10.200.16.10:56692.service: Deactivated successfully.
Dec 12 17:42:37.049549 systemd[1]: session-25.scope: Deactivated successfully.
Dec 12 17:42:37.053118 systemd-logind[1875]: Removed session 25.
Dec 12 17:42:37.796440 kubelet[3390]: E1212 17:42:37.796062 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-lwtn5" podUID="1a665283-3e69-4f2e-9c9b-8aae93e17ef9"
Dec 12 17:42:42.129637 systemd[1]: Started sshd@23-10.200.20.10:22-10.200.16.10:35702.service - OpenSSH per-connection server daemon (10.200.16.10:35702).
Dec 12 17:42:42.583661 sshd[5883]: Accepted publickey for core from 10.200.16.10 port 35702 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:42:42.584704 sshd-session[5883]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:42:42.588336 systemd-logind[1875]: New session 26 of user core.
Dec 12 17:42:42.594425 systemd[1]: Started session-26.scope - Session 26 of User core.
Dec 12 17:42:42.984791 sshd[5886]: Connection closed by 10.200.16.10 port 35702
Dec 12 17:42:42.985441 sshd-session[5883]: pam_unix(sshd:session): session closed for user core
Dec 12 17:42:42.990859 systemd-logind[1875]: Session 26 logged out. Waiting for processes to exit.
Dec 12 17:42:42.991852 systemd[1]: sshd@23-10.200.20.10:22-10.200.16.10:35702.service: Deactivated successfully.
Dec 12 17:42:42.997069 systemd[1]: session-26.scope: Deactivated successfully.
Dec 12 17:42:43.001431 systemd-logind[1875]: Removed session 26.
Dec 12 17:42:44.795809 kubelet[3390]: E1212 17:42:44.795711 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-tk5md" podUID="5d1301d2-72a1-41e5-ae8f-db3bb6f52314"
Dec 12 17:42:45.795845 kubelet[3390]: E1212 17:42:45.795802 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5fdd79f47b-hk8h4" podUID="674dddc6-fb43-422e-85c0-76f7f4bb018f"
Dec 12 17:42:45.797762 kubelet[3390]: E1212 17:42:45.797717 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ccgmd" podUID="c62a98c7-a503-42c2-845c-ea1022fbba96"
Dec 12 17:42:46.796040 kubelet[3390]: E1212 17:42:46.795995 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-64584c4865-wbzd7" podUID="9727898f-0ef8-4f55-b4ae-4e451f2586ba"
Dec 12 17:42:47.795774 kubelet[3390]: E1212 17:42:47.794980 3390 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7984dd694b-72thk" podUID="0083021a-0d5d-42a9-b6e2-679318c1ae2e"
Dec 12 17:42:48.076452 systemd[1]: Started sshd@24-10.200.20.10:22-10.200.16.10:35704.service - OpenSSH per-connection server daemon (10.200.16.10:35704).
Dec 12 17:42:48.569106 sshd[5898]: Accepted publickey for core from 10.200.16.10 port 35704 ssh2: RSA SHA256:rv0ogpS37Fn9XgD1tbLDwSSen2nZukTXJG3iueJVyC4
Dec 12 17:42:48.570589 sshd-session[5898]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 17:42:48.575793 systemd-logind[1875]: New session 27 of user core.
Dec 12 17:42:48.583350 systemd[1]: Started session-27.scope - Session 27 of User core.
Dec 12 17:42:48.962323 sshd[5901]: Connection closed by 10.200.16.10 port 35704
Dec 12 17:42:48.963423 sshd-session[5898]: pam_unix(sshd:session): session closed for user core
Dec 12 17:42:48.966620 systemd[1]: sshd@24-10.200.20.10:22-10.200.16.10:35704.service: Deactivated successfully.
Dec 12 17:42:48.968460 systemd[1]: session-27.scope: Deactivated successfully.
Dec 12 17:42:48.969294 systemd-logind[1875]: Session 27 logged out. Waiting for processes to exit.
Dec 12 17:42:48.970979 systemd-logind[1875]: Removed session 27.