May 27 02:47:26.078293 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
May 27 02:47:26.078314 kernel: Linux version 6.12.30-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Tue May 27 01:20:04 -00 2025
May 27 02:47:26.078320 kernel: KASLR enabled
May 27 02:47:26.078324 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
May 27 02:47:26.078329 kernel: printk: legacy bootconsole [pl11] enabled
May 27 02:47:26.078333 kernel: efi: EFI v2.7 by EDK II
May 27 02:47:26.078338 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f20e018 RNG=0x3fd5f998 MEMRESERVE=0x3e471598
May 27 02:47:26.078342 kernel: random: crng init done
May 27 02:47:26.078346 kernel: secureboot: Secure boot disabled
May 27 02:47:26.078350 kernel: ACPI: Early table checksum verification disabled
May 27 02:47:26.078354 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
May 27 02:47:26.078358 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 02:47:26.078362 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 02:47:26.078367 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628)
May 27 02:47:26.078372 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 02:47:26.078376 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 02:47:26.078381 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 02:47:26.078385 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 02:47:26.078390 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 02:47:26.078394 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 02:47:26.078398 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
May 27 02:47:26.078402 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 02:47:26.078406 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
May 27 02:47:26.078410 kernel: ACPI: Use ACPI SPCR as default console: Yes
May 27 02:47:26.078414 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
May 27 02:47:26.078418 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
May 27 02:47:26.078422 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
May 27 02:47:26.078426 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
May 27 02:47:26.078431 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
May 27 02:47:26.078435 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
May 27 02:47:26.078439 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
May 27 02:47:26.078444 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
May 27 02:47:26.078448 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
May 27 02:47:26.078452 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
May 27 02:47:26.078456 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
May 27 02:47:26.078460 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
May 27 02:47:26.078464 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
May 27 02:47:26.078469 kernel: NODE_DATA(0) allocated [mem 0x1bf7fcdc0-0x1bf803fff]
May 27 02:47:26.078473 kernel: Zone ranges:
May 27 02:47:26.078477 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
May 27 02:47:26.078484 kernel: DMA32 empty
May 27 02:47:26.078488 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
May 27 02:47:26.078492 kernel: Device empty
May 27 02:47:26.078497 kernel: Movable zone start for each node
May 27 02:47:26.078501 kernel: Early memory node ranges
May 27 02:47:26.078506 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
May 27 02:47:26.078511 kernel: node 0: [mem 0x0000000000824000-0x000000003e45ffff]
May 27 02:47:26.078515 kernel: node 0: [mem 0x000000003e460000-0x000000003e46ffff]
May 27 02:47:26.078519 kernel: node 0: [mem 0x000000003e470000-0x000000003e54ffff]
May 27 02:47:26.078524 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff]
May 27 02:47:26.078528 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff]
May 27 02:47:26.078532 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff]
May 27 02:47:26.078536 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff]
May 27 02:47:26.078541 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
May 27 02:47:26.078545 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
May 27 02:47:26.078549 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
May 27 02:47:26.078554 kernel: psci: probing for conduit method from ACPI.
May 27 02:47:26.078559 kernel: psci: PSCIv1.1 detected in firmware.
May 27 02:47:26.078563 kernel: psci: Using standard PSCI v0.2 function IDs
May 27 02:47:26.078568 kernel: psci: MIGRATE_INFO_TYPE not supported.
May 27 02:47:26.078572 kernel: psci: SMC Calling Convention v1.4
May 27 02:47:26.078633 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
May 27 02:47:26.078638 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
May 27 02:47:26.078643 kernel: percpu: Embedded 33 pages/cpu s98136 r8192 d28840 u135168
May 27 02:47:26.078647 kernel: pcpu-alloc: s98136 r8192 d28840 u135168 alloc=33*4096
May 27 02:47:26.078652 kernel: pcpu-alloc: [0] 0 [0] 1
May 27 02:47:26.078656 kernel: Detected PIPT I-cache on CPU0
May 27 02:47:26.078660 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
May 27 02:47:26.078666 kernel: CPU features: detected: GIC system register CPU interface
May 27 02:47:26.078671 kernel: CPU features: detected: Spectre-v4
May 27 02:47:26.078675 kernel: CPU features: detected: Spectre-BHB
May 27 02:47:26.078680 kernel: CPU features: kernel page table isolation forced ON by KASLR
May 27 02:47:26.078684 kernel: CPU features: detected: Kernel page table isolation (KPTI)
May 27 02:47:26.078688 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
May 27 02:47:26.078693 kernel: CPU features: detected: SSBS not fully self-synchronizing
May 27 02:47:26.078697 kernel: alternatives: applying boot alternatives
May 27 02:47:26.078702 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=4c3f98aae7a61b3dcbab6391ba922461adab29dbcb79fd6e18169f93c5a4ab5a
May 27 02:47:26.078707 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 27 02:47:26.078711 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
May 27 02:47:26.078717 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 27 02:47:26.078721 kernel: Fallback order for Node 0: 0
May 27 02:47:26.078726 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
May 27 02:47:26.078730 kernel: Policy zone: Normal
May 27 02:47:26.078734 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 27 02:47:26.078738 kernel: software IO TLB: area num 2.
May 27 02:47:26.078743 kernel: software IO TLB: mapped [mem 0x000000003a460000-0x000000003e460000] (64MB)
May 27 02:47:26.078747 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
May 27 02:47:26.078752 kernel: rcu: Preemptible hierarchical RCU implementation.
May 27 02:47:26.078757 kernel: rcu: RCU event tracing is enabled.
May 27 02:47:26.078761 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
May 27 02:47:26.078771 kernel: Trampoline variant of Tasks RCU enabled.
May 27 02:47:26.078775 kernel: Tracing variant of Tasks RCU enabled.
May 27 02:47:26.078779 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 27 02:47:26.078784 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
May 27 02:47:26.078788 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 27 02:47:26.078793 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 27 02:47:26.078797 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
May 27 02:47:26.078801 kernel: GICv3: 960 SPIs implemented
May 27 02:47:26.078806 kernel: GICv3: 0 Extended SPIs implemented
May 27 02:47:26.078810 kernel: Root IRQ handler: gic_handle_irq
May 27 02:47:26.078814 kernel: GICv3: GICv3 features: 16 PPIs, RSS
May 27 02:47:26.078819 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
May 27 02:47:26.078824 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
May 27 02:47:26.078828 kernel: ITS: No ITS available, not enabling LPIs
May 27 02:47:26.078833 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 27 02:47:26.078837 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
May 27 02:47:26.078842 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
May 27 02:47:26.078846 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
May 27 02:47:26.078851 kernel: Console: colour dummy device 80x25
May 27 02:47:26.078855 kernel: printk: legacy console [tty1] enabled
May 27 02:47:26.078860 kernel: ACPI: Core revision 20240827
May 27 02:47:26.078864 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
May 27 02:47:26.078870 kernel: pid_max: default: 32768 minimum: 301
May 27 02:47:26.078875 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
May 27 02:47:26.078879 kernel: landlock: Up and running.
May 27 02:47:26.078883 kernel: SELinux: Initializing.
May 27 02:47:26.078888 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 27 02:47:26.078893 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 27 02:47:26.078901 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x1a0000e, misc 0x31e1
May 27 02:47:26.078907 kernel: Hyper-V: Host Build 10.0.26100.1254-1-0
May 27 02:47:26.078911 kernel: Hyper-V: enabling crash_kexec_post_notifiers
May 27 02:47:26.078916 kernel: rcu: Hierarchical SRCU implementation.
May 27 02:47:26.078921 kernel: rcu: Max phase no-delay instances is 400.
May 27 02:47:26.078925 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
May 27 02:47:26.078931 kernel: Remapping and enabling EFI services.
May 27 02:47:26.078936 kernel: smp: Bringing up secondary CPUs ...
May 27 02:47:26.078941 kernel: Detected PIPT I-cache on CPU1
May 27 02:47:26.078945 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
May 27 02:47:26.078950 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
May 27 02:47:26.078956 kernel: smp: Brought up 1 node, 2 CPUs
May 27 02:47:26.078960 kernel: SMP: Total of 2 processors activated.
May 27 02:47:26.078965 kernel: CPU: All CPU(s) started at EL1
May 27 02:47:26.078970 kernel: CPU features: detected: 32-bit EL0 Support
May 27 02:47:26.078975 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
May 27 02:47:26.078980 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
May 27 02:47:26.078984 kernel: CPU features: detected: Common not Private translations
May 27 02:47:26.078989 kernel: CPU features: detected: CRC32 instructions
May 27 02:47:26.078994 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
May 27 02:47:26.078999 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
May 27 02:47:26.079004 kernel: CPU features: detected: LSE atomic instructions
May 27 02:47:26.079009 kernel: CPU features: detected: Privileged Access Never
May 27 02:47:26.079014 kernel: CPU features: detected: Speculation barrier (SB)
May 27 02:47:26.079018 kernel: CPU features: detected: TLB range maintenance instructions
May 27 02:47:26.079023 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
May 27 02:47:26.079028 kernel: CPU features: detected: Scalable Vector Extension
May 27 02:47:26.079032 kernel: alternatives: applying system-wide alternatives
May 27 02:47:26.079037 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
May 27 02:47:26.079043 kernel: SVE: maximum available vector length 16 bytes per vector
May 27 02:47:26.079048 kernel: SVE: default vector length 16 bytes per vector
May 27 02:47:26.079053 kernel: Memory: 3976108K/4194160K available (11072K kernel code, 2276K rwdata, 8936K rodata, 39424K init, 1034K bss, 213436K reserved, 0K cma-reserved)
May 27 02:47:26.079057 kernel: devtmpfs: initialized
May 27 02:47:26.079062 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 27 02:47:26.079067 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
May 27 02:47:26.079072 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
May 27 02:47:26.079076 kernel: 0 pages in range for non-PLT usage
May 27 02:47:26.079081 kernel: 508544 pages in range for PLT usage
May 27 02:47:26.079087 kernel: pinctrl core: initialized pinctrl subsystem
May 27 02:47:26.079092 kernel: SMBIOS 3.1.0 present.
May 27 02:47:26.079096 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
May 27 02:47:26.079101 kernel: DMI: Memory slots populated: 2/2
May 27 02:47:26.079106 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 27 02:47:26.079111 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
May 27 02:47:26.079115 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
May 27 02:47:26.079120 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
May 27 02:47:26.079125 kernel: audit: initializing netlink subsys (disabled)
May 27 02:47:26.079131 kernel: audit: type=2000 audit(0.062:1): state=initialized audit_enabled=0 res=1
May 27 02:47:26.079136 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 27 02:47:26.079140 kernel: cpuidle: using governor menu
May 27 02:47:26.079145 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
May 27 02:47:26.079150 kernel: ASID allocator initialised with 32768 entries
May 27 02:47:26.079154 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 27 02:47:26.079159 kernel: Serial: AMBA PL011 UART driver
May 27 02:47:26.079164 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 27 02:47:26.079169 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
May 27 02:47:26.079174 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
May 27 02:47:26.079179 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
May 27 02:47:26.079184 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 27 02:47:26.079189 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
May 27 02:47:26.079193 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
May 27 02:47:26.079198 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
May 27 02:47:26.079203 kernel: ACPI: Added _OSI(Module Device)
May 27 02:47:26.079207 kernel: ACPI: Added _OSI(Processor Device)
May 27 02:47:26.079212 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 27 02:47:26.079217 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 27 02:47:26.079222 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 27 02:47:26.079227 kernel: ACPI: Interpreter enabled
May 27 02:47:26.079232 kernel: ACPI: Using GIC for interrupt routing
May 27 02:47:26.079236 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
May 27 02:47:26.079241 kernel: printk: legacy console [ttyAMA0] enabled
May 27 02:47:26.079246 kernel: printk: legacy bootconsole [pl11] disabled
May 27 02:47:26.079250 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
May 27 02:47:26.079255 kernel: ACPI: CPU0 has been hot-added
May 27 02:47:26.079261 kernel: ACPI: CPU1 has been hot-added
May 27 02:47:26.079265 kernel: iommu: Default domain type: Translated
May 27 02:47:26.079270 kernel: iommu: DMA domain TLB invalidation policy: strict mode
May 27 02:47:26.079275 kernel: efivars: Registered efivars operations
May 27 02:47:26.079280 kernel: vgaarb: loaded
May 27 02:47:26.079284 kernel: clocksource: Switched to clocksource arch_sys_counter
May 27 02:47:26.079289 kernel: VFS: Disk quotas dquot_6.6.0
May 27 02:47:26.079294 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 27 02:47:26.079299 kernel: pnp: PnP ACPI init
May 27 02:47:26.079304 kernel: pnp: PnP ACPI: found 0 devices
May 27 02:47:26.079309 kernel: NET: Registered PF_INET protocol family
May 27 02:47:26.079314 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
May 27 02:47:26.079318 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
May 27 02:47:26.079323 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 27 02:47:26.079328 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
May 27 02:47:26.079333 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
May 27 02:47:26.079337 kernel: TCP: Hash tables configured (established 32768 bind 32768)
May 27 02:47:26.079342 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 27 02:47:26.079348 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 27 02:47:26.079353 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 27 02:47:26.079357 kernel: PCI: CLS 0 bytes, default 64
May 27 02:47:26.079362 kernel: kvm [1]: HYP mode not available
May 27 02:47:26.079367 kernel: Initialise system trusted keyrings
May 27 02:47:26.079371 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
May 27 02:47:26.079376 kernel: Key type asymmetric registered
May 27 02:47:26.079381 kernel: Asymmetric key parser 'x509' registered
May 27 02:47:26.079385 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
May 27 02:47:26.079391 kernel: io scheduler mq-deadline registered
May 27 02:47:26.079396 kernel: io scheduler kyber registered
May 27 02:47:26.079400 kernel: io scheduler bfq registered
May 27 02:47:26.079405 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 27 02:47:26.079410 kernel: thunder_xcv, ver 1.0
May 27 02:47:26.079415 kernel: thunder_bgx, ver 1.0
May 27 02:47:26.079419 kernel: nicpf, ver 1.0
May 27 02:47:26.079424 kernel: nicvf, ver 1.0
May 27 02:47:26.079559 kernel: rtc-efi rtc-efi.0: registered as rtc0
May 27 02:47:26.079625 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-05-27T02:47:25 UTC (1748314045)
May 27 02:47:26.079632 kernel: efifb: probing for efifb
May 27 02:47:26.079637 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
May 27 02:47:26.079641 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
May 27 02:47:26.079646 kernel: efifb: scrolling: redraw
May 27 02:47:26.079651 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
May 27 02:47:26.079656 kernel: Console: switching to colour frame buffer device 128x48
May 27 02:47:26.079661 kernel: fb0: EFI VGA frame buffer device
May 27 02:47:26.079667 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
May 27 02:47:26.079672 kernel: hid: raw HID events driver (C) Jiri Kosina
May 27 02:47:26.079677 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
May 27 02:47:26.079681 kernel: watchdog: NMI not fully supported
May 27 02:47:26.079686 kernel: watchdog: Hard watchdog permanently disabled
May 27 02:47:26.079691 kernel: NET: Registered PF_INET6 protocol family
May 27 02:47:26.079696 kernel: Segment Routing with IPv6
May 27 02:47:26.079700 kernel: In-situ OAM (IOAM) with IPv6
May 27 02:47:26.079705 kernel: NET: Registered PF_PACKET protocol family
May 27 02:47:26.079711 kernel: Key type dns_resolver registered
May 27 02:47:26.079716 kernel: registered taskstats version 1
May 27 02:47:26.079721 kernel: Loading compiled-in X.509 certificates
May 27 02:47:26.079725 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.30-flatcar: 6bbf5412ef1f8a32378a640b6d048f74e6d74df0'
May 27 02:47:26.079730 kernel: Demotion targets for Node 0: null
May 27 02:47:26.079735 kernel: Key type .fscrypt registered
May 27 02:47:26.079739 kernel: Key type fscrypt-provisioning registered
May 27 02:47:26.079744 kernel: ima: No TPM chip found, activating TPM-bypass!
May 27 02:47:26.079749 kernel: ima: Allocated hash algorithm: sha1
May 27 02:47:26.079755 kernel: ima: No architecture policies found
May 27 02:47:26.079759 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
May 27 02:47:26.079764 kernel: clk: Disabling unused clocks
May 27 02:47:26.079769 kernel: PM: genpd: Disabling unused power domains
May 27 02:47:26.079773 kernel: Warning: unable to open an initial console.
May 27 02:47:26.079778 kernel: Freeing unused kernel memory: 39424K
May 27 02:47:26.079783 kernel: Run /init as init process
May 27 02:47:26.079788 kernel: with arguments:
May 27 02:47:26.079792 kernel: /init
May 27 02:47:26.079798 kernel: with environment:
May 27 02:47:26.079802 kernel: HOME=/
May 27 02:47:26.079807 kernel: TERM=linux
May 27 02:47:26.079811 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
May 27 02:47:26.079817 systemd[1]: Successfully made /usr/ read-only.
May 27 02:47:26.079824 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 27 02:47:26.079830 systemd[1]: Detected virtualization microsoft.
May 27 02:47:26.079836 systemd[1]: Detected architecture arm64.
May 27 02:47:26.079841 systemd[1]: Running in initrd.
May 27 02:47:26.079846 systemd[1]: No hostname configured, using default hostname.
May 27 02:47:26.079851 systemd[1]: Hostname set to .
May 27 02:47:26.079856 systemd[1]: Initializing machine ID from random generator.
May 27 02:47:26.079862 systemd[1]: Queued start job for default target initrd.target.
May 27 02:47:26.079867 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 02:47:26.079872 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 02:47:26.079879 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 27 02:47:26.079884 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 27 02:47:26.079889 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 27 02:47:26.079895 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 27 02:47:26.079901 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 27 02:47:26.079906 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 27 02:47:26.079911 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 02:47:26.079918 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 27 02:47:26.079923 systemd[1]: Reached target paths.target - Path Units.
May 27 02:47:26.079928 systemd[1]: Reached target slices.target - Slice Units.
May 27 02:47:26.079933 systemd[1]: Reached target swap.target - Swaps.
May 27 02:47:26.079938 systemd[1]: Reached target timers.target - Timer Units.
May 27 02:47:26.079943 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 27 02:47:26.079948 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 27 02:47:26.079954 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 27 02:47:26.079959 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
May 27 02:47:26.079965 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 27 02:47:26.079970 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 27 02:47:26.079976 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 02:47:26.079981 systemd[1]: Reached target sockets.target - Socket Units.
May 27 02:47:26.079986 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
May 27 02:47:26.079991 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 27 02:47:26.079996 systemd[1]: Finished network-cleanup.service - Network Cleanup.
May 27 02:47:26.080002 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
May 27 02:47:26.080008 systemd[1]: Starting systemd-fsck-usr.service...
May 27 02:47:26.080013 systemd[1]: Starting systemd-journald.service - Journal Service...
May 27 02:47:26.080019 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 27 02:47:26.080037 systemd-journald[224]: Collecting audit messages is disabled.
May 27 02:47:26.080053 systemd-journald[224]: Journal started
May 27 02:47:26.080067 systemd-journald[224]: Runtime Journal (/run/log/journal/b2ad15cf220d446b999507eadcf563e0) is 8M, max 78.5M, 70.5M free.
May 27 02:47:26.086998 systemd-modules-load[226]: Inserted module 'overlay'
May 27 02:47:26.094864 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 02:47:26.107595 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
May 27 02:47:26.111010 systemd-modules-load[226]: Inserted module 'br_netfilter'
May 27 02:47:26.118895 kernel: Bridge firewalling registered
May 27 02:47:26.118919 systemd[1]: Started systemd-journald.service - Journal Service.
May 27 02:47:26.125226 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
May 27 02:47:26.131205 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 02:47:26.149650 systemd[1]: Finished systemd-fsck-usr.service.
May 27 02:47:26.154050 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 27 02:47:26.161944 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 02:47:26.173079 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 27 02:47:26.187874 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 27 02:47:26.198737 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 27 02:47:26.214007 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 27 02:47:26.234085 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 27 02:47:26.246198 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 27 02:47:26.249728 systemd-tmpfiles[249]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
May 27 02:47:26.255628 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 27 02:47:26.266344 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 02:47:26.280057 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
May 27 02:47:26.308870 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 27 02:47:26.320072 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 27 02:47:26.337218 dracut-cmdline[261]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=4c3f98aae7a61b3dcbab6391ba922461adab29dbcb79fd6e18169f93c5a4ab5a
May 27 02:47:26.362520 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 27 02:47:26.384438 systemd-resolved[262]: Positive Trust Anchors:
May 27 02:47:26.384453 systemd-resolved[262]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 27 02:47:26.384473 systemd-resolved[262]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 27 02:47:26.386270 systemd-resolved[262]: Defaulting to hostname 'linux'.
May 27 02:47:26.387788 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 27 02:47:26.397010 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 27 02:47:26.493640 kernel: SCSI subsystem initialized
May 27 02:47:26.499596 kernel: Loading iSCSI transport class v2.0-870.
May 27 02:47:26.507607 kernel: iscsi: registered transport (tcp)
May 27 02:47:26.520093 kernel: iscsi: registered transport (qla4xxx)
May 27 02:47:26.520143 kernel: QLogic iSCSI HBA Driver
May 27 02:47:26.534498 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 27 02:47:26.558663 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 27 02:47:26.566092 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 27 02:47:26.617886 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
May 27 02:47:26.623743 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
May 27 02:47:26.685634 kernel: raid6: neonx8 gen() 18546 MB/s
May 27 02:47:26.701612 kernel: raid6: neonx4 gen() 18559 MB/s
May 27 02:47:26.720601 kernel: raid6: neonx2 gen() 17078 MB/s
May 27 02:47:26.740619 kernel: raid6: neonx1 gen() 15020 MB/s
May 27 02:47:26.759612 kernel: raid6: int64x8 gen() 10545 MB/s
May 27 02:47:26.778614 kernel: raid6: int64x4 gen() 10595 MB/s
May 27 02:47:26.798619 kernel: raid6: int64x2 gen() 8985 MB/s
May 27 02:47:26.819829 kernel: raid6: int64x1 gen() 7013 MB/s
May 27 02:47:26.819878 kernel: raid6: using algorithm neonx4 gen() 18559 MB/s
May 27 02:47:26.841934 kernel: raid6: .... xor() 15147 MB/s, rmw enabled
May 27 02:47:26.842004 kernel: raid6: using neon recovery algorithm
May 27 02:47:26.850676 kernel: xor: measuring software checksum speed
May 27 02:47:26.850737 kernel: 8regs : 28569 MB/sec
May 27 02:47:26.853191 kernel: 32regs : 28818 MB/sec
May 27 02:47:26.855712 kernel: arm64_neon : 37604 MB/sec
May 27 02:47:26.858734 kernel: xor: using function: arm64_neon (37604 MB/sec)
May 27 02:47:26.897598 kernel: Btrfs loaded, zoned=no, fsverity=no
May 27 02:47:26.903647 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
May 27 02:47:26.912752 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 27 02:47:26.946002 systemd-udevd[474]: Using default interface naming scheme 'v255'.
May 27 02:47:26.950140 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 27 02:47:26.962659 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
May 27 02:47:26.988818 dracut-pre-trigger[484]: rd.md=0: removing MD RAID activation
May 27 02:47:27.012319 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
May 27 02:47:27.018088 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 27 02:47:27.068415 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 02:47:27.079869 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
May 27 02:47:27.147602 kernel: hv_vmbus: Vmbus version:5.3
May 27 02:47:27.152747 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 27 02:47:27.156967 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 27 02:47:27.167737 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
May 27 02:47:27.178097 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 02:47:27.197721 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 27 02:47:27.226953 kernel: pps_core: LinuxPPS API ver. 1 registered
May 27 02:47:27.226977 kernel: hv_vmbus: registering driver hyperv_keyboard
May 27 02:47:27.226984 kernel: hv_vmbus: registering driver hid_hyperv
May 27 02:47:27.227003 kernel: hv_vmbus: registering driver hv_storvsc
May 27 02:47:27.227009 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
May 27 02:47:27.227016 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0
May 27 02:47:27.227023 kernel: scsi host1: storvsc_host_t
May 27 02:47:27.220102 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 27 02:47:27.246755 kernel: hv_vmbus: registering driver hv_netvsc
May 27 02:47:27.246780 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1
May 27 02:47:27.246802 kernel: scsi host0: storvsc_host_t
May 27 02:47:27.246984 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
May 27 02:47:27.220187 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 27 02:47:27.265677 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
May 27 02:47:27.256242 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 02:47:27.276610 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5
May 27 02:47:27.293600 kernel: PTP clock support registered
May 27 02:47:27.287172 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 02:47:27.306885 kernel: hv_utils: Registering HyperV Utility Driver
May 27 02:47:27.306905 kernel: hv_vmbus: registering driver hv_utils
May 27 02:47:27.599683 kernel: hv_utils: Heartbeat IC version 3.0
May 27 02:47:27.599754 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
May 27 02:47:27.599947 kernel: hv_utils: Shutdown IC version 3.2
May 27 02:47:27.599956 kernel: hv_utils: TimeSync IC version 4.0
May 27 02:47:27.599699 systemd-resolved[262]: Clock change detected. Flushing caches.
May 27 02:47:27.613079 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
May 27 02:47:27.613318 kernel: hv_netvsc 000d3ac5-3511-000d-3ac5-3511000d3ac5 eth0: VF slot 1 added
May 27 02:47:27.614550 kernel: sd 0:0:0:0: [sda] Write Protect is off
May 27 02:47:27.620865 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
May 27 02:47:27.621072 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
May 27 02:47:27.621142 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#3 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
May 27 02:47:27.635500 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#10 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
May 27 02:47:27.646865 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
May 27 02:47:27.646918 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
May 27 02:47:27.649805 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
May 27 02:47:27.650030 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
May 27 02:47:27.655923 kernel: hv_vmbus: registering driver hv_pci
May 27 02:47:27.656503 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
May 27 02:47:27.664099 kernel: hv_pci 3257316a-0f90-46e2-89c4-560c48e635ee: PCI VMBus probing: Using version 0x10004
May 27 02:47:27.673838 kernel: hv_pci 3257316a-0f90-46e2-89c4-560c48e635ee: PCI host bridge to bus 0f90:00
May 27 02:47:27.674074 kernel: pci_bus 0f90:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
May 27 02:47:27.674158 kernel: pci_bus 0f90:00: No busn resource found for root bus, will use [bus 00-ff]
May 27 02:47:27.684811 kernel: pci 0f90:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint
May 27 02:47:27.690513 kernel: pci 0f90:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]
May 27 02:47:27.696538 kernel: pci 0f90:00:02.0: enabling Extended Tags
May 27 02:47:27.710539 kernel: pci 0f90:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 0f90:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link)
May 27 02:47:27.719306 kernel: pci_bus 0f90:00: busn_res: [bus 00-ff] end is updated to 00
May 27 02:47:27.719525 kernel: pci 0f90:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned
May 27 02:47:27.741508 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#56 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
May 27 02:47:27.766498 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#86 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
May 27 02:47:27.791800 kernel: mlx5_core 0f90:00:02.0: enabling device (0000 -> 0002)
May 27 02:47:27.799967 kernel: mlx5_core 0f90:00:02.0: PTM is not supported by PCIe
May 27 02:47:27.800149 kernel: mlx5_core 0f90:00:02.0: firmware version: 16.30.5006
May 27 02:47:27.972552 kernel: hv_netvsc 000d3ac5-3511-000d-3ac5-3511000d3ac5 eth0: VF registering: eth1
May 27 02:47:27.972798 kernel: mlx5_core 0f90:00:02.0 eth1: joined to eth0
May 27 02:47:27.977778 kernel: mlx5_core 0f90:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
May 27 02:47:27.988316 kernel: mlx5_core 0f90:00:02.0 enP3984s1: renamed from eth1
May 27 02:47:28.225589 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
May 27 02:47:28.321354 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
May 27 02:47:28.347306 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
May 27 02:47:28.363774 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
May 27 02:47:28.368955 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
May 27 02:47:28.380045 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
May 27 02:47:28.391861 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
May 27 02:47:28.400171 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 02:47:28.409672 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 27 02:47:28.423882 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
May 27 02:47:28.434597 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
May 27 02:47:28.456194 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
May 27 02:47:28.468925 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#72 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
May 27 02:47:28.469092 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
May 27 02:47:28.482504 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#89 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
May 27 02:47:28.492502 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
May 27 02:47:29.483043 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#54 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
May 27 02:47:29.494550 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
May 27 02:47:29.495532 disk-uuid[657]: The operation has completed successfully.
May 27 02:47:29.567182 systemd[1]: disk-uuid.service: Deactivated successfully.
May 27 02:47:29.569360 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
May 27 02:47:29.600218 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
May 27 02:47:29.620115 sh[821]: Success
May 27 02:47:29.655831 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
May 27 02:47:29.655905 kernel: device-mapper: uevent: version 1.0.3
May 27 02:47:29.660807 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
May 27 02:47:29.671512 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
May 27 02:47:29.871753 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
May 27 02:47:29.880402 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
May 27 02:47:29.891665 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
May 27 02:47:29.915979 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay'
May 27 02:47:29.916050 kernel: BTRFS: device fsid 5c6341ea-4eb5-44b6-ac57-c4d29847e384 devid 1 transid 41 /dev/mapper/usr (254:0) scanned by mount (839)
May 27 02:47:29.921503 kernel: BTRFS info (device dm-0): first mount of filesystem 5c6341ea-4eb5-44b6-ac57-c4d29847e384
May 27 02:47:29.926045 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
May 27 02:47:29.929065 kernel: BTRFS info (device dm-0): using free-space-tree
May 27 02:47:30.274639 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
May 27 02:47:30.280155 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
May 27 02:47:30.289435 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
May 27 02:47:30.290887 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
May 27 02:47:30.316913 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
May 27 02:47:30.345026 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 (8:6) scanned by mount (862)
May 27 02:47:30.345093 kernel: BTRFS info (device sda6): first mount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9
May 27 02:47:30.349872 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
May 27 02:47:30.353154 kernel: BTRFS info (device sda6): using free-space-tree
May 27 02:47:30.401560 kernel: BTRFS info (device sda6): last unmount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9
May 27 02:47:30.404880 systemd[1]: Finished ignition-setup.service - Ignition (setup).
May 27 02:47:30.412026 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
May 27 02:47:30.459332 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 27 02:47:30.471602 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 27 02:47:30.507173 systemd-networkd[1008]: lo: Link UP
May 27 02:47:30.507184 systemd-networkd[1008]: lo: Gained carrier
May 27 02:47:30.508707 systemd-networkd[1008]: Enumeration completed
May 27 02:47:30.509329 systemd-networkd[1008]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 27 02:47:30.509333 systemd-networkd[1008]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 27 02:47:30.510040 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 27 02:47:30.517641 systemd[1]: Reached target network.target - Network.
May 27 02:47:30.567508 kernel: mlx5_core 0f90:00:02.0 enP3984s1: Link up
May 27 02:47:30.601517 kernel: hv_netvsc 000d3ac5-3511-000d-3ac5-3511000d3ac5 eth0: Data path switched to VF: enP3984s1
May 27 02:47:30.601801 systemd-networkd[1008]: enP3984s1: Link UP
May 27 02:47:30.601890 systemd-networkd[1008]: eth0: Link UP
May 27 02:47:30.601991 systemd-networkd[1008]: eth0: Gained carrier
May 27 02:47:30.602003 systemd-networkd[1008]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 27 02:47:30.617803 systemd-networkd[1008]: enP3984s1: Gained carrier
May 27 02:47:30.630546 systemd-networkd[1008]: eth0: DHCPv4 address 10.200.20.22/24, gateway 10.200.20.1 acquired from 168.63.129.16
May 27 02:47:31.540147 ignition[951]: Ignition 2.21.0
May 27 02:47:31.540164 ignition[951]: Stage: fetch-offline
May 27 02:47:31.540243 ignition[951]: no configs at "/usr/lib/ignition/base.d"
May 27 02:47:31.545261 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
May 27 02:47:31.540249 ignition[951]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 27 02:47:31.554703 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
May 27 02:47:31.540347 ignition[951]: parsed url from cmdline: ""
May 27 02:47:31.540350 ignition[951]: no config URL provided
May 27 02:47:31.540353 ignition[951]: reading system config file "/usr/lib/ignition/user.ign"
May 27 02:47:31.540357 ignition[951]: no config at "/usr/lib/ignition/user.ign"
May 27 02:47:31.540361 ignition[951]: failed to fetch config: resource requires networking
May 27 02:47:31.543037 ignition[951]: Ignition finished successfully
May 27 02:47:31.599537 ignition[1019]: Ignition 2.21.0
May 27 02:47:31.599543 ignition[1019]: Stage: fetch
May 27 02:47:31.599749 ignition[1019]: no configs at "/usr/lib/ignition/base.d"
May 27 02:47:31.599757 ignition[1019]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 27 02:47:31.599834 ignition[1019]: parsed url from cmdline: ""
May 27 02:47:31.599836 ignition[1019]: no config URL provided
May 27 02:47:31.599839 ignition[1019]: reading system config file "/usr/lib/ignition/user.ign"
May 27 02:47:31.599844 ignition[1019]: no config at "/usr/lib/ignition/user.ign"
May 27 02:47:31.599876 ignition[1019]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
May 27 02:47:31.677910 ignition[1019]: GET result: OK
May 27 02:47:31.678021 ignition[1019]: config has been read from IMDS userdata
May 27 02:47:31.678044 ignition[1019]: parsing config with SHA512: 320b9aa7278e5482d6dd881a9c957cfd3ba8a19e41af9f526c50b27d920bab8c2a639cbc45ab1b149ae79ab2dde2f35f01bd25ad1a5d53b6de61eac994a12969
May 27 02:47:31.682231 unknown[1019]: fetched base config from "system"
May 27 02:47:31.682567 ignition[1019]: fetch: fetch complete
May 27 02:47:31.682240 unknown[1019]: fetched base config from "system"
May 27 02:47:31.682571 ignition[1019]: fetch: fetch passed
May 27 02:47:31.682244 unknown[1019]: fetched user config from "azure"
May 27 02:47:31.682616 ignition[1019]: Ignition finished successfully
May 27 02:47:31.686165 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
May 27 02:47:31.694741 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
May 27 02:47:31.730991 ignition[1026]: Ignition 2.21.0
May 27 02:47:31.731002 ignition[1026]: Stage: kargs
May 27 02:47:31.733708 ignition[1026]: no configs at "/usr/lib/ignition/base.d"
May 27 02:47:31.733719 ignition[1026]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 27 02:47:31.743541 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
May 27 02:47:31.738313 ignition[1026]: kargs: kargs passed
May 27 02:47:31.753304 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
May 27 02:47:31.738396 ignition[1026]: Ignition finished successfully
May 27 02:47:31.760586 systemd-networkd[1008]: eth0: Gained IPv6LL
May 27 02:47:31.782811 ignition[1033]: Ignition 2.21.0
May 27 02:47:31.782823 ignition[1033]: Stage: disks
May 27 02:47:31.783033 ignition[1033]: no configs at "/usr/lib/ignition/base.d"
May 27 02:47:31.789174 systemd[1]: Finished ignition-disks.service - Ignition (disks).
May 27 02:47:31.783042 ignition[1033]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 27 02:47:31.797934 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
May 27 02:47:31.784878 ignition[1033]: disks: disks passed
May 27 02:47:31.806008 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
May 27 02:47:31.784984 ignition[1033]: Ignition finished successfully
May 27 02:47:31.815092 systemd[1]: Reached target local-fs.target - Local File Systems.
May 27 02:47:31.823431 systemd[1]: Reached target sysinit.target - System Initialization.
May 27 02:47:31.829760 systemd[1]: Reached target basic.target - Basic System.
May 27 02:47:31.839497 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
May 27 02:47:31.887674 systemd-networkd[1008]: enP3984s1: Gained IPv6LL
May 27 02:47:31.921822 systemd-fsck[1041]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks
May 27 02:47:31.930080 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
May 27 02:47:31.936177 systemd[1]: Mounting sysroot.mount - /sysroot...
May 27 02:47:32.362488 kernel: EXT4-fs (sda9): mounted filesystem 5656cec4-efbd-4a2d-be98-2263e6ae16bd r/w with ordered data mode. Quota mode: none.
May 27 02:47:32.363784 systemd[1]: Mounted sysroot.mount - /sysroot.
May 27 02:47:32.368076 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
May 27 02:47:32.391458 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 27 02:47:32.395891 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
May 27 02:47:32.414800 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
May 27 02:47:32.421039 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
May 27 02:47:32.456965 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 (8:6) scanned by mount (1055)
May 27 02:47:32.456987 kernel: BTRFS info (device sda6): first mount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9
May 27 02:47:32.421083 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
May 27 02:47:32.466394 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
May 27 02:47:32.442250 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
May 27 02:47:32.476996 kernel: BTRFS info (device sda6): using free-space-tree
May 27 02:47:32.479092 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
May 27 02:47:32.490175 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 27 02:47:33.134793 coreos-metadata[1057]: May 27 02:47:33.134 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
May 27 02:47:33.145631 coreos-metadata[1057]: May 27 02:47:33.144 INFO Fetch successful
May 27 02:47:33.145631 coreos-metadata[1057]: May 27 02:47:33.144 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
May 27 02:47:33.161143 coreos-metadata[1057]: May 27 02:47:33.160 INFO Fetch successful
May 27 02:47:33.175910 coreos-metadata[1057]: May 27 02:47:33.175 INFO wrote hostname ci-4344.0.0-a-583de22c75 to /sysroot/etc/hostname
May 27 02:47:33.184543 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
May 27 02:47:33.333095 initrd-setup-root[1085]: cut: /sysroot/etc/passwd: No such file or directory
May 27 02:47:33.353723 initrd-setup-root[1092]: cut: /sysroot/etc/group: No such file or directory
May 27 02:47:33.360884 initrd-setup-root[1099]: cut: /sysroot/etc/shadow: No such file or directory
May 27 02:47:33.368106 initrd-setup-root[1106]: cut: /sysroot/etc/gshadow: No such file or directory
May 27 02:47:34.322521 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
May 27 02:47:34.328857 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
May 27 02:47:34.359186 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
May 27 02:47:34.376279 systemd[1]: sysroot-oem.mount: Deactivated successfully.
May 27 02:47:34.382802 kernel: BTRFS info (device sda6): last unmount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9
May 27 02:47:34.400992 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
May 27 02:47:34.409240 systemd[1]: Finished ignition-mount.service - Ignition (mount).
May 27 02:47:34.429278 ignition[1173]: INFO : Ignition 2.21.0
May 27 02:47:34.429278 ignition[1173]: INFO : Stage: mount
May 27 02:47:34.429278 ignition[1173]: INFO : no configs at "/usr/lib/ignition/base.d"
May 27 02:47:34.429278 ignition[1173]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 27 02:47:34.429278 ignition[1173]: INFO : mount: mount passed
May 27 02:47:34.429278 ignition[1173]: INFO : Ignition finished successfully
May 27 02:47:34.417193 systemd[1]: Starting ignition-files.service - Ignition (files)...
May 27 02:47:34.445666 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 27 02:47:34.482125 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 (8:6) scanned by mount (1186)
May 27 02:47:34.482189 kernel: BTRFS info (device sda6): first mount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9
May 27 02:47:34.486529 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
May 27 02:47:34.489942 kernel: BTRFS info (device sda6): using free-space-tree
May 27 02:47:34.505926 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 27 02:47:34.532623 ignition[1202]: INFO : Ignition 2.21.0 May 27 02:47:34.532623 ignition[1202]: INFO : Stage: files May 27 02:47:34.539391 ignition[1202]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 02:47:34.539391 ignition[1202]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" May 27 02:47:34.539391 ignition[1202]: DEBUG : files: compiled without relabeling support, skipping May 27 02:47:34.554369 ignition[1202]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 27 02:47:34.554369 ignition[1202]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 27 02:47:34.554369 ignition[1202]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 27 02:47:34.554369 ignition[1202]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 27 02:47:34.554369 ignition[1202]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 27 02:47:34.553980 unknown[1202]: wrote ssh authorized keys file for user: core May 27 02:47:34.588620 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" May 27 02:47:34.588620 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 May 27 02:47:34.703771 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 27 02:47:34.823438 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" May 27 02:47:34.823438 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 27 02:47:34.823438 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 27 02:47:34.823438 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 27 02:47:34.823438 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 27 02:47:34.823438 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 02:47:34.823438 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 02:47:34.823438 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 02:47:34.823438 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 02:47:34.892009 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 27 02:47:34.892009 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 27 02:47:34.892009 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" May 27 02:47:34.892009 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(9): 
[finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" May 27 02:47:34.892009 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" May 27 02:47:34.892009 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1 May 27 02:47:35.610618 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 27 02:47:35.889918 ignition[1202]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" May 27 02:47:35.889918 ignition[1202]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 27 02:47:35.924750 ignition[1202]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 02:47:35.933665 ignition[1202]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 02:47:35.933665 ignition[1202]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 27 02:47:35.933665 ignition[1202]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 27 02:47:35.933665 ignition[1202]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 27 02:47:35.933665 ignition[1202]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 27 02:47:35.933665 ignition[1202]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 27 02:47:35.933665 ignition[1202]: INFO : files: files passed May 27 02:47:35.933665 ignition[1202]: INFO : Ignition finished successfully May 27 02:47:35.934189 systemd[1]: Finished ignition-files.service - Ignition (files). May 27 02:47:35.946508 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 27 02:47:35.984087 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 27 02:47:35.994098 systemd[1]: ignition-quench.service: Deactivated successfully. May 27 02:47:36.004857 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 27 02:47:36.037200 initrd-setup-root-after-ignition[1233]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 02:47:36.037200 initrd-setup-root-after-ignition[1233]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 27 02:47:36.050698 initrd-setup-root-after-ignition[1237]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 02:47:36.054147 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 02:47:36.062895 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 27 02:47:36.076573 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 27 02:47:36.127498 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 27 02:47:36.127621 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
May 27 02:47:36.136832 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 27 02:47:36.145460 systemd[1]: Reached target initrd.target - Initrd Default Target. May 27 02:47:36.154753 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 27 02:47:36.155560 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 27 02:47:36.185378 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 02:47:36.192437 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 27 02:47:36.217217 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 27 02:47:36.222180 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 02:47:36.231680 systemd[1]: Stopped target timers.target - Timer Units. May 27 02:47:36.239790 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 27 02:47:36.239920 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 02:47:36.252628 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 27 02:47:36.256662 systemd[1]: Stopped target basic.target - Basic System. May 27 02:47:36.265049 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 27 02:47:36.273292 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 27 02:47:36.282652 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 27 02:47:36.291986 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. May 27 02:47:36.300960 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 27 02:47:36.309794 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 27 02:47:36.319055 systemd[1]: Stopped target sysinit.target - System Initialization. May 27 02:47:36.327209 systemd[1]: Stopped target local-fs.target - Local File Systems. May 27 02:47:36.336169 systemd[1]: Stopped target swap.target - Swaps. May 27 02:47:36.343280 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 27 02:47:36.343400 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 27 02:47:36.354206 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 27 02:47:36.358768 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 02:47:36.367503 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 27 02:47:36.371086 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 02:47:36.376582 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 27 02:47:36.376713 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 27 02:47:36.389356 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 27 02:47:36.389533 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 02:47:36.398070 systemd[1]: ignition-files.service: Deactivated successfully. May 27 02:47:36.398158 systemd[1]: Stopped ignition-files.service - Ignition (files). May 27 02:47:36.408542 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. May 27 02:47:36.408626 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. 
May 27 02:47:36.418546 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 27 02:47:36.443831 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 27 02:47:36.477528 ignition[1257]: INFO : Ignition 2.21.0 May 27 02:47:36.477528 ignition[1257]: INFO : Stage: umount May 27 02:47:36.477528 ignition[1257]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 02:47:36.477528 ignition[1257]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" May 27 02:47:36.477528 ignition[1257]: INFO : umount: umount passed May 27 02:47:36.477528 ignition[1257]: INFO : Ignition finished successfully May 27 02:47:36.454497 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 27 02:47:36.454676 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 27 02:47:36.472511 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 27 02:47:36.472634 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 27 02:47:36.486070 systemd[1]: ignition-mount.service: Deactivated successfully. May 27 02:47:36.488237 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 27 02:47:36.494154 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 27 02:47:36.494255 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 27 02:47:36.509763 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 27 02:47:36.514634 systemd[1]: ignition-disks.service: Deactivated successfully. May 27 02:47:36.514703 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 27 02:47:36.520163 systemd[1]: ignition-kargs.service: Deactivated successfully. May 27 02:47:36.520223 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 27 02:47:36.532986 systemd[1]: ignition-fetch.service: Deactivated successfully. May 27 02:47:36.533043 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 27 02:47:36.542033 systemd[1]: Stopped target network.target - Network. May 27 02:47:36.549606 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 27 02:47:36.549663 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 27 02:47:36.558375 systemd[1]: Stopped target paths.target - Path Units. May 27 02:47:36.565532 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 27 02:47:36.565584 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 02:47:36.575121 systemd[1]: Stopped target slices.target - Slice Units. May 27 02:47:36.583454 systemd[1]: Stopped target sockets.target - Socket Units. May 27 02:47:36.592502 systemd[1]: iscsid.socket: Deactivated successfully. May 27 02:47:36.592550 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 27 02:47:36.600939 systemd[1]: iscsiuio.socket: Deactivated successfully. May 27 02:47:36.600979 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 27 02:47:36.608970 systemd[1]: ignition-setup.service: Deactivated successfully. May 27 02:47:36.609026 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 27 02:47:36.616464 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 27 02:47:36.616503 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 27 02:47:36.628264 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
May 27 02:47:36.635905 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 27 02:47:36.647811 systemd[1]: systemd-resolved.service: Deactivated successfully. May 27 02:47:36.647948 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 27 02:47:36.664782 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 27 02:47:36.665140 systemd[1]: systemd-networkd.service: Deactivated successfully. May 27 02:47:36.665250 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 27 02:47:36.678453 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 27 02:47:36.679116 systemd[1]: Stopped target network-pre.target - Preparation for Network. May 27 02:47:36.689082 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 27 02:47:36.689138 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 27 02:47:36.697649 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 27 02:47:36.708656 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 27 02:47:36.854170 kernel: hv_netvsc 000d3ac5-3511-000d-3ac5-3511000d3ac5 eth0: Data path switched from VF: enP3984s1 May 27 02:47:36.708731 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 02:47:36.721583 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 27 02:47:36.721664 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 27 02:47:36.735110 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 27 02:47:36.735179 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 27 02:47:36.740111 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 27 02:47:36.740180 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 02:47:36.752979 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 02:47:36.767121 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 27 02:47:36.767198 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 27 02:47:36.780832 systemd[1]: systemd-udevd.service: Deactivated successfully. May 27 02:47:36.795872 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 02:47:36.803315 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 27 02:47:36.803373 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 27 02:47:36.812977 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 27 02:47:36.813011 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 27 02:47:36.821281 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 27 02:47:36.821336 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 27 02:47:36.841630 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 27 02:47:36.841729 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 27 02:47:36.854186 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 27 02:47:36.854251 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 02:47:36.866354 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... 
May 27 02:47:36.880982 systemd[1]: systemd-network-generator.service: Deactivated successfully. May 27 02:47:36.881077 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. May 27 02:47:36.894731 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 27 02:47:36.894796 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 02:47:36.908873 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 27 02:47:36.908948 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 02:47:36.915269 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 27 02:47:36.915331 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 27 02:47:36.926942 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 02:47:37.084838 systemd-journald[224]: Received SIGTERM from PID 1 (systemd). May 27 02:47:36.927020 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 02:47:36.942828 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. May 27 02:47:36.942889 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. May 27 02:47:36.942913 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. May 27 02:47:36.942936 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 27 02:47:36.943256 systemd[1]: sysroot-boot.service: Deactivated successfully. May 27 02:47:36.943359 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 27 02:47:36.949722 systemd[1]: network-cleanup.service: Deactivated successfully. May 27 02:47:36.949819 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 27 02:47:36.959337 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 27 02:47:36.959418 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 27 02:47:36.969087 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 27 02:47:36.976858 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 27 02:47:36.976966 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 27 02:47:36.987583 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 27 02:47:37.028123 systemd[1]: Switching root. 
May 27 02:47:37.156335 systemd-journald[224]: Journal stopped May 27 02:47:41.581927 kernel: SELinux: policy capability network_peer_controls=1 May 27 02:47:41.581949 kernel: SELinux: policy capability open_perms=1 May 27 02:47:41.581958 kernel: SELinux: policy capability extended_socket_class=1 May 27 02:47:41.581964 kernel: SELinux: policy capability always_check_network=0 May 27 02:47:41.581971 kernel: SELinux: policy capability cgroup_seclabel=1 May 27 02:47:41.581976 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 27 02:47:41.581982 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 27 02:47:41.581987 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 27 02:47:41.581993 kernel: SELinux: policy capability userspace_initial_context=0 May 27 02:47:41.581998 kernel: audit: type=1403 audit(1748314058.178:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 27 02:47:41.582006 systemd[1]: Successfully loaded SELinux policy in 140.116ms. May 27 02:47:41.582015 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 8.899ms. May 27 02:47:41.582022 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 27 02:47:41.582028 systemd[1]: Detected virtualization microsoft. May 27 02:47:41.582034 systemd[1]: Detected architecture arm64. May 27 02:47:41.582041 systemd[1]: Detected first boot. May 27 02:47:41.582047 systemd[1]: Hostname set to <ci-4344.0.0-a-583de22c75>. May 27 02:47:41.582053 systemd[1]: Initializing machine ID from random generator. May 27 02:47:41.582060 zram_generator::config[1301]: No configuration found. May 27 02:47:41.582066 kernel: NET: Registered PF_VSOCK protocol family May 27 02:47:41.582072 systemd[1]: Populated /etc with preset unit settings. May 27 02:47:41.582079 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 27 02:47:41.582087 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 27 02:47:41.582093 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 27 02:47:41.582098 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 27 02:47:41.582104 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 27 02:47:41.582111 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 27 02:47:41.582118 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 27 02:47:41.582124 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 27 02:47:41.582131 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 27 02:47:41.582137 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 27 02:47:41.582143 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 27 02:47:41.582149 systemd[1]: Created slice user.slice - User and Session Slice. May 27 02:47:41.582155 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 02:47:41.582161 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
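On this first boot systemd reports "Initializing machine ID from random generator". machine-id(5) describes /etc/machine-id as a single 32-character lowercase hexadecimal line derived from a random 128-bit ID; the one-liner below is only a rough approximation of that step, not systemd's sd_id128 code, and the transient ID is persisted later by systemd-machine-id-commit.service (which appears further down in this log).

# Hedged sketch: generate a machine-id-shaped value (32 lowercase hex chars).
import uuid

def new_machine_id() -> str:
    return uuid.uuid4().hex

print(new_machine_id())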
May 27 02:47:41.582166 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. May 27 02:47:41.582172 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 27 02:47:41.582178 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 27 02:47:41.582185 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 27 02:47:41.582191 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... May 27 02:47:41.582199 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 02:47:41.582205 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 27 02:47:41.582211 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 27 02:47:41.582217 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 27 02:47:41.582223 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 27 02:47:41.582230 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 27 02:47:41.582237 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 02:47:41.582243 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 02:47:41.582249 systemd[1]: Reached target slices.target - Slice Units. May 27 02:47:41.582256 systemd[1]: Reached target swap.target - Swaps. May 27 02:47:41.582262 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 27 02:47:41.582268 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 27 02:47:41.582275 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 27 02:47:41.582281 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 27 02:47:41.582287 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 27 02:47:41.582293 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 27 02:47:41.582299 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 27 02:47:41.582306 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 27 02:47:41.582312 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 27 02:47:41.582318 systemd[1]: Mounting media.mount - External Media Directory... May 27 02:47:41.582325 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 27 02:47:41.582331 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 27 02:47:41.582337 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 27 02:47:41.582343 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 27 02:47:41.582349 systemd[1]: Reached target machines.target - Containers. May 27 02:47:41.582356 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 27 02:47:41.582363 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 02:47:41.582369 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 27 02:47:41.582375 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... 
May 27 02:47:41.582382 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 27 02:47:41.582388 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 27 02:47:41.582394 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 02:47:41.582400 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 27 02:47:41.582406 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 02:47:41.582413 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 27 02:47:41.582420 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 27 02:47:41.582426 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 27 02:47:41.582432 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 27 02:47:41.582438 systemd[1]: Stopped systemd-fsck-usr.service. May 27 02:47:41.582444 kernel: fuse: init (API version 7.41) May 27 02:47:41.582450 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 02:47:41.582456 systemd[1]: Starting systemd-journald.service - Journal Service... May 27 02:47:41.582462 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 27 02:47:41.582469 kernel: loop: module loaded May 27 02:47:41.582487 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 27 02:47:41.582494 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 27 02:47:41.582500 kernel: ACPI: bus type drm_connector registered May 27 02:47:41.582506 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 27 02:47:41.582512 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 02:47:41.582518 systemd[1]: verity-setup.service: Deactivated successfully. May 27 02:47:41.582525 systemd[1]: Stopped verity-setup.service. May 27 02:47:41.582532 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 27 02:47:41.582538 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 27 02:47:41.582558 systemd-journald[1391]: Collecting audit messages is disabled. May 27 02:47:41.582573 systemd-journald[1391]: Journal started May 27 02:47:41.582590 systemd-journald[1391]: Runtime Journal (/run/log/journal/373d22c474e84138976b6d5fc813c898) is 8M, max 78.5M, 70.5M free. May 27 02:47:40.793009 systemd[1]: Queued start job for default target multi-user.target. May 27 02:47:40.811169 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. May 27 02:47:40.811649 systemd[1]: systemd-journald.service: Deactivated successfully. May 27 02:47:40.811976 systemd[1]: systemd-journald.service: Consumed 2.515s CPU time. May 27 02:47:41.596884 systemd[1]: Started systemd-journald.service - Journal Service. May 27 02:47:41.597793 systemd[1]: Mounted media.mount - External Media Directory. May 27 02:47:41.601957 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 27 02:47:41.607005 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 27 02:47:41.612028 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. 
May 27 02:47:41.616414 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 27 02:47:41.622021 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 27 02:47:41.627355 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 27 02:47:41.627535 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 27 02:47:41.632241 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 02:47:41.632378 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 02:47:41.637168 systemd[1]: modprobe@drm.service: Deactivated successfully. May 27 02:47:41.637308 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 27 02:47:41.642112 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 02:47:41.642260 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 02:47:41.647378 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 27 02:47:41.647619 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 27 02:47:41.653082 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 02:47:41.653231 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 02:47:41.657858 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 02:47:41.662918 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 02:47:41.669241 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 27 02:47:41.674721 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 27 02:47:41.680121 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 02:47:41.694497 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 02:47:41.701447 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 27 02:47:41.715622 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 27 02:47:41.719976 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 27 02:47:41.720013 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 02:47:41.724915 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 27 02:47:41.731303 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 27 02:47:41.735576 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 02:47:41.736729 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 27 02:47:41.742154 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 27 02:47:41.746873 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 27 02:47:41.748857 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 27 02:47:41.754368 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 27 02:47:41.755423 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
May 27 02:47:41.760823 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 27 02:47:41.767222 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 27 02:47:41.774471 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 27 02:47:41.780996 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 27 02:47:41.784528 systemd-journald[1391]: Time spent on flushing to /var/log/journal/373d22c474e84138976b6d5fc813c898 is 12.651ms for 941 entries. May 27 02:47:41.784528 systemd-journald[1391]: System Journal (/var/log/journal/373d22c474e84138976b6d5fc813c898) is 8M, max 2.6G, 2.6G free. May 27 02:47:41.862396 systemd-journald[1391]: Received client request to flush runtime journal. May 27 02:47:41.862472 kernel: loop0: detected capacity change from 0 to 28640 May 27 02:47:41.794452 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 27 02:47:41.806157 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 27 02:47:41.816647 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 27 02:47:41.864545 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 27 02:47:41.886709 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 02:47:41.911914 systemd-tmpfiles[1442]: ACLs are not supported, ignoring. May 27 02:47:41.911930 systemd-tmpfiles[1442]: ACLs are not supported, ignoring. May 27 02:47:41.915931 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 02:47:41.923803 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 27 02:47:41.930183 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 27 02:47:41.931934 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 27 02:47:42.261517 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 27 02:47:42.302795 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 27 02:47:42.311215 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 27 02:47:42.337309 systemd-tmpfiles[1459]: ACLs are not supported, ignoring. May 27 02:47:42.337327 systemd-tmpfiles[1459]: ACLs are not supported, ignoring. May 27 02:47:42.340927 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 02:47:42.351540 kernel: loop1: detected capacity change from 0 to 107312 May 27 02:47:42.720503 kernel: loop2: detected capacity change from 0 to 138376 May 27 02:47:42.854611 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 27 02:47:42.862400 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 02:47:42.890544 systemd-udevd[1464]: Using default interface naming scheme 'v255'. May 27 02:47:43.052597 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 02:47:43.072055 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 02:47:43.125050 systemd[1]: Starting systemd-userdbd.service - User Database Manager... 
May 27 02:47:43.145782 kernel: loop3: detected capacity change from 0 to 207008 May 27 02:47:43.164805 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. May 27 02:47:43.185502 kernel: loop4: detected capacity change from 0 to 28640 May 27 02:47:43.196510 kernel: loop5: detected capacity change from 0 to 107312 May 27 02:47:43.206502 kernel: loop6: detected capacity change from 0 to 138376 May 27 02:47:43.224508 kernel: loop7: detected capacity change from 0 to 207008 May 27 02:47:43.230768 (sd-merge)[1498]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'. May 27 02:47:43.231205 (sd-merge)[1498]: Merged extensions into '/usr'. May 27 02:47:43.236693 systemd[1]: Reload requested from client PID 1440 ('systemd-sysext') (unit systemd-sysext.service)... May 27 02:47:43.236708 systemd[1]: Reloading... May 27 02:47:43.259909 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#54 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 May 27 02:47:43.275763 kernel: mousedev: PS/2 mouse device common for all mice May 27 02:47:43.330558 zram_generator::config[1544]: No configuration found. May 27 02:47:43.391802 kernel: hv_vmbus: registering driver hyperv_fb May 27 02:47:43.391936 kernel: hv_vmbus: registering driver hv_balloon May 27 02:47:43.391954 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0 May 27 02:47:43.400082 kernel: hv_balloon: Memory hot add disabled on ARM64 May 27 02:47:43.443051 kernel: hyperv_fb: Synthvid Version major 3, minor 5 May 27 02:47:43.450153 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608 May 27 02:47:43.459541 kernel: Console: switching to colour dummy device 80x25 May 27 02:47:43.459831 kernel: Console: switching to colour frame buffer device 128x48 May 27 02:47:43.453342 systemd-networkd[1487]: lo: Link UP May 27 02:47:43.453352 systemd-networkd[1487]: lo: Gained carrier May 27 02:47:43.469879 systemd-networkd[1487]: Enumeration completed May 27 02:47:43.474027 systemd-networkd[1487]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 02:47:43.474115 systemd-networkd[1487]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 02:47:43.517320 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 02:47:43.557525 kernel: mlx5_core 0f90:00:02.0 enP3984s1: Link up May 27 02:47:43.578498 kernel: hv_netvsc 000d3ac5-3511-000d-3ac5-3511000d3ac5 eth0: Data path switched to VF: enP3984s1 May 27 02:47:43.579958 systemd-networkd[1487]: enP3984s1: Link UP May 27 02:47:43.580837 systemd-networkd[1487]: eth0: Link UP May 27 02:47:43.581566 systemd-networkd[1487]: eth0: Gained carrier May 27 02:47:43.581683 systemd-networkd[1487]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 02:47:43.591834 systemd-networkd[1487]: enP3984s1: Gained carrier May 27 02:47:43.601615 systemd-networkd[1487]: eth0: DHCPv4 address 10.200.20.22/24, gateway 10.200.20.1 acquired from 168.63.129.16 May 27 02:47:43.652505 kernel: MACsec IEEE 802.1AE May 27 02:47:43.662191 systemd[1]: Reloading finished in 425 ms. May 27 02:47:43.684197 systemd[1]: Started systemd-userdbd.service - User Database Manager. 
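The sd-merge messages above are systemd-sysext combining the extension images 'containerd-flatcar', 'docker-flatcar', 'kubernetes' and 'oem-azure' into /usr. The sketch below only enumerates the raw images such a merge would consider; the search directories listed are the documented ones but not exhaustive, and the real worker additionally validates each image's extension-release metadata against the host before merging.

# Hedged sketch: list candidate sysext images and resolve their symlink targets.
import os

SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

def list_sysext_images():
    images = []
    for d in SEARCH_DIRS:
        if not os.path.isdir(d):
            continue
        for entry in sorted(os.listdir(d)):
            if entry.endswith(".raw"):
                path = os.path.join(d, entry)
                # Ignition created /etc/extensions/kubernetes.raw as a symlink
                # to /opt/extensions/...; resolve it for display.
                images.append((entry, os.path.realpath(path)))
    return images

if __name__ == "__main__":
    for name, target in list_sysext_images():
        print(f"{name} -> {target}")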
May 27 02:47:43.688897 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 02:47:43.693905 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 27 02:47:43.731279 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. May 27 02:47:43.748936 systemd[1]: Starting ensure-sysext.service... May 27 02:47:43.754737 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 27 02:47:43.763590 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 27 02:47:43.772732 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 27 02:47:43.786065 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 27 02:47:43.794212 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 02:47:43.808363 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 27 02:47:43.821216 systemd[1]: Reload requested from client PID 1679 ('systemctl') (unit ensure-sysext.service)... May 27 02:47:43.821374 systemd[1]: Reloading... May 27 02:47:43.829347 systemd-tmpfiles[1684]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 27 02:47:43.829373 systemd-tmpfiles[1684]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 27 02:47:43.829573 systemd-tmpfiles[1684]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 27 02:47:43.829716 systemd-tmpfiles[1684]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 27 02:47:43.830129 systemd-tmpfiles[1684]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 27 02:47:43.830274 systemd-tmpfiles[1684]: ACLs are not supported, ignoring. May 27 02:47:43.830307 systemd-tmpfiles[1684]: ACLs are not supported, ignoring. May 27 02:47:43.864215 systemd-tmpfiles[1684]: Detected autofs mount point /boot during canonicalization of boot. May 27 02:47:43.864229 systemd-tmpfiles[1684]: Skipping /boot May 27 02:47:43.878454 systemd-tmpfiles[1684]: Detected autofs mount point /boot during canonicalization of boot. May 27 02:47:43.878471 systemd-tmpfiles[1684]: Skipping /boot May 27 02:47:43.901498 zram_generator::config[1726]: No configuration found. May 27 02:47:43.973067 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 02:47:44.053361 systemd[1]: Reloading finished in 231 ms. May 27 02:47:44.064973 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 27 02:47:44.084131 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 02:47:44.098230 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 02:47:44.108291 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 27 02:47:44.114664 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 27 02:47:44.133814 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
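The systemd-tmpfiles warnings above ("Duplicate line for path ..., ignoring") come from the same path being declared in more than one tmpfiles.d fragment. The sketch below reproduces only that duplicate-path bookkeeping; the real tool also performs specifier expansion and lets same-named fragments in /etc mask those in /run and /usr/lib, so treat this as an approximation.

# Hedged sketch: report paths declared more than once across tmpfiles.d files.
import glob
import os

def find_duplicate_paths(dirs=("/etc/tmpfiles.d", "/run/tmpfiles.d", "/usr/lib/tmpfiles.d")):
    seen = {}          # path -> "file:lineno" of the first declaration
    duplicates = []
    for d in dirs:
        for conf in sorted(glob.glob(os.path.join(d, "*.conf"))):
            with open(conf) as f:
                for lineno, raw in enumerate(f, start=1):
                    line = raw.strip()
                    if not line or line.startswith("#"):
                        continue
                    fields = line.split()
                    if len(fields) < 2:
                        continue
                    path, where = fields[1], f"{conf}:{lineno}"
                    if path in seen:
                        duplicates.append((where, path, seen[path]))
                    else:
                        seen[path] = where
    return duplicates

for where, path, first in find_duplicate_paths():
    print(f"{where}: duplicate line for path {path!r} (first seen at {first})")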
May 27 02:47:44.139642 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 27 02:47:44.151981 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 02:47:44.154726 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 27 02:47:44.164795 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 02:47:44.173502 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 02:47:44.181434 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 02:47:44.181593 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 02:47:44.185874 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 02:47:44.187439 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 02:47:44.197183 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 02:47:44.197663 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 02:47:44.206546 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 02:47:44.206976 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 02:47:44.219537 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 02:47:44.236120 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 02:47:44.239818 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 27 02:47:44.252647 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 27 02:47:44.259540 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 02:47:44.267737 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 02:47:44.273284 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 02:47:44.273430 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 02:47:44.273549 systemd[1]: Reached target time-set.target - System Time Set. May 27 02:47:44.276960 systemd-resolved[1782]: Positive Trust Anchors: May 27 02:47:44.277297 systemd-resolved[1782]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 02:47:44.277321 systemd-resolved[1782]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 02:47:44.281849 systemd-resolved[1782]: Using system hostname 'ci-4344.0.0-a-583de22c75'. 
May 27 02:47:44.285113 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 27 02:47:44.291289 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 02:47:44.297054 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 27 02:47:44.303157 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 02:47:44.303387 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 02:47:44.308663 systemd[1]: modprobe@drm.service: Deactivated successfully. May 27 02:47:44.308822 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 27 02:47:44.313897 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 02:47:44.314052 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 02:47:44.319897 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 02:47:44.320062 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 02:47:44.333077 systemd[1]: Finished ensure-sysext.service. May 27 02:47:44.339883 systemd[1]: Reached target network.target - Network. May 27 02:47:44.340117 augenrules[1821]: No rules May 27 02:47:44.343750 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 02:47:44.348776 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 27 02:47:44.348846 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 27 02:47:44.349117 systemd[1]: audit-rules.service: Deactivated successfully. May 27 02:47:44.350518 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 02:47:44.589990 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 27 02:47:44.595543 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 27 02:47:45.199723 systemd-networkd[1487]: eth0: Gained IPv6LL May 27 02:47:45.200139 systemd-networkd[1487]: enP3984s1: Gained IPv6LL May 27 02:47:45.201949 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 27 02:47:45.207996 systemd[1]: Reached target network-online.target - Network is Online. May 27 02:47:47.819502 ldconfig[1435]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 27 02:47:47.836992 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 27 02:47:47.843765 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 27 02:47:47.862857 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 27 02:47:47.868207 systemd[1]: Reached target sysinit.target - System Initialization. May 27 02:47:47.873450 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 27 02:47:47.879163 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 27 02:47:47.885625 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 27 02:47:47.890143 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. 
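The ldconfig complaint above ("/usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start") boils down to a header check: shared objects must begin with the 4-byte ELF magic. A minimal version of that check is sketched below; the second candidate path is purely an illustrative example.

# Hedged sketch: classify files by whether they start with the ELF magic bytes.
ELF_MAGIC = b"\x7fELF"

def is_elf(path: str) -> bool:
    try:
        with open(path, "rb") as f:
            return f.read(4) == ELF_MAGIC
    except OSError:
        return False

for candidate in ("/usr/lib/ld.so.conf", "/usr/lib64/libc.so.6"):
    print(candidate, "ELF" if is_elf(candidate) else "not an ELF file")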
May 27 02:47:47.895234 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 27 02:47:47.900668 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 27 02:47:47.900707 systemd[1]: Reached target paths.target - Path Units. May 27 02:47:47.904549 systemd[1]: Reached target timers.target - Timer Units. May 27 02:47:47.909735 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 27 02:47:47.916286 systemd[1]: Starting docker.socket - Docker Socket for the API... May 27 02:47:47.922775 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 27 02:47:47.928575 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 27 02:47:47.934039 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 27 02:47:47.940549 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 27 02:47:47.945935 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 27 02:47:47.951640 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 27 02:47:47.955959 systemd[1]: Reached target sockets.target - Socket Units. May 27 02:47:47.960124 systemd[1]: Reached target basic.target - Basic System. May 27 02:47:47.964071 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 27 02:47:47.964096 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 27 02:47:47.966346 systemd[1]: Starting chronyd.service - NTP client/server... May 27 02:47:47.979432 systemd[1]: Starting containerd.service - containerd container runtime... May 27 02:47:47.997021 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 27 02:47:48.005826 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 27 02:47:48.010709 (chronyd)[1836]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS May 27 02:47:48.012257 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 27 02:47:48.025468 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 27 02:47:48.031655 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 27 02:47:48.034101 jq[1844]: false May 27 02:47:48.036097 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 27 02:47:48.039419 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:47:48.050705 chronyd[1848]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG) May 27 02:47:48.053144 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 27 02:47:48.062665 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 27 02:47:48.069674 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 27 02:47:48.083924 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 27 02:47:48.093733 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
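chronyd is started above as the NTP client/server. As a rough illustration of what an NTP exchange looks like on the wire, the sketch below sends a single SNTP client request and decodes the server's transmit timestamp; the server name and timeout are assumptions, and chronyd does far more than this one-shot query (source selection, filtering, clock slewing, drift tracking).

# Hedged sketch: one-shot SNTP query (RFC 4330 style), for illustration only.
import socket
import struct
import time

NTP_SERVER = "pool.ntp.org"        # assumed server
NTP_EPOCH_OFFSET = 2208988800      # seconds between 1900-01-01 and 1970-01-01

def sntp_time(server: str = NTP_SERVER, timeout: float = 5.0) -> int:
    packet = bytearray(48)
    packet[0] = (0 << 6) | (4 << 3) | 3   # LI=0, VN=4, Mode=3 (client)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        s.sendto(packet, (server, 123))
        data, _ = s.recvfrom(512)
    secs = struct.unpack("!I", data[40:44])[0]   # transmit timestamp, seconds field
    return secs - NTP_EPOCH_OFFSET

if __name__ == "__main__":
    print(time.ctime(sntp_time()))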
May 27 02:47:48.100282 chronyd[1848]: Timezone right/UTC failed leap second check, ignoring May 27 02:47:48.100494 chronyd[1848]: Loaded seccomp filter (level 2) May 27 02:47:48.105694 systemd[1]: Starting systemd-logind.service - User Login Management... May 27 02:47:48.112513 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 27 02:47:48.113013 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 27 02:47:48.114746 systemd[1]: Starting update-engine.service - Update Engine... May 27 02:47:48.127052 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 27 02:47:48.134767 systemd[1]: Started chronyd.service - NTP client/server. May 27 02:47:48.140432 extend-filesystems[1845]: Found loop4 May 27 02:47:48.143421 extend-filesystems[1845]: Found loop5 May 27 02:47:48.143421 extend-filesystems[1845]: Found loop6 May 27 02:47:48.143421 extend-filesystems[1845]: Found loop7 May 27 02:47:48.143421 extend-filesystems[1845]: Found sda May 27 02:47:48.143421 extend-filesystems[1845]: Found sda1 May 27 02:47:48.143421 extend-filesystems[1845]: Found sda2 May 27 02:47:48.143421 extend-filesystems[1845]: Found sda3 May 27 02:47:48.143421 extend-filesystems[1845]: Found usr May 27 02:47:48.143421 extend-filesystems[1845]: Found sda4 May 27 02:47:48.143421 extend-filesystems[1845]: Found sda6 May 27 02:47:48.143421 extend-filesystems[1845]: Found sda7 May 27 02:47:48.143421 extend-filesystems[1845]: Found sda9 May 27 02:47:48.143421 extend-filesystems[1845]: Checking size of /dev/sda9 May 27 02:47:48.319032 extend-filesystems[1845]: Old size kept for /dev/sda9 May 27 02:47:48.319032 extend-filesystems[1845]: Found sr0 May 27 02:47:48.150513 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 27 02:47:48.349556 update_engine[1864]: I20250527 02:47:48.246902 1864 main.cc:92] Flatcar Update Engine starting May 27 02:47:48.168775 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 27 02:47:48.350415 jq[1869]: true May 27 02:47:48.169111 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 27 02:47:48.173375 systemd[1]: motdgen.service: Deactivated successfully. May 27 02:47:48.173589 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 27 02:47:48.351936 tar[1876]: linux-arm64/LICENSE May 27 02:47:48.351936 tar[1876]: linux-arm64/helm May 27 02:47:48.188933 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 27 02:47:48.189124 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 27 02:47:48.204056 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 27 02:47:48.352331 jq[1878]: true May 27 02:47:48.224903 (ntainerd)[1879]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 27 02:47:48.266605 systemd-logind[1862]: New seat seat0. May 27 02:47:48.273018 systemd[1]: extend-filesystems.service: Deactivated successfully. May 27 02:47:48.273221 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
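extend-filesystems above checks the size of /dev/sda9 and keeps the old size, meaning the root filesystem already fills its partition. The sketch below shows the underlying comparison between block-device size and filesystem size; the mount point, the lack of a slack threshold, and the absence of an actual resize step are simplifications, and a real implementation would grow the filesystem with filesystem-specific tooling when the device is larger.

# Hedged sketch: decide whether a filesystem could be grown to fill its device.
import os

def device_size(dev: str) -> int:
    fd = os.open(dev, os.O_RDONLY)
    try:
        return os.lseek(fd, 0, os.SEEK_END)    # device size in bytes
    finally:
        os.close(fd)

def filesystem_size(mountpoint: str) -> int:
    st = os.statvfs(mountpoint)
    return st.f_frsize * st.f_blocks           # filesystem size in bytes

dev, mnt = "/dev/sda9", "/"                    # device from the log; "/" is assumed
if device_size(dev) > filesystem_size(mnt):
    print("device larger than filesystem: would resize")
else:
    print("old size kept for", dev)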
May 27 02:47:48.273839 systemd-logind[1862]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 27 02:47:48.284048 systemd[1]: Started systemd-logind.service - User Login Management. May 27 02:47:48.378554 dbus-daemon[1842]: [system] SELinux support is enabled May 27 02:47:48.378788 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 27 02:47:48.389696 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 27 02:47:48.396043 update_engine[1864]: I20250527 02:47:48.395700 1864 update_check_scheduler.cc:74] Next update check in 8m13s May 27 02:47:48.390119 dbus-daemon[1842]: [system] Successfully activated service 'org.freedesktop.systemd1' May 27 02:47:48.389732 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 27 02:47:48.398949 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 27 02:47:48.398973 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 27 02:47:48.407922 systemd[1]: Started update-engine.service - Update Engine. May 27 02:47:48.416757 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 27 02:47:48.444402 bash[1921]: Updated "/home/core/.ssh/authorized_keys" May 27 02:47:48.444505 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 27 02:47:48.453410 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. May 27 02:47:48.474915 coreos-metadata[1838]: May 27 02:47:48.474 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 May 27 02:47:48.481781 coreos-metadata[1838]: May 27 02:47:48.481 INFO Fetch successful May 27 02:47:48.481781 coreos-metadata[1838]: May 27 02:47:48.481 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 May 27 02:47:48.491507 coreos-metadata[1838]: May 27 02:47:48.491 INFO Fetch successful May 27 02:47:48.491507 coreos-metadata[1838]: May 27 02:47:48.491 INFO Fetching http://168.63.129.16/machine/b62b392c-4579-45b3-9199-a62f93cdd9c2/868726ca%2Dd715%2D4c7d%2D81dc%2Dac5d5be6bb39.%5Fci%2D4344.0.0%2Da%2D583de22c75?comp=config&type=sharedConfig&incarnation=1: Attempt #1 May 27 02:47:48.528193 coreos-metadata[1838]: May 27 02:47:48.527 INFO Fetch successful May 27 02:47:48.528193 coreos-metadata[1838]: May 27 02:47:48.527 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 May 27 02:47:48.548571 coreos-metadata[1838]: May 27 02:47:48.547 INFO Fetch successful May 27 02:47:48.618530 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 27 02:47:48.630398 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 27 02:47:48.692493 locksmithd[1931]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 27 02:47:48.749962 sshd_keygen[1866]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 27 02:47:48.780564 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 27 02:47:48.788755 systemd[1]: Starting issuegen.service - Generate /run/issue... 
May 27 02:47:48.799563 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... May 27 02:47:48.831294 systemd[1]: issuegen.service: Deactivated successfully. May 27 02:47:48.834076 systemd[1]: Finished issuegen.service - Generate /run/issue. May 27 02:47:48.850770 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 27 02:47:48.867550 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. May 27 02:47:48.896055 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 27 02:47:48.905868 systemd[1]: Started getty@tty1.service - Getty on tty1. May 27 02:47:48.915799 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. May 27 02:47:48.920926 systemd[1]: Reached target getty.target - Login Prompts. May 27 02:47:48.954103 containerd[1879]: time="2025-05-27T02:47:48Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 27 02:47:48.955136 containerd[1879]: time="2025-05-27T02:47:48.955091564Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 27 02:47:48.968748 containerd[1879]: time="2025-05-27T02:47:48.968695436Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.024µs" May 27 02:47:48.968748 containerd[1879]: time="2025-05-27T02:47:48.968737924Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 27 02:47:48.968748 containerd[1879]: time="2025-05-27T02:47:48.968754660Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 27 02:47:48.968974 containerd[1879]: time="2025-05-27T02:47:48.968947524Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 27 02:47:48.968974 containerd[1879]: time="2025-05-27T02:47:48.968965172Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 27 02:47:48.969014 containerd[1879]: time="2025-05-27T02:47:48.968988500Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 02:47:48.969060 containerd[1879]: time="2025-05-27T02:47:48.969033052Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 02:47:48.969060 containerd[1879]: time="2025-05-27T02:47:48.969042500Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 02:47:48.969281 containerd[1879]: time="2025-05-27T02:47:48.969261452Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 02:47:48.969281 containerd[1879]: time="2025-05-27T02:47:48.969277748Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 02:47:48.969313 containerd[1879]: time="2025-05-27T02:47:48.969285820Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 
02:47:48.969313 containerd[1879]: time="2025-05-27T02:47:48.969291092Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 27 02:47:48.969367 containerd[1879]: time="2025-05-27T02:47:48.969357460Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 27 02:47:48.972985 containerd[1879]: time="2025-05-27T02:47:48.972939252Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 02:47:48.973111 containerd[1879]: time="2025-05-27T02:47:48.973007588Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 02:47:48.973111 containerd[1879]: time="2025-05-27T02:47:48.973017476Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 27 02:47:48.974821 containerd[1879]: time="2025-05-27T02:47:48.974000676Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 27 02:47:48.974821 containerd[1879]: time="2025-05-27T02:47:48.974198452Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 27 02:47:48.974821 containerd[1879]: time="2025-05-27T02:47:48.974311124Z" level=info msg="metadata content store policy set" policy=shared May 27 02:47:48.993039 containerd[1879]: time="2025-05-27T02:47:48.992979140Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 27 02:47:48.993395 containerd[1879]: time="2025-05-27T02:47:48.993378948Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 27 02:47:48.993459 containerd[1879]: time="2025-05-27T02:47:48.993414716Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 27 02:47:48.993459 containerd[1879]: time="2025-05-27T02:47:48.993424484Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 27 02:47:48.993459 containerd[1879]: time="2025-05-27T02:47:48.993433508Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 27 02:47:48.993459 containerd[1879]: time="2025-05-27T02:47:48.993444916Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 27 02:47:48.993459 containerd[1879]: time="2025-05-27T02:47:48.993453508Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 27 02:47:48.993550 containerd[1879]: time="2025-05-27T02:47:48.993466068Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 27 02:47:48.993550 containerd[1879]: time="2025-05-27T02:47:48.993487460Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 27 02:47:48.993550 containerd[1879]: time="2025-05-27T02:47:48.993494716Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 27 02:47:48.993550 containerd[1879]: time="2025-05-27T02:47:48.993502628Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 27 02:47:48.993550 containerd[1879]: 
time="2025-05-27T02:47:48.993511940Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 27 02:47:48.994026 containerd[1879]: time="2025-05-27T02:47:48.993960236Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 27 02:47:48.994062 containerd[1879]: time="2025-05-27T02:47:48.994034204Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 27 02:47:48.994062 containerd[1879]: time="2025-05-27T02:47:48.994051164Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 27 02:47:48.994062 containerd[1879]: time="2025-05-27T02:47:48.994058940Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 27 02:47:48.994107 containerd[1879]: time="2025-05-27T02:47:48.994066604Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 27 02:47:48.994107 containerd[1879]: time="2025-05-27T02:47:48.994075212Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 27 02:47:48.994241 containerd[1879]: time="2025-05-27T02:47:48.994227580Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 27 02:47:48.994260 containerd[1879]: time="2025-05-27T02:47:48.994241748Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 27 02:47:48.994260 containerd[1879]: time="2025-05-27T02:47:48.994250484Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 27 02:47:48.994260 containerd[1879]: time="2025-05-27T02:47:48.994257588Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 27 02:47:48.994296 containerd[1879]: time="2025-05-27T02:47:48.994264420Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 27 02:47:48.994570 containerd[1879]: time="2025-05-27T02:47:48.994540652Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 27 02:47:48.994611 containerd[1879]: time="2025-05-27T02:47:48.994571356Z" level=info msg="Start snapshots syncer" May 27 02:47:48.994611 containerd[1879]: time="2025-05-27T02:47:48.994599092Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 27 02:47:48.995544 containerd[1879]: time="2025-05-27T02:47:48.995151436Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 27 02:47:48.995544 containerd[1879]: time="2025-05-27T02:47:48.995376748Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 27 02:47:48.995678 containerd[1879]: time="2025-05-27T02:47:48.995648212Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 27 02:47:48.996186 containerd[1879]: time="2025-05-27T02:47:48.996159860Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 27 02:47:48.996231 containerd[1879]: time="2025-05-27T02:47:48.996191828Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 27 02:47:48.996231 containerd[1879]: time="2025-05-27T02:47:48.996200764Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 27 02:47:48.996231 containerd[1879]: time="2025-05-27T02:47:48.996210364Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 27 02:47:48.996270 containerd[1879]: time="2025-05-27T02:47:48.996256764Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 27 02:47:48.996285 containerd[1879]: time="2025-05-27T02:47:48.996269876Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 27 02:47:48.996285 containerd[1879]: time="2025-05-27T02:47:48.996278916Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 27 02:47:48.996318 containerd[1879]: time="2025-05-27T02:47:48.996307132Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 27 02:47:48.996495 containerd[1879]: 
time="2025-05-27T02:47:48.996449940Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 27 02:47:48.996495 containerd[1879]: time="2025-05-27T02:47:48.996464636Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 27 02:47:48.996770 containerd[1879]: time="2025-05-27T02:47:48.996659820Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 02:47:48.996770 containerd[1879]: time="2025-05-27T02:47:48.996681788Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 02:47:48.996770 containerd[1879]: time="2025-05-27T02:47:48.996688700Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 02:47:48.996770 containerd[1879]: time="2025-05-27T02:47:48.996737444Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 02:47:48.996770 containerd[1879]: time="2025-05-27T02:47:48.996743932Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 27 02:47:48.996770 containerd[1879]: time="2025-05-27T02:47:48.996750932Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 27 02:47:48.996770 containerd[1879]: time="2025-05-27T02:47:48.996759788Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 27 02:47:48.996870 containerd[1879]: time="2025-05-27T02:47:48.996776388Z" level=info msg="runtime interface created" May 27 02:47:48.996870 containerd[1879]: time="2025-05-27T02:47:48.996780412Z" level=info msg="created NRI interface" May 27 02:47:48.997232 containerd[1879]: time="2025-05-27T02:47:48.996790860Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 27 02:47:48.997232 containerd[1879]: time="2025-05-27T02:47:48.996949420Z" level=info msg="Connect containerd service" May 27 02:47:48.997232 containerd[1879]: time="2025-05-27T02:47:48.997109764Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 27 02:47:48.999530 containerd[1879]: time="2025-05-27T02:47:48.999493636Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 02:47:49.075663 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:47:49.090092 (kubelet)[2030]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 02:47:49.095014 tar[1876]: linux-arm64/README.md May 27 02:47:49.115118 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
May 27 02:47:49.362991 kubelet[2030]: E0527 02:47:49.362839 2030 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 02:47:49.364950 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 02:47:49.365072 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 02:47:49.366581 systemd[1]: kubelet.service: Consumed 568ms CPU time, 255M memory peak. May 27 02:47:49.823898 containerd[1879]: time="2025-05-27T02:47:49.823784844Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 27 02:47:49.823898 containerd[1879]: time="2025-05-27T02:47:49.823849068Z" level=info msg=serving... address=/run/containerd/containerd.sock May 27 02:47:49.823898 containerd[1879]: time="2025-05-27T02:47:49.823873892Z" level=info msg="Start subscribing containerd event" May 27 02:47:49.824030 containerd[1879]: time="2025-05-27T02:47:49.823913252Z" level=info msg="Start recovering state" May 27 02:47:49.824030 containerd[1879]: time="2025-05-27T02:47:49.823985756Z" level=info msg="Start event monitor" May 27 02:47:49.824030 containerd[1879]: time="2025-05-27T02:47:49.823995940Z" level=info msg="Start cni network conf syncer for default" May 27 02:47:49.824030 containerd[1879]: time="2025-05-27T02:47:49.824001380Z" level=info msg="Start streaming server" May 27 02:47:49.824030 containerd[1879]: time="2025-05-27T02:47:49.824006588Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 27 02:47:49.824030 containerd[1879]: time="2025-05-27T02:47:49.824011588Z" level=info msg="runtime interface starting up..." May 27 02:47:49.824030 containerd[1879]: time="2025-05-27T02:47:49.824015100Z" level=info msg="starting plugins..." May 27 02:47:49.824030 containerd[1879]: time="2025-05-27T02:47:49.824025828Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 27 02:47:49.824133 containerd[1879]: time="2025-05-27T02:47:49.824126956Z" level=info msg="containerd successfully booted in 0.870515s" May 27 02:47:49.824651 systemd[1]: Started containerd.service - containerd container runtime. May 27 02:47:49.830018 systemd[1]: Reached target multi-user.target - Multi-User System. May 27 02:47:49.839859 systemd[1]: Startup finished in 1.669s (kernel) + 12.135s (initrd) + 11.794s (userspace) = 25.599s. May 27 02:47:50.042162 login[2020]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying May 27 02:47:50.043037 login[2019]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) May 27 02:47:50.064326 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 27 02:47:50.065385 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 27 02:47:50.071053 systemd-logind[1862]: New session 1 of user core. May 27 02:47:50.083034 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 27 02:47:50.086783 systemd[1]: Starting user@500.service - User Manager for UID 500... May 27 02:47:50.101072 (systemd)[2056]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 27 02:47:50.103343 systemd-logind[1862]: New session c1 of user core. 
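The kubelet failure above (and on every scheduled restart later in the log) is the normal state of a node that has not joined a cluster yet: /var/lib/kubelet/config.yaml is written by kubeadm during init/join. Only as a sketch of the file format, with placeholder values rather than this node's eventual configuration, a minimal KubeletConfiguration would look like this; cgroupDriver matches the SystemdCgroup=true setting containerd reported above:

# Illustrative only: kubeadm generates the real file during "init"/"join".
mkdir -p /var/lib/kubelet
cat >/var/lib/kubelet/config.yaml <<'EOF'
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd
containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
failSwapOn: false
EOF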
May 27 02:47:50.261130 systemd[2056]: Queued start job for default target default.target. May 27 02:47:50.268328 systemd[2056]: Created slice app.slice - User Application Slice. May 27 02:47:50.268356 systemd[2056]: Reached target paths.target - Paths. May 27 02:47:50.268389 systemd[2056]: Reached target timers.target - Timers. May 27 02:47:50.269572 systemd[2056]: Starting dbus.socket - D-Bus User Message Bus Socket... May 27 02:47:50.280360 systemd[2056]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 27 02:47:50.280464 systemd[2056]: Reached target sockets.target - Sockets. May 27 02:47:50.280526 systemd[2056]: Reached target basic.target - Basic System. May 27 02:47:50.280547 systemd[2056]: Reached target default.target - Main User Target. May 27 02:47:50.280568 systemd[2056]: Startup finished in 171ms. May 27 02:47:50.280705 systemd[1]: Started user@500.service - User Manager for UID 500. May 27 02:47:50.282271 systemd[1]: Started session-1.scope - Session 1 of User core. May 27 02:47:50.609868 waagent[2017]: 2025-05-27T02:47:50.605531Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 May 27 02:47:50.610469 waagent[2017]: 2025-05-27T02:47:50.610408Z INFO Daemon Daemon OS: flatcar 4344.0.0 May 27 02:47:50.613986 waagent[2017]: 2025-05-27T02:47:50.613919Z INFO Daemon Daemon Python: 3.11.12 May 27 02:47:50.617776 waagent[2017]: 2025-05-27T02:47:50.617708Z INFO Daemon Daemon Run daemon May 27 02:47:50.621030 waagent[2017]: 2025-05-27T02:47:50.620969Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4344.0.0' May 27 02:47:50.628038 waagent[2017]: 2025-05-27T02:47:50.627973Z INFO Daemon Daemon Using waagent for provisioning May 27 02:47:50.635458 waagent[2017]: 2025-05-27T02:47:50.635384Z INFO Daemon Daemon Activate resource disk May 27 02:47:50.639565 waagent[2017]: 2025-05-27T02:47:50.639498Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb May 27 02:47:50.648978 waagent[2017]: 2025-05-27T02:47:50.648897Z INFO Daemon Daemon Found device: None May 27 02:47:50.652561 waagent[2017]: 2025-05-27T02:47:50.652499Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology May 27 02:47:50.659355 waagent[2017]: 2025-05-27T02:47:50.659278Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 May 27 02:47:50.669371 waagent[2017]: 2025-05-27T02:47:50.669301Z INFO Daemon Daemon Clean protocol and wireserver endpoint May 27 02:47:50.674054 waagent[2017]: 2025-05-27T02:47:50.673993Z INFO Daemon Daemon Running default provisioning handler May 27 02:47:50.684211 waagent[2017]: 2025-05-27T02:47:50.684143Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. 
May 27 02:47:50.695146 waagent[2017]: 2025-05-27T02:47:50.695079Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' May 27 02:47:50.703121 waagent[2017]: 2025-05-27T02:47:50.703035Z INFO Daemon Daemon cloud-init is enabled: False May 27 02:47:50.707523 waagent[2017]: 2025-05-27T02:47:50.707427Z INFO Daemon Daemon Copying ovf-env.xml May 27 02:47:50.781529 waagent[2017]: 2025-05-27T02:47:50.781427Z INFO Daemon Daemon Successfully mounted dvd May 27 02:47:50.809134 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. May 27 02:47:50.811610 waagent[2017]: 2025-05-27T02:47:50.810789Z INFO Daemon Daemon Detect protocol endpoint May 27 02:47:50.814963 waagent[2017]: 2025-05-27T02:47:50.814886Z INFO Daemon Daemon Clean protocol and wireserver endpoint May 27 02:47:50.819516 waagent[2017]: 2025-05-27T02:47:50.819404Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler May 27 02:47:50.824838 waagent[2017]: 2025-05-27T02:47:50.824774Z INFO Daemon Daemon Test for route to 168.63.129.16 May 27 02:47:50.828883 waagent[2017]: 2025-05-27T02:47:50.828802Z INFO Daemon Daemon Route to 168.63.129.16 exists May 27 02:47:50.833041 waagent[2017]: 2025-05-27T02:47:50.832978Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 May 27 02:47:50.878996 waagent[2017]: 2025-05-27T02:47:50.878869Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 May 27 02:47:50.884831 waagent[2017]: 2025-05-27T02:47:50.884797Z INFO Daemon Daemon Wire protocol version:2012-11-30 May 27 02:47:50.888997 waagent[2017]: 2025-05-27T02:47:50.888917Z INFO Daemon Daemon Server preferred version:2015-04-05 May 27 02:47:50.975785 waagent[2017]: 2025-05-27T02:47:50.975691Z INFO Daemon Daemon Initializing goal state during protocol detection May 27 02:47:50.984306 waagent[2017]: 2025-05-27T02:47:50.984199Z INFO Daemon Daemon Forcing an update of the goal state. May 27 02:47:50.992392 waagent[2017]: 2025-05-27T02:47:50.992332Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] May 27 02:47:51.043452 login[2020]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) May 27 02:47:51.047908 systemd-logind[1862]: New session 2 of user core. May 27 02:47:51.057912 systemd[1]: Started session-2.scope - Session 2 of User core. May 27 02:47:51.061359 waagent[2017]: 2025-05-27T02:47:51.059091Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.164 May 27 02:47:51.067815 waagent[2017]: 2025-05-27T02:47:51.064148Z INFO Daemon May 27 02:47:51.070669 waagent[2017]: 2025-05-27T02:47:51.070602Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: acf546e8-0354-48fd-906b-31428dfc7eb1 eTag: 226020717952583075 source: Fabric] May 27 02:47:51.079728 waagent[2017]: 2025-05-27T02:47:51.079670Z INFO Daemon The vmSettings originated via Fabric; will ignore them. 
May 27 02:47:51.085139 waagent[2017]: 2025-05-27T02:47:51.084979Z INFO Daemon May 27 02:47:51.087167 waagent[2017]: 2025-05-27T02:47:51.087113Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] May 27 02:47:51.096865 waagent[2017]: 2025-05-27T02:47:51.096815Z INFO Daemon Daemon Downloading artifacts profile blob May 27 02:47:51.194255 waagent[2017]: 2025-05-27T02:47:51.194105Z INFO Daemon Downloaded certificate {'thumbprint': '55DA18ACE21E3F6F098E846299EBC98CD01BB5E8', 'hasPrivateKey': False} May 27 02:47:51.202062 waagent[2017]: 2025-05-27T02:47:51.202012Z INFO Daemon Downloaded certificate {'thumbprint': '24CEC8F04F3EDC771DBD9EBE95A0E62B7BE2C573', 'hasPrivateKey': True} May 27 02:47:51.209721 waagent[2017]: 2025-05-27T02:47:51.209670Z INFO Daemon Fetch goal state completed May 27 02:47:51.223296 waagent[2017]: 2025-05-27T02:47:51.223246Z INFO Daemon Daemon Starting provisioning May 27 02:47:51.227282 waagent[2017]: 2025-05-27T02:47:51.227230Z INFO Daemon Daemon Handle ovf-env.xml. May 27 02:47:51.232402 waagent[2017]: 2025-05-27T02:47:51.232363Z INFO Daemon Daemon Set hostname [ci-4344.0.0-a-583de22c75] May 27 02:47:51.252324 waagent[2017]: 2025-05-27T02:47:51.252254Z INFO Daemon Daemon Publish hostname [ci-4344.0.0-a-583de22c75] May 27 02:47:51.257053 waagent[2017]: 2025-05-27T02:47:51.256995Z INFO Daemon Daemon Examine /proc/net/route for primary interface May 27 02:47:51.261821 waagent[2017]: 2025-05-27T02:47:51.261769Z INFO Daemon Daemon Primary interface is [eth0] May 27 02:47:51.272087 systemd-networkd[1487]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 02:47:51.272094 systemd-networkd[1487]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 02:47:51.272129 systemd-networkd[1487]: eth0: DHCP lease lost May 27 02:47:51.273857 waagent[2017]: 2025-05-27T02:47:51.273590Z INFO Daemon Daemon Create user account if not exists May 27 02:47:51.278006 waagent[2017]: 2025-05-27T02:47:51.277955Z INFO Daemon Daemon User core already exists, skip useradd May 27 02:47:51.282320 waagent[2017]: 2025-05-27T02:47:51.282274Z INFO Daemon Daemon Configure sudoer May 27 02:47:51.289430 waagent[2017]: 2025-05-27T02:47:51.289349Z INFO Daemon Daemon Configure sshd May 27 02:47:51.296521 waagent[2017]: 2025-05-27T02:47:51.296407Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. May 27 02:47:51.306305 waagent[2017]: 2025-05-27T02:47:51.306246Z INFO Daemon Daemon Deploy ssh public key. May 27 02:47:51.306462 systemd-networkd[1487]: eth0: DHCPv4 address 10.200.20.22/24, gateway 10.200.20.1 acquired from 168.63.129.16 May 27 02:47:52.424724 waagent[2017]: 2025-05-27T02:47:52.424671Z INFO Daemon Daemon Provisioning complete May 27 02:47:52.438998 waagent[2017]: 2025-05-27T02:47:52.438944Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping May 27 02:47:52.444356 waagent[2017]: 2025-05-27T02:47:52.444309Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. 
May 27 02:47:52.452070 waagent[2017]: 2025-05-27T02:47:52.452030Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent May 27 02:47:52.560092 waagent[2110]: 2025-05-27T02:47:52.560009Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) May 27 02:47:52.560413 waagent[2110]: 2025-05-27T02:47:52.560164Z INFO ExtHandler ExtHandler OS: flatcar 4344.0.0 May 27 02:47:52.560413 waagent[2110]: 2025-05-27T02:47:52.560201Z INFO ExtHandler ExtHandler Python: 3.11.12 May 27 02:47:52.560413 waagent[2110]: 2025-05-27T02:47:52.560237Z INFO ExtHandler ExtHandler CPU Arch: aarch64 May 27 02:47:52.597511 waagent[2110]: 2025-05-27T02:47:52.596709Z INFO ExtHandler ExtHandler Distro: flatcar-4344.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.12; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; May 27 02:47:52.597511 waagent[2110]: 2025-05-27T02:47:52.596967Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 27 02:47:52.597511 waagent[2110]: 2025-05-27T02:47:52.597027Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 May 27 02:47:52.606835 waagent[2110]: 2025-05-27T02:47:52.606763Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] May 27 02:47:52.613253 waagent[2110]: 2025-05-27T02:47:52.613211Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.164 May 27 02:47:52.613796 waagent[2110]: 2025-05-27T02:47:52.613755Z INFO ExtHandler May 27 02:47:52.613853 waagent[2110]: 2025-05-27T02:47:52.613834Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: fd7eee7b-615d-452c-8087-9fa2e3105c7f eTag: 226020717952583075 source: Fabric] May 27 02:47:52.614116 waagent[2110]: 2025-05-27T02:47:52.614088Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. 
May 27 02:47:52.614580 waagent[2110]: 2025-05-27T02:47:52.614549Z INFO ExtHandler May 27 02:47:52.614619 waagent[2110]: 2025-05-27T02:47:52.614604Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] May 27 02:47:52.618700 waagent[2110]: 2025-05-27T02:47:52.618667Z INFO ExtHandler ExtHandler Downloading artifacts profile blob May 27 02:47:52.682884 waagent[2110]: 2025-05-27T02:47:52.682736Z INFO ExtHandler Downloaded certificate {'thumbprint': '55DA18ACE21E3F6F098E846299EBC98CD01BB5E8', 'hasPrivateKey': False} May 27 02:47:52.683203 waagent[2110]: 2025-05-27T02:47:52.683164Z INFO ExtHandler Downloaded certificate {'thumbprint': '24CEC8F04F3EDC771DBD9EBE95A0E62B7BE2C573', 'hasPrivateKey': True} May 27 02:47:52.683579 waagent[2110]: 2025-05-27T02:47:52.683545Z INFO ExtHandler Fetch goal state completed May 27 02:47:52.703494 waagent[2110]: 2025-05-27T02:47:52.703402Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.3.3 11 Feb 2025 (Library: OpenSSL 3.3.3 11 Feb 2025) May 27 02:47:52.707604 waagent[2110]: 2025-05-27T02:47:52.707539Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2110 May 27 02:47:52.707741 waagent[2110]: 2025-05-27T02:47:52.707714Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** May 27 02:47:52.708062 waagent[2110]: 2025-05-27T02:47:52.708033Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** May 27 02:47:52.709272 waagent[2110]: 2025-05-27T02:47:52.709231Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4344.0.0', '', 'Flatcar Container Linux by Kinvolk'] May 27 02:47:52.709676 waagent[2110]: 2025-05-27T02:47:52.709643Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4344.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported May 27 02:47:52.709815 waagent[2110]: 2025-05-27T02:47:52.709795Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False May 27 02:47:52.710278 waagent[2110]: 2025-05-27T02:47:52.710248Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules May 27 02:47:52.755634 waagent[2110]: 2025-05-27T02:47:52.755594Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service May 27 02:47:52.755842 waagent[2110]: 2025-05-27T02:47:52.755814Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup May 27 02:47:52.761259 waagent[2110]: 2025-05-27T02:47:52.760708Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now May 27 02:47:52.766540 systemd[1]: Reload requested from client PID 2127 ('systemctl') (unit waagent.service)... May 27 02:47:52.766553 systemd[1]: Reloading... May 27 02:47:52.850547 zram_generator::config[2164]: No configuration found. May 27 02:47:52.924870 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 02:47:53.008760 systemd[1]: Reloading finished in 241 ms. 
May 27 02:47:53.034616 waagent[2110]: 2025-05-27T02:47:53.034547Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service May 27 02:47:53.034731 waagent[2110]: 2025-05-27T02:47:53.034697Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully May 27 02:47:53.319552 waagent[2110]: 2025-05-27T02:47:53.319382Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. May 27 02:47:53.319788 waagent[2110]: 2025-05-27T02:47:53.319754Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] May 27 02:47:53.320521 waagent[2110]: 2025-05-27T02:47:53.320440Z INFO ExtHandler ExtHandler Starting env monitor service. May 27 02:47:53.320640 waagent[2110]: 2025-05-27T02:47:53.320581Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 27 02:47:53.320735 waagent[2110]: 2025-05-27T02:47:53.320702Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 May 27 02:47:53.320926 waagent[2110]: 2025-05-27T02:47:53.320892Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. May 27 02:47:53.321403 waagent[2110]: 2025-05-27T02:47:53.321354Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. May 27 02:47:53.321528 waagent[2110]: 2025-05-27T02:47:53.321400Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: May 27 02:47:53.321528 waagent[2110]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT May 27 02:47:53.321528 waagent[2110]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0 May 27 02:47:53.321528 waagent[2110]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 May 27 02:47:53.321528 waagent[2110]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 May 27 02:47:53.321528 waagent[2110]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 May 27 02:47:53.321528 waagent[2110]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 May 27 02:47:53.322047 waagent[2110]: 2025-05-27T02:47:53.321943Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 27 02:47:53.322047 waagent[2110]: 2025-05-27T02:47:53.322008Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread May 27 02:47:53.322119 waagent[2110]: 2025-05-27T02:47:53.322096Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 May 27 02:47:53.322254 waagent[2110]: 2025-05-27T02:47:53.322228Z INFO EnvHandler ExtHandler Configure routes May 27 02:47:53.322432 waagent[2110]: 2025-05-27T02:47:53.322388Z INFO ExtHandler ExtHandler Start Extension Telemetry service. May 27 02:47:53.322502 waagent[2110]: 2025-05-27T02:47:53.322446Z INFO EnvHandler ExtHandler Gateway:None May 27 02:47:53.322974 waagent[2110]: 2025-05-27T02:47:53.322921Z INFO EnvHandler ExtHandler Routes:None May 27 02:47:53.323534 waagent[2110]: 2025-05-27T02:47:53.323349Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True May 27 02:47:53.323534 waagent[2110]: 2025-05-27T02:47:53.323398Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. 
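The routing table the MonitorHandler dumps above is raw /proc/net/route output, where addresses are little-endian hex. A small sketch decoding it, which reproduces the DHCP values reported earlier (10.200.20.22/24 via 10.200.20.1 on eth0):

# Decode one little-endian hex address from /proc/net/route.
hex2ip() { printf '%d.%d.%d.%d\n' "0x${1:6:2}" "0x${1:4:2}" "0x${1:2:2}" "0x${1:0:2}"; }

hex2ip 0114C80A   # gateway of the default route -> 10.200.20.1
hex2ip 0014C80A   # on-link network              -> 10.200.20.0 (mask 00FFFFFF = /24)

# The interface carrying the all-zero destination is the default route,
# the same answer the agent's "Primary interface is [eth0]" line reports.
awk '$2 == "00000000" { print $1 }' /proc/net/route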
May 27 02:47:53.323610 waagent[2110]: 2025-05-27T02:47:53.323574Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread May 27 02:47:53.329620 waagent[2110]: 2025-05-27T02:47:53.329569Z INFO ExtHandler ExtHandler May 27 02:47:53.329828 waagent[2110]: 2025-05-27T02:47:53.329796Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 2e2b90eb-bd39-4e94-8b5f-8def469cbe91 correlation 7f78e82b-895e-4db3-92c1-ac7a627b7c5c created: 2025-05-27T02:46:40.555664Z] May 27 02:47:53.330451 waagent[2110]: 2025-05-27T02:47:53.330403Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. May 27 02:47:53.331065 waagent[2110]: 2025-05-27T02:47:53.331025Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 1 ms] May 27 02:47:53.359142 waagent[2110]: 2025-05-27T02:47:53.359067Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command May 27 02:47:53.359142 waagent[2110]: Try `iptables -h' or 'iptables --help' for more information.) May 27 02:47:53.359581 waagent[2110]: 2025-05-27T02:47:53.359545Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 9023363E-BD58-4FF1-8505-9F24902E9BF5;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] May 27 02:47:53.389546 waagent[2110]: 2025-05-27T02:47:53.388943Z INFO MonitorHandler ExtHandler Network interfaces: May 27 02:47:53.389546 waagent[2110]: Executing ['ip', '-a', '-o', 'link']: May 27 02:47:53.389546 waagent[2110]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 May 27 02:47:53.389546 waagent[2110]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:c5:35:11 brd ff:ff:ff:ff:ff:ff May 27 02:47:53.389546 waagent[2110]: 3: enP3984s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:c5:35:11 brd ff:ff:ff:ff:ff:ff\ altname enP3984p0s2 May 27 02:47:53.389546 waagent[2110]: Executing ['ip', '-4', '-a', '-o', 'address']: May 27 02:47:53.389546 waagent[2110]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever May 27 02:47:53.389546 waagent[2110]: 2: eth0 inet 10.200.20.22/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever May 27 02:47:53.389546 waagent[2110]: Executing ['ip', '-6', '-a', '-o', 'address']: May 27 02:47:53.389546 waagent[2110]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever May 27 02:47:53.389546 waagent[2110]: 2: eth0 inet6 fe80::20d:3aff:fec5:3511/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever May 27 02:47:53.389546 waagent[2110]: 3: enP3984s1 inet6 fe80::20d:3aff:fec5:3511/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever May 27 02:47:53.422396 waagent[2110]: 2025-05-27T02:47:53.417403Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: May 27 02:47:53.422396 waagent[2110]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) May 27 02:47:53.422396 waagent[2110]: pkts bytes target prot opt in out source destination May 27 02:47:53.422396 waagent[2110]: Chain FORWARD (policy ACCEPT 0 packets, 0 
bytes) May 27 02:47:53.422396 waagent[2110]: pkts bytes target prot opt in out source destination May 27 02:47:53.422396 waagent[2110]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) May 27 02:47:53.422396 waagent[2110]: pkts bytes target prot opt in out source destination May 27 02:47:53.422396 waagent[2110]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 May 27 02:47:53.422396 waagent[2110]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 May 27 02:47:53.422396 waagent[2110]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW May 27 02:47:53.425408 waagent[2110]: 2025-05-27T02:47:53.425356Z INFO EnvHandler ExtHandler Current Firewall rules: May 27 02:47:53.425408 waagent[2110]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) May 27 02:47:53.425408 waagent[2110]: pkts bytes target prot opt in out source destination May 27 02:47:53.425408 waagent[2110]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) May 27 02:47:53.425408 waagent[2110]: pkts bytes target prot opt in out source destination May 27 02:47:53.425408 waagent[2110]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) May 27 02:47:53.425408 waagent[2110]: pkts bytes target prot opt in out source destination May 27 02:47:53.425408 waagent[2110]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 May 27 02:47:53.425408 waagent[2110]: 3 356 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 May 27 02:47:53.425408 waagent[2110]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW May 27 02:47:53.425942 waagent[2110]: 2025-05-27T02:47:53.425914Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 May 27 02:47:57.342446 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 27 02:47:57.343934 systemd[1]: Started sshd@0-10.200.20.22:22-10.200.16.10:34956.service - OpenSSH per-connection server daemon (10.200.16.10:34956). May 27 02:47:57.875872 sshd[2252]: Accepted publickey for core from 10.200.16.10 port 34956 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:47:57.877629 sshd-session[2252]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:47:57.881551 systemd-logind[1862]: New session 3 of user core. May 27 02:47:57.888644 systemd[1]: Started session-3.scope - Session 3 of User core. May 27 02:47:58.283463 systemd[1]: Started sshd@1-10.200.20.22:22-10.200.16.10:34972.service - OpenSSH per-connection server daemon (10.200.16.10:34972). May 27 02:47:58.765889 sshd[2257]: Accepted publickey for core from 10.200.16.10 port 34972 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:47:58.767143 sshd-session[2257]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:47:58.771373 systemd-logind[1862]: New session 4 of user core. May 27 02:47:58.789929 systemd[1]: Started session-4.scope - Session 4 of User core. May 27 02:47:59.113392 sshd[2259]: Connection closed by 10.200.16.10 port 34972 May 27 02:47:59.113946 sshd-session[2257]: pam_unix(sshd:session): session closed for user core May 27 02:47:59.117867 systemd[1]: sshd@1-10.200.20.22:22-10.200.16.10:34972.service: Deactivated successfully. May 27 02:47:59.120008 systemd[1]: session-4.scope: Deactivated successfully. May 27 02:47:59.121394 systemd-logind[1862]: Session 4 logged out. Waiting for processes to exit. May 27 02:47:59.122751 systemd-logind[1862]: Removed session 4. 
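The three OUTPUT rules listed above are waagent's protection for the wire server endpoint: allow DNS to 168.63.129.16, allow traffic owned by root, drop other new connections to it. The earlier "Failed to get firewall packets" warning appears to come from combining --zero with -L and -n in a single call, which this nf_tables build rejects; listing and zeroing in separate calls avoids that combination. As an inspection-only sketch (the EnvHandler maintains these rules itself), with the equivalent rules spelled out for illustration:

# List the security-table OUTPUT chain with counters, then zero the
# counters in a separate call instead of combining -L with --zero.
iptables -w -t security -L OUTPUT -nxv
iptables -w -t security -Z OUTPUT

# Equivalent of the three rules shown above (illustration only).
WIRE=168.63.129.16
iptables -w -t security -A OUTPUT -d "$WIRE" -p tcp --dport 53 -j ACCEPT
iptables -w -t security -A OUTPUT -d "$WIRE" -p tcp -m owner --uid-owner 0 -j ACCEPT
iptables -w -t security -A OUTPUT -d "$WIRE" -p tcp -m conntrack --ctstate INVALID,NEW -j DROP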
May 27 02:47:59.219589 systemd[1]: Started sshd@2-10.200.20.22:22-10.200.16.10:58488.service - OpenSSH per-connection server daemon (10.200.16.10:58488). May 27 02:47:59.389553 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 27 02:47:59.391621 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:47:59.492925 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:47:59.504775 (kubelet)[2275]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 02:47:59.625174 kubelet[2275]: E0527 02:47:59.625114 2275 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 02:47:59.628186 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 02:47:59.628308 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 02:47:59.628853 systemd[1]: kubelet.service: Consumed 119ms CPU time, 106.1M memory peak. May 27 02:47:59.703808 sshd[2265]: Accepted publickey for core from 10.200.16.10 port 58488 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:47:59.704896 sshd-session[2265]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:47:59.709896 systemd-logind[1862]: New session 5 of user core. May 27 02:47:59.715640 systemd[1]: Started session-5.scope - Session 5 of User core. May 27 02:48:00.048866 sshd[2282]: Connection closed by 10.200.16.10 port 58488 May 27 02:48:00.049557 sshd-session[2265]: pam_unix(sshd:session): session closed for user core May 27 02:48:00.053098 systemd[1]: sshd@2-10.200.20.22:22-10.200.16.10:58488.service: Deactivated successfully. May 27 02:48:00.055176 systemd[1]: session-5.scope: Deactivated successfully. May 27 02:48:00.056103 systemd-logind[1862]: Session 5 logged out. Waiting for processes to exit. May 27 02:48:00.057756 systemd-logind[1862]: Removed session 5. May 27 02:48:00.130749 systemd[1]: Started sshd@3-10.200.20.22:22-10.200.16.10:58498.service - OpenSSH per-connection server daemon (10.200.16.10:58498). May 27 02:48:00.580817 sshd[2288]: Accepted publickey for core from 10.200.16.10 port 58498 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:48:00.582047 sshd-session[2288]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:48:00.586428 systemd-logind[1862]: New session 6 of user core. May 27 02:48:00.588652 systemd[1]: Started session-6.scope - Session 6 of User core. May 27 02:48:00.904182 sshd[2290]: Connection closed by 10.200.16.10 port 58498 May 27 02:48:00.903346 sshd-session[2288]: pam_unix(sshd:session): session closed for user core May 27 02:48:00.906135 systemd[1]: sshd@3-10.200.20.22:22-10.200.16.10:58498.service: Deactivated successfully. May 27 02:48:00.908722 systemd[1]: session-6.scope: Deactivated successfully. May 27 02:48:00.910026 systemd-logind[1862]: Session 6 logged out. Waiting for processes to exit. May 27 02:48:00.911298 systemd-logind[1862]: Removed session 6. May 27 02:48:00.996629 systemd[1]: Started sshd@4-10.200.20.22:22-10.200.16.10:58510.service - OpenSSH per-connection server daemon (10.200.16.10:58510). 
May 27 02:48:01.479111 sshd[2296]: Accepted publickey for core from 10.200.16.10 port 58510 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:48:01.480339 sshd-session[2296]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:48:01.484800 systemd-logind[1862]: New session 7 of user core. May 27 02:48:01.494634 systemd[1]: Started session-7.scope - Session 7 of User core. May 27 02:48:01.836719 sudo[2299]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 27 02:48:01.836956 sudo[2299]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 02:48:01.863413 sudo[2299]: pam_unix(sudo:session): session closed for user root May 27 02:48:01.942512 sshd[2298]: Connection closed by 10.200.16.10 port 58510 May 27 02:48:01.942409 sshd-session[2296]: pam_unix(sshd:session): session closed for user core May 27 02:48:01.946429 systemd[1]: sshd@4-10.200.20.22:22-10.200.16.10:58510.service: Deactivated successfully. May 27 02:48:01.947948 systemd[1]: session-7.scope: Deactivated successfully. May 27 02:48:01.948735 systemd-logind[1862]: Session 7 logged out. Waiting for processes to exit. May 27 02:48:01.950095 systemd-logind[1862]: Removed session 7. May 27 02:48:02.028753 systemd[1]: Started sshd@5-10.200.20.22:22-10.200.16.10:58518.service - OpenSSH per-connection server daemon (10.200.16.10:58518). May 27 02:48:02.512116 sshd[2305]: Accepted publickey for core from 10.200.16.10 port 58518 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:48:02.513463 sshd-session[2305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:48:02.517537 systemd-logind[1862]: New session 8 of user core. May 27 02:48:02.523647 systemd[1]: Started session-8.scope - Session 8 of User core. May 27 02:48:02.788732 sudo[2309]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 27 02:48:02.788980 sudo[2309]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 02:48:02.799296 sudo[2309]: pam_unix(sudo:session): session closed for user root May 27 02:48:02.803806 sudo[2308]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 27 02:48:02.804032 sudo[2308]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 02:48:02.811254 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 02:48:02.843824 augenrules[2331]: No rules May 27 02:48:02.845190 systemd[1]: audit-rules.service: Deactivated successfully. May 27 02:48:02.845583 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 02:48:02.847217 sudo[2308]: pam_unix(sudo:session): session closed for user root May 27 02:48:02.926564 sshd[2307]: Connection closed by 10.200.16.10 port 58518 May 27 02:48:02.926922 sshd-session[2305]: pam_unix(sshd:session): session closed for user core May 27 02:48:02.930867 systemd[1]: sshd@5-10.200.20.22:22-10.200.16.10:58518.service: Deactivated successfully. May 27 02:48:02.932615 systemd[1]: session-8.scope: Deactivated successfully. May 27 02:48:02.933301 systemd-logind[1862]: Session 8 logged out. Waiting for processes to exit. May 27 02:48:02.934807 systemd-logind[1862]: Removed session 8. May 27 02:48:03.011881 systemd[1]: Started sshd@6-10.200.20.22:22-10.200.16.10:58530.service - OpenSSH per-connection server daemon (10.200.16.10:58530). 
May 27 02:48:03.460407 sshd[2340]: Accepted publickey for core from 10.200.16.10 port 58530 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:48:03.461711 sshd-session[2340]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:48:03.466543 systemd-logind[1862]: New session 9 of user core. May 27 02:48:03.473915 systemd[1]: Started session-9.scope - Session 9 of User core. May 27 02:48:03.713769 sudo[2343]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 27 02:48:03.714005 sudo[2343]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 02:48:04.981469 systemd[1]: Starting docker.service - Docker Application Container Engine... May 27 02:48:04.994053 (dockerd)[2361]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 27 02:48:05.814316 dockerd[2361]: time="2025-05-27T02:48:05.813801596Z" level=info msg="Starting up" May 27 02:48:05.815608 dockerd[2361]: time="2025-05-27T02:48:05.815577460Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 27 02:48:05.972551 dockerd[2361]: time="2025-05-27T02:48:05.972505348Z" level=info msg="Loading containers: start." May 27 02:48:06.020499 kernel: Initializing XFRM netlink socket May 27 02:48:06.445598 systemd-networkd[1487]: docker0: Link UP May 27 02:48:06.462495 dockerd[2361]: time="2025-05-27T02:48:06.462426028Z" level=info msg="Loading containers: done." May 27 02:48:06.492440 dockerd[2361]: time="2025-05-27T02:48:06.492370772Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 27 02:48:06.492647 dockerd[2361]: time="2025-05-27T02:48:06.492510972Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 27 02:48:06.492647 dockerd[2361]: time="2025-05-27T02:48:06.492637980Z" level=info msg="Initializing buildkit" May 27 02:48:06.549703 dockerd[2361]: time="2025-05-27T02:48:06.549635772Z" level=info msg="Completed buildkit initialization" May 27 02:48:06.554715 dockerd[2361]: time="2025-05-27T02:48:06.554639444Z" level=info msg="Daemon has completed initialization" May 27 02:48:06.554860 dockerd[2361]: time="2025-05-27T02:48:06.554724484Z" level=info msg="API listen on /run/docker.sock" May 27 02:48:06.555407 systemd[1]: Started docker.service - Docker Application Container Engine. May 27 02:48:07.362923 containerd[1879]: time="2025-05-27T02:48:07.362881980Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.5\"" May 27 02:48:08.255331 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1373331162.mount: Deactivated successfully. 
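Once dockerd reports "API listen on /run/docker.sock", the overlay2 storage driver and server version it logged can be double-checked from the CLI; a quick hedged sketch, assuming the docker client is on PATH:

# Confirm the storage driver and daemon version reported in the log above.
docker info --format 'driver={{.Driver}} root={{.DockerRootDir}}'
docker version --format 'server={{.Server.Version}}'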
May 27 02:48:09.574758 containerd[1879]: time="2025-05-27T02:48:09.574693900Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:09.578761 containerd[1879]: time="2025-05-27T02:48:09.578703460Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.5: active requests=0, bytes read=26326311" May 27 02:48:09.583967 containerd[1879]: time="2025-05-27T02:48:09.583898772Z" level=info msg="ImageCreate event name:\"sha256:42968274c3d27c41cdc146f5442f122c1c74960e299c13e2f348d2fe835a9134\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:09.588503 containerd[1879]: time="2025-05-27T02:48:09.587936772Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0bee1bf751fe06009678c0cde7545443ba3a8d2edf71cea4c69cbb5774b9bf47\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:09.588503 containerd[1879]: time="2025-05-27T02:48:09.588434004Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.5\" with image id \"sha256:42968274c3d27c41cdc146f5442f122c1c74960e299c13e2f348d2fe835a9134\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0bee1bf751fe06009678c0cde7545443ba3a8d2edf71cea4c69cbb5774b9bf47\", size \"26323111\" in 2.225511448s" May 27 02:48:09.588679 containerd[1879]: time="2025-05-27T02:48:09.588466676Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.5\" returns image reference \"sha256:42968274c3d27c41cdc146f5442f122c1c74960e299c13e2f348d2fe835a9134\"" May 27 02:48:09.589730 containerd[1879]: time="2025-05-27T02:48:09.589699860Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.5\"" May 27 02:48:09.639514 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 27 02:48:09.641667 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:48:09.743065 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:48:09.756241 (kubelet)[2623]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 02:48:09.875075 kubelet[2623]: E0527 02:48:09.874920 2623 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 02:48:09.877729 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 02:48:09.877852 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 02:48:09.878139 systemd[1]: kubelet.service: Consumed 116ms CPU time, 107.3M memory peak. 
May 27 02:48:11.352235 containerd[1879]: time="2025-05-27T02:48:11.352159636Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:11.356948 containerd[1879]: time="2025-05-27T02:48:11.356882236Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.5: active requests=0, bytes read=22530547" May 27 02:48:11.360992 containerd[1879]: time="2025-05-27T02:48:11.360912500Z" level=info msg="ImageCreate event name:\"sha256:82042044d6ea1f1e5afda9c7351883800adbde447314786c4e5a2fd9e42aab09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:11.374843 containerd[1879]: time="2025-05-27T02:48:11.374745948Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:79bcf2f5e614c336c02dcea9dfcdf485d7297aed6a21239a99c87f7164f9baca\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:11.375616 containerd[1879]: time="2025-05-27T02:48:11.375471996Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.5\" with image id \"sha256:82042044d6ea1f1e5afda9c7351883800adbde447314786c4e5a2fd9e42aab09\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:79bcf2f5e614c336c02dcea9dfcdf485d7297aed6a21239a99c87f7164f9baca\", size \"24066313\" in 1.785739368s" May 27 02:48:11.375616 containerd[1879]: time="2025-05-27T02:48:11.375523772Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.5\" returns image reference \"sha256:82042044d6ea1f1e5afda9c7351883800adbde447314786c4e5a2fd9e42aab09\"" May 27 02:48:11.376210 containerd[1879]: time="2025-05-27T02:48:11.376111876Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.5\"" May 27 02:48:11.889919 chronyd[1848]: Selected source PHC0 May 27 02:48:12.597219 containerd[1879]: time="2025-05-27T02:48:12.597155862Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:12.600222 containerd[1879]: time="2025-05-27T02:48:12.599995317Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.5: active requests=0, bytes read=17484190" May 27 02:48:12.605462 containerd[1879]: time="2025-05-27T02:48:12.605402744Z" level=info msg="ImageCreate event name:\"sha256:e149336437f90109dad736c8a42e4b73c137a66579be8f3b9a456bcc62af3f9b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:12.614759 containerd[1879]: time="2025-05-27T02:48:12.614676367Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f0f39d8b9808c407cacb3a46a5a9ce4d4a4a7cf3b674ba4bd221f5bc90051d2a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:12.616165 containerd[1879]: time="2025-05-27T02:48:12.615881976Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.5\" with image id \"sha256:e149336437f90109dad736c8a42e4b73c137a66579be8f3b9a456bcc62af3f9b\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f0f39d8b9808c407cacb3a46a5a9ce4d4a4a7cf3b674ba4bd221f5bc90051d2a\", size \"19019974\" in 1.239738015s" May 27 02:48:12.616165 containerd[1879]: time="2025-05-27T02:48:12.616037265Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.5\" returns image reference 
\"sha256:e149336437f90109dad736c8a42e4b73c137a66579be8f3b9a456bcc62af3f9b\"" May 27 02:48:12.617133 containerd[1879]: time="2025-05-27T02:48:12.616665604Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.5\"" May 27 02:48:14.267734 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2806120065.mount: Deactivated successfully. May 27 02:48:14.586612 containerd[1879]: time="2025-05-27T02:48:14.586054742Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:14.590069 containerd[1879]: time="2025-05-27T02:48:14.590005701Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.5: active requests=0, bytes read=27377375" May 27 02:48:14.594545 containerd[1879]: time="2025-05-27T02:48:14.594460274Z" level=info msg="ImageCreate event name:\"sha256:69b7afc06f22edcae3b6a7d80cdacb488a5415fd605e89534679e5ebc41375fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:14.598547 containerd[1879]: time="2025-05-27T02:48:14.598494370Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:9dc6553459c3319525ba4090a780db1a133d5dee68c08e07f9b9d6ba83b42a0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:14.599020 containerd[1879]: time="2025-05-27T02:48:14.598965992Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.5\" with image id \"sha256:69b7afc06f22edcae3b6a7d80cdacb488a5415fd605e89534679e5ebc41375fc\", repo tag \"registry.k8s.io/kube-proxy:v1.32.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:9dc6553459c3319525ba4090a780db1a133d5dee68c08e07f9b9d6ba83b42a0b\", size \"27376394\" in 1.982252121s" May 27 02:48:14.599020 containerd[1879]: time="2025-05-27T02:48:14.599001993Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.5\" returns image reference \"sha256:69b7afc06f22edcae3b6a7d80cdacb488a5415fd605e89534679e5ebc41375fc\"" May 27 02:48:14.599630 containerd[1879]: time="2025-05-27T02:48:14.599597009Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" May 27 02:48:15.276039 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount814070256.mount: Deactivated successfully. 
May 27 02:48:16.351504 containerd[1879]: time="2025-05-27T02:48:16.351342067Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:16.355125 containerd[1879]: time="2025-05-27T02:48:16.355075323Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951622" May 27 02:48:16.358463 containerd[1879]: time="2025-05-27T02:48:16.358402544Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:16.365515 containerd[1879]: time="2025-05-27T02:48:16.365404236Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:16.366064 containerd[1879]: time="2025-05-27T02:48:16.366032814Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.766402652s" May 27 02:48:16.366165 containerd[1879]: time="2025-05-27T02:48:16.366154081Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" May 27 02:48:16.367052 containerd[1879]: time="2025-05-27T02:48:16.366993401Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 27 02:48:16.978890 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2908050663.mount: Deactivated successfully. 
May 27 02:48:17.014543 containerd[1879]: time="2025-05-27T02:48:17.014206866Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 02:48:17.017001 containerd[1879]: time="2025-05-27T02:48:17.016943190Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" May 27 02:48:17.024001 containerd[1879]: time="2025-05-27T02:48:17.023916058Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 02:48:17.031510 containerd[1879]: time="2025-05-27T02:48:17.031429260Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 02:48:17.032075 containerd[1879]: time="2025-05-27T02:48:17.031935322Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 664.759236ms" May 27 02:48:17.032075 containerd[1879]: time="2025-05-27T02:48:17.031974363Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" May 27 02:48:17.032421 containerd[1879]: time="2025-05-27T02:48:17.032396623Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" May 27 02:48:17.753548 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1718580127.mount: Deactivated successfully. May 27 02:48:19.889586 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 27 02:48:19.891059 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:48:20.225438 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:48:20.230764 (kubelet)[2763]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 02:48:20.339762 kubelet[2763]: E0527 02:48:20.339684 2763 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 02:48:20.342211 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 02:48:20.342487 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 02:48:20.343036 systemd[1]: kubelet.service: Consumed 117ms CPU time, 105.2M memory peak. 
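kubelet.service is being restarted on a timer ("Scheduled restart job, restart counter is at 2", then 3 roughly ten seconds later). A sketch that measures the effective restart interval from journal text on stdin (for example journalctl -o short-precise -u kubelet.service); the short-precise timestamp prefix matches the lines above, and only deltas are used, so the missing year does not matter:

    #!/usr/bin/env python3
    """Measure the interval between kubelet.service restart schedulings.
    Assumes one journal entry per line with a short-precise timestamp prefix."""

    import re
    import sys
    from datetime import datetime

    RESTART_RE = re.compile(
        r"^(?P<ts>\w{3} +\d+ \d\d:\d\d:\d\d\.\d+) .*"
        r"kubelet\.service: Scheduled restart job, restart counter is at (?P<n>\d+)"
    )

    def main() -> None:
        prev = None
        for line in sys.stdin:
            m = RESTART_RE.match(line)
            if not m:
                continue
            ts = datetime.strptime(m["ts"], "%b %d %H:%M:%S.%f")
            if prev is not None:
                delta = (ts - prev).total_seconds()
                print(f"restart #{m['n']}: {delta:.1f}s after the previous one")
            prev = ts

    if __name__ == "__main__":
        main()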
May 27 02:48:20.868528 containerd[1879]: time="2025-05-27T02:48:20.867871219Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:20.871464 containerd[1879]: time="2025-05-27T02:48:20.871413183Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67812469" May 27 02:48:20.878392 containerd[1879]: time="2025-05-27T02:48:20.878334882Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:20.883638 containerd[1879]: time="2025-05-27T02:48:20.883574886Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:20.884214 containerd[1879]: time="2025-05-27T02:48:20.884051420Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 3.851624884s" May 27 02:48:20.884214 containerd[1879]: time="2025-05-27T02:48:20.884089077Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" May 27 02:48:23.639138 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:48:23.639261 systemd[1]: kubelet.service: Consumed 117ms CPU time, 105.2M memory peak. May 27 02:48:23.641165 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:48:23.666826 systemd[1]: Reload requested from client PID 2798 ('systemctl') (unit session-9.scope)... May 27 02:48:23.666842 systemd[1]: Reloading... May 27 02:48:23.777539 zram_generator::config[2864]: No configuration found. May 27 02:48:23.833443 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 02:48:23.918734 systemd[1]: Reloading finished in 251 ms. May 27 02:48:23.972965 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 27 02:48:23.973032 systemd[1]: kubelet.service: Failed with result 'signal'. May 27 02:48:23.973241 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:48:23.973286 systemd[1]: kubelet.service: Consumed 79ms CPU time, 95M memory peak. May 27 02:48:23.975984 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:48:24.206363 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:48:24.212837 (kubelet)[2910]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 02:48:24.334972 kubelet[2910]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 02:48:24.334972 kubelet[2910]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. 
Image garbage collector will get sandbox image information from CRI. May 27 02:48:24.334972 kubelet[2910]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 02:48:24.334972 kubelet[2910]: I0527 02:48:24.334796 2910 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 02:48:24.764797 kubelet[2910]: I0527 02:48:24.764743 2910 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" May 27 02:48:24.764797 kubelet[2910]: I0527 02:48:24.764783 2910 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 02:48:24.765028 kubelet[2910]: I0527 02:48:24.765007 2910 server.go:954] "Client rotation is on, will bootstrap in background" May 27 02:48:24.779428 kubelet[2910]: E0527 02:48:24.779374 2910 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.20.22:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.22:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:24.780154 kubelet[2910]: I0527 02:48:24.780131 2910 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 02:48:24.784731 kubelet[2910]: I0527 02:48:24.784656 2910 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 02:48:24.789148 kubelet[2910]: I0527 02:48:24.789115 2910 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 27 02:48:24.789703 kubelet[2910]: I0527 02:48:24.789655 2910 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 02:48:24.789839 kubelet[2910]: I0527 02:48:24.789705 2910 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344.0.0-a-583de22c75","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 02:48:24.789932 kubelet[2910]: I0527 02:48:24.789849 2910 topology_manager.go:138] "Creating topology manager with none policy" May 27 02:48:24.789932 kubelet[2910]: I0527 02:48:24.789857 2910 container_manager_linux.go:304] "Creating device plugin manager" May 27 02:48:24.790015 kubelet[2910]: I0527 02:48:24.790001 2910 state_mem.go:36] "Initialized new in-memory state store" May 27 02:48:24.791673 kubelet[2910]: I0527 02:48:24.791652 2910 kubelet.go:446] "Attempting to sync node with API server" May 27 02:48:24.791694 kubelet[2910]: I0527 02:48:24.791677 2910 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 02:48:24.791722 kubelet[2910]: I0527 02:48:24.791699 2910 kubelet.go:352] "Adding apiserver pod source" May 27 02:48:24.791722 kubelet[2910]: I0527 02:48:24.791709 2910 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 02:48:24.796511 kubelet[2910]: W0527 02:48:24.796372 2910 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.20.22:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused May 27 02:48:24.796511 kubelet[2910]: E0527 02:48:24.796447 2910 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.20.22:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.22:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:24.796746 kubelet[2910]: W0527 
02:48:24.796716 2910 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.22:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344.0.0-a-583de22c75&limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused May 27 02:48:24.796782 kubelet[2910]: E0527 02:48:24.796753 2910 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.20.22:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344.0.0-a-583de22c75&limit=500&resourceVersion=0\": dial tcp 10.200.20.22:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:24.797159 kubelet[2910]: I0527 02:48:24.797138 2910 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 02:48:24.797836 kubelet[2910]: I0527 02:48:24.797815 2910 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 27 02:48:24.797903 kubelet[2910]: W0527 02:48:24.797876 2910 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 27 02:48:24.798498 kubelet[2910]: I0527 02:48:24.798455 2910 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 02:48:24.798498 kubelet[2910]: I0527 02:48:24.798502 2910 server.go:1287] "Started kubelet" May 27 02:48:24.799119 kubelet[2910]: I0527 02:48:24.799090 2910 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 27 02:48:24.799914 kubelet[2910]: I0527 02:48:24.799892 2910 server.go:479] "Adding debug handlers to kubelet server" May 27 02:48:24.800558 kubelet[2910]: I0527 02:48:24.800496 2910 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 02:48:24.800811 kubelet[2910]: I0527 02:48:24.800789 2910 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 02:48:24.801119 kubelet[2910]: E0527 02:48:24.800989 2910 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.22:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.22:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4344.0.0-a-583de22c75.1843426a54515dce default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4344.0.0-a-583de22c75,UID:ci-4344.0.0-a-583de22c75,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4344.0.0-a-583de22c75,},FirstTimestamp:2025-05-27 02:48:24.798469582 +0000 UTC m=+0.582274010,LastTimestamp:2025-05-27 02:48:24.798469582 +0000 UTC m=+0.582274010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344.0.0-a-583de22c75,}" May 27 02:48:24.802985 kubelet[2910]: I0527 02:48:24.802958 2910 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 02:48:24.803582 kubelet[2910]: I0527 02:48:24.803557 2910 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 02:48:24.806444 kubelet[2910]: E0527 02:48:24.806318 2910 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344.0.0-a-583de22c75\" not found" 
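All of the reflector and event failures above share one cause: nothing is listening on https://10.200.20.22:6443 yet, so every dial is refused until the kube-apiserver static pod comes up. A sketch that polls the apiserver's /readyz endpoint until it answers; the address comes from the log, while skipping TLS verification, anonymous access to /readyz, and the retry cadence are illustrative assumptions:

    #!/usr/bin/env python3
    """Poll the apiserver /readyz endpoint until it responds.
    Address from the log; TLS skip, anonymous /readyz access and the
    backoff schedule are assumptions for illustration only."""

    import ssl
    import time
    import urllib.error
    import urllib.request

    APISERVER = "https://10.200.20.22:6443/readyz"   # address taken from the log

    def wait_for_apiserver(timeout_s: float = 300) -> bool:
        ctx = ssl.create_default_context()
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE          # sketch only; do not do this in production
        deadline = time.monotonic() + timeout_s
        delay = 0.2
        while time.monotonic() < deadline:
            try:
                with urllib.request.urlopen(APISERVER, context=ctx, timeout=5) as resp:
                    if resp.status == 200:
                        return True
            except (urllib.error.URLError, OSError):
                pass
            time.sleep(delay)
            delay = min(delay * 2, 10)           # doubling backoff, capped
        return False

    if __name__ == "__main__":
        print("apiserver ready" if wait_for_apiserver() else "apiserver still unreachable")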
May 27 02:48:24.806600 kubelet[2910]: I0527 02:48:24.806584 2910 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 02:48:24.806836 kubelet[2910]: I0527 02:48:24.806817 2910 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 02:48:24.806957 kubelet[2910]: I0527 02:48:24.806948 2910 reconciler.go:26] "Reconciler: start to sync state" May 27 02:48:24.807399 kubelet[2910]: W0527 02:48:24.807364 2910 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.20.22:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused May 27 02:48:24.807529 kubelet[2910]: E0527 02:48:24.807513 2910 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.20.22:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.22:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:24.808252 kubelet[2910]: E0527 02:48:24.808194 2910 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 02:48:24.809717 kubelet[2910]: E0527 02:48:24.809662 2910 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.22:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-a-583de22c75?timeout=10s\": dial tcp 10.200.20.22:6443: connect: connection refused" interval="200ms" May 27 02:48:24.810248 kubelet[2910]: I0527 02:48:24.810207 2910 factory.go:221] Registration of the containerd container factory successfully May 27 02:48:24.810248 kubelet[2910]: I0527 02:48:24.810224 2910 factory.go:221] Registration of the systemd container factory successfully May 27 02:48:24.810438 kubelet[2910]: I0527 02:48:24.810418 2910 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 02:48:24.840012 kubelet[2910]: I0527 02:48:24.839715 2910 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 27 02:48:24.841192 kubelet[2910]: I0527 02:48:24.840622 2910 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 02:48:24.841192 kubelet[2910]: I0527 02:48:24.840640 2910 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 02:48:24.841192 kubelet[2910]: I0527 02:48:24.840663 2910 state_mem.go:36] "Initialized new in-memory state store" May 27 02:48:24.842603 kubelet[2910]: I0527 02:48:24.842567 2910 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 27 02:48:24.845904 kubelet[2910]: I0527 02:48:24.842785 2910 status_manager.go:227] "Starting to sync pod status with apiserver" May 27 02:48:24.845904 kubelet[2910]: I0527 02:48:24.842820 2910 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
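The lease controller retries on a doubling interval; the log shows interval="200ms" here and "400ms", "800ms", "1.6s" on the later attempts. A tiny sketch of that exponential backoff; only the 200 ms base and the doubling are taken from the log, the 7 s cap is an assumption:

    #!/usr/bin/env python3
    """Exponential backoff matching the lease-retry intervals seen above.
    Base and doubling come from the log; the cap is an assumption."""

    from typing import Iterator

    def lease_backoff(base_s: float = 0.2, factor: float = 2.0, cap_s: float = 7.0) -> Iterator[float]:
        delay = base_s
        while True:
            yield delay
            delay = min(delay * factor, cap_s)

    if __name__ == "__main__":
        gen = lease_backoff()
        # prints ['0.2s', '0.4s', '0.8s', '1.6s', '3.2s', '6.4s']
        print([f"{next(gen):.1f}s" for _ in range(6)])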
May 27 02:48:24.845904 kubelet[2910]: I0527 02:48:24.842834 2910 kubelet.go:2382] "Starting kubelet main sync loop" May 27 02:48:24.845904 kubelet[2910]: E0527 02:48:24.842871 2910 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 02:48:24.845904 kubelet[2910]: W0527 02:48:24.844257 2910 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.22:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused May 27 02:48:24.845904 kubelet[2910]: E0527 02:48:24.844291 2910 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.20.22:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.22:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:24.847409 kubelet[2910]: I0527 02:48:24.847379 2910 policy_none.go:49] "None policy: Start" May 27 02:48:24.847409 kubelet[2910]: I0527 02:48:24.847407 2910 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 02:48:24.847409 kubelet[2910]: I0527 02:48:24.847419 2910 state_mem.go:35] "Initializing new in-memory state store" May 27 02:48:24.858730 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 27 02:48:24.871740 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 27 02:48:24.881465 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 27 02:48:24.883670 kubelet[2910]: I0527 02:48:24.883643 2910 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 27 02:48:24.884002 kubelet[2910]: I0527 02:48:24.883984 2910 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 02:48:24.884105 kubelet[2910]: I0527 02:48:24.884073 2910 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 02:48:24.884961 kubelet[2910]: I0527 02:48:24.884919 2910 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 02:48:24.885628 kubelet[2910]: E0527 02:48:24.885524 2910 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 27 02:48:24.885628 kubelet[2910]: E0527 02:48:24.885572 2910 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4344.0.0-a-583de22c75\" not found" May 27 02:48:24.954304 systemd[1]: Created slice kubepods-burstable-podb820e7e747c743f008459849cb959226.slice - libcontainer container kubepods-burstable-podb820e7e747c743f008459849cb959226.slice. May 27 02:48:24.964215 kubelet[2910]: E0527 02:48:24.964124 2910 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-583de22c75\" not found" node="ci-4344.0.0-a-583de22c75" May 27 02:48:24.964867 systemd[1]: Created slice kubepods-burstable-pod81dd4481c0f6609d3c5d04c3b3ce8709.slice - libcontainer container kubepods-burstable-pod81dd4481c0f6609d3c5d04c3b3ce8709.slice. 
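With cgroupDriver="systemd" (see the NodeConfig above) each pod is placed in a slice named after its QoS class and UID, e.g. kubepods-burstable-podb820e7e747c743f008459849cb959226.slice. A sketch that derives the slice name and cgroupfs path from a pod UID; replacing "-" with "_" in the UID follows the systemd driver's naming convention (the static-pod UIDs in this log happen to contain no dashes), and the /sys/fs/cgroup mount point is assumed:

    #!/usr/bin/env python3
    """Derive the systemd slice / cgroup path for a pod UID and QoS class.
    Dash-to-underscore mapping and the cgroup mount point are assumptions."""

    def pod_slice(uid: str, qos: str = "burstable") -> str:
        uid = uid.replace("-", "_")
        if qos == "guaranteed":                  # guaranteed pods sit directly under kubepods.slice
            return f"kubepods-pod{uid}.slice"
        return f"kubepods-{qos}-pod{uid}.slice"

    def pod_cgroup_path(uid: str, qos: str = "burstable") -> str:
        parts = ["/sys/fs/cgroup", "kubepods.slice"]
        if qos != "guaranteed":
            parts.append(f"kubepods-{qos}.slice")
        parts.append(pod_slice(uid, qos))
        return "/".join(parts)

    if __name__ == "__main__":
        # UID of the kube-scheduler static pod from the log above.
        print(pod_cgroup_path("b820e7e747c743f008459849cb959226"))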
May 27 02:48:24.967504 kubelet[2910]: E0527 02:48:24.967450 2910 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-583de22c75\" not found" node="ci-4344.0.0-a-583de22c75" May 27 02:48:24.970408 systemd[1]: Created slice kubepods-burstable-podf5b3c1c02744d6d8865ece708e61218b.slice - libcontainer container kubepods-burstable-podf5b3c1c02744d6d8865ece708e61218b.slice. May 27 02:48:24.972085 kubelet[2910]: E0527 02:48:24.972060 2910 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-583de22c75\" not found" node="ci-4344.0.0-a-583de22c75" May 27 02:48:24.987135 kubelet[2910]: I0527 02:48:24.986794 2910 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-583de22c75" May 27 02:48:24.987376 kubelet[2910]: E0527 02:48:24.987349 2910 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.22:6443/api/v1/nodes\": dial tcp 10.200.20.22:6443: connect: connection refused" node="ci-4344.0.0-a-583de22c75" May 27 02:48:25.008611 kubelet[2910]: I0527 02:48:25.008545 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/81dd4481c0f6609d3c5d04c3b3ce8709-ca-certs\") pod \"kube-apiserver-ci-4344.0.0-a-583de22c75\" (UID: \"81dd4481c0f6609d3c5d04c3b3ce8709\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-583de22c75" May 27 02:48:25.008753 kubelet[2910]: I0527 02:48:25.008740 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/81dd4481c0f6609d3c5d04c3b3ce8709-k8s-certs\") pod \"kube-apiserver-ci-4344.0.0-a-583de22c75\" (UID: \"81dd4481c0f6609d3c5d04c3b3ce8709\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-583de22c75" May 27 02:48:25.008931 kubelet[2910]: I0527 02:48:25.008876 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f5b3c1c02744d6d8865ece708e61218b-k8s-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-583de22c75\" (UID: \"f5b3c1c02744d6d8865ece708e61218b\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" May 27 02:48:25.008931 kubelet[2910]: I0527 02:48:25.008896 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b820e7e747c743f008459849cb959226-kubeconfig\") pod \"kube-scheduler-ci-4344.0.0-a-583de22c75\" (UID: \"b820e7e747c743f008459849cb959226\") " pod="kube-system/kube-scheduler-ci-4344.0.0-a-583de22c75" May 27 02:48:25.008931 kubelet[2910]: I0527 02:48:25.008914 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/81dd4481c0f6609d3c5d04c3b3ce8709-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344.0.0-a-583de22c75\" (UID: \"81dd4481c0f6609d3c5d04c3b3ce8709\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-583de22c75" May 27 02:48:25.009087 kubelet[2910]: I0527 02:48:25.009061 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f5b3c1c02744d6d8865ece708e61218b-ca-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-583de22c75\" (UID: 
\"f5b3c1c02744d6d8865ece708e61218b\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" May 27 02:48:25.009211 kubelet[2910]: I0527 02:48:25.009167 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f5b3c1c02744d6d8865ece708e61218b-flexvolume-dir\") pod \"kube-controller-manager-ci-4344.0.0-a-583de22c75\" (UID: \"f5b3c1c02744d6d8865ece708e61218b\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" May 27 02:48:25.009211 kubelet[2910]: I0527 02:48:25.009187 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f5b3c1c02744d6d8865ece708e61218b-kubeconfig\") pod \"kube-controller-manager-ci-4344.0.0-a-583de22c75\" (UID: \"f5b3c1c02744d6d8865ece708e61218b\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" May 27 02:48:25.009211 kubelet[2910]: I0527 02:48:25.009198 2910 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f5b3c1c02744d6d8865ece708e61218b-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344.0.0-a-583de22c75\" (UID: \"f5b3c1c02744d6d8865ece708e61218b\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" May 27 02:48:25.010958 kubelet[2910]: E0527 02:48:25.010927 2910 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.22:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-a-583de22c75?timeout=10s\": dial tcp 10.200.20.22:6443: connect: connection refused" interval="400ms" May 27 02:48:25.190021 kubelet[2910]: I0527 02:48:25.189989 2910 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-583de22c75" May 27 02:48:25.190416 kubelet[2910]: E0527 02:48:25.190388 2910 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.22:6443/api/v1/nodes\": dial tcp 10.200.20.22:6443: connect: connection refused" node="ci-4344.0.0-a-583de22c75" May 27 02:48:25.265893 containerd[1879]: time="2025-05-27T02:48:25.265789819Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344.0.0-a-583de22c75,Uid:b820e7e747c743f008459849cb959226,Namespace:kube-system,Attempt:0,}" May 27 02:48:25.269527 containerd[1879]: time="2025-05-27T02:48:25.269467867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344.0.0-a-583de22c75,Uid:81dd4481c0f6609d3c5d04c3b3ce8709,Namespace:kube-system,Attempt:0,}" May 27 02:48:25.273262 containerd[1879]: time="2025-05-27T02:48:25.273194020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344.0.0-a-583de22c75,Uid:f5b3c1c02744d6d8865ece708e61218b,Namespace:kube-system,Attempt:0,}" May 27 02:48:25.411775 kubelet[2910]: E0527 02:48:25.411728 2910 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.22:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-a-583de22c75?timeout=10s\": dial tcp 10.200.20.22:6443: connect: connection refused" interval="800ms" May 27 02:48:25.593439 kubelet[2910]: I0527 02:48:25.593023 2910 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-583de22c75" May 27 02:48:25.593439 kubelet[2910]: E0527 02:48:25.593411 2910 
kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.22:6443/api/v1/nodes\": dial tcp 10.200.20.22:6443: connect: connection refused" node="ci-4344.0.0-a-583de22c75" May 27 02:48:25.630752 kubelet[2910]: W0527 02:48:25.630693 2910 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.20.22:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344.0.0-a-583de22c75&limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused May 27 02:48:25.630875 kubelet[2910]: E0527 02:48:25.630760 2910 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.20.22:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344.0.0-a-583de22c75&limit=500&resourceVersion=0\": dial tcp 10.200.20.22:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:25.789147 kubelet[2910]: W0527 02:48:25.789092 2910 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.20.22:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused May 27 02:48:25.789255 kubelet[2910]: E0527 02:48:25.789157 2910 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.20.22:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.22:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:25.944206 kubelet[2910]: W0527 02:48:25.944144 2910 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.20.22:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused May 27 02:48:25.944206 kubelet[2910]: E0527 02:48:25.944212 2910 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.20.22:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.22:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:25.985014 kubelet[2910]: W0527 02:48:25.984979 2910 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.20.22:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.20.22:6443: connect: connection refused May 27 02:48:25.985119 kubelet[2910]: E0527 02:48:25.985023 2910 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.20.22:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.22:6443: connect: connection refused" logger="UnhandledError" May 27 02:48:26.212654 kubelet[2910]: E0527 02:48:26.212520 2910 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.22:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-a-583de22c75?timeout=10s\": dial tcp 10.200.20.22:6443: connect: connection refused" interval="1.6s" May 27 02:48:26.290580 containerd[1879]: time="2025-05-27T02:48:26.290521499Z" level=info msg="connecting to shim 
59a2a6ba8e970245adb2f01267b97f7f7371859e901a8625d8364f9595d2c186" address="unix:///run/containerd/s/a908b9cab1c841e603bb19d484533400a0d0ccc9850a2df9389ec5e0caec7300" namespace=k8s.io protocol=ttrpc version=3 May 27 02:48:26.298226 containerd[1879]: time="2025-05-27T02:48:26.298050928Z" level=info msg="connecting to shim c33f9bfe7e7e50a9d4e91d6f6ffa01895281b29254383820659b6702eb4574be" address="unix:///run/containerd/s/297a926ca13f6abaab8961df0c7fd9b32592bbf712e2bd62c9a396a540451693" namespace=k8s.io protocol=ttrpc version=3 May 27 02:48:26.319857 systemd[1]: Started cri-containerd-c33f9bfe7e7e50a9d4e91d6f6ffa01895281b29254383820659b6702eb4574be.scope - libcontainer container c33f9bfe7e7e50a9d4e91d6f6ffa01895281b29254383820659b6702eb4574be. May 27 02:48:26.320237 containerd[1879]: time="2025-05-27T02:48:26.319915913Z" level=info msg="connecting to shim a407506d360f732e8bbc3f98ac909d06d2fe17dece17e748655713c11b5cbfe2" address="unix:///run/containerd/s/16d4c36cb7fa5b0b99dc1b95a213136999d0183f1547e8fca334248fadf639df" namespace=k8s.io protocol=ttrpc version=3 May 27 02:48:26.326916 systemd[1]: Started cri-containerd-59a2a6ba8e970245adb2f01267b97f7f7371859e901a8625d8364f9595d2c186.scope - libcontainer container 59a2a6ba8e970245adb2f01267b97f7f7371859e901a8625d8364f9595d2c186. May 27 02:48:26.359680 systemd[1]: Started cri-containerd-a407506d360f732e8bbc3f98ac909d06d2fe17dece17e748655713c11b5cbfe2.scope - libcontainer container a407506d360f732e8bbc3f98ac909d06d2fe17dece17e748655713c11b5cbfe2. May 27 02:48:26.387693 containerd[1879]: time="2025-05-27T02:48:26.387640624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344.0.0-a-583de22c75,Uid:81dd4481c0f6609d3c5d04c3b3ce8709,Namespace:kube-system,Attempt:0,} returns sandbox id \"59a2a6ba8e970245adb2f01267b97f7f7371859e901a8625d8364f9595d2c186\"" May 27 02:48:26.392724 containerd[1879]: time="2025-05-27T02:48:26.392625949Z" level=info msg="CreateContainer within sandbox \"59a2a6ba8e970245adb2f01267b97f7f7371859e901a8625d8364f9595d2c186\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 27 02:48:26.393271 containerd[1879]: time="2025-05-27T02:48:26.393238414Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344.0.0-a-583de22c75,Uid:b820e7e747c743f008459849cb959226,Namespace:kube-system,Attempt:0,} returns sandbox id \"c33f9bfe7e7e50a9d4e91d6f6ffa01895281b29254383820659b6702eb4574be\"" May 27 02:48:26.396678 kubelet[2910]: I0527 02:48:26.396429 2910 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-583de22c75" May 27 02:48:26.397625 kubelet[2910]: E0527 02:48:26.397588 2910 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.22:6443/api/v1/nodes\": dial tcp 10.200.20.22:6443: connect: connection refused" node="ci-4344.0.0-a-583de22c75" May 27 02:48:26.398033 containerd[1879]: time="2025-05-27T02:48:26.397997717Z" level=info msg="CreateContainer within sandbox \"c33f9bfe7e7e50a9d4e91d6f6ffa01895281b29254383820659b6702eb4574be\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 27 02:48:26.433708 containerd[1879]: time="2025-05-27T02:48:26.433665179Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344.0.0-a-583de22c75,Uid:f5b3c1c02744d6d8865ece708e61218b,Namespace:kube-system,Attempt:0,} returns sandbox id \"a407506d360f732e8bbc3f98ac909d06d2fe17dece17e748655713c11b5cbfe2\"" May 27 02:48:26.436313 containerd[1879]: 
time="2025-05-27T02:48:26.436270197Z" level=info msg="CreateContainer within sandbox \"a407506d360f732e8bbc3f98ac909d06d2fe17dece17e748655713c11b5cbfe2\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 27 02:48:26.443685 containerd[1879]: time="2025-05-27T02:48:26.443639253Z" level=info msg="Container 5f29d4aee5f95cc786fe0e7f0a69b53363a2b0bed0d30ea74f889271c6a0fd8b: CDI devices from CRI Config.CDIDevices: []" May 27 02:48:26.457316 containerd[1879]: time="2025-05-27T02:48:26.457201428Z" level=info msg="Container 7f243aff205ee5fdadefa412f98edd5255163c5d8813e52453c079eca16ba358: CDI devices from CRI Config.CDIDevices: []" May 27 02:48:26.507363 containerd[1879]: time="2025-05-27T02:48:26.507116524Z" level=info msg="CreateContainer within sandbox \"c33f9bfe7e7e50a9d4e91d6f6ffa01895281b29254383820659b6702eb4574be\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"5f29d4aee5f95cc786fe0e7f0a69b53363a2b0bed0d30ea74f889271c6a0fd8b\"" May 27 02:48:26.510015 containerd[1879]: time="2025-05-27T02:48:26.509936924Z" level=info msg="StartContainer for \"5f29d4aee5f95cc786fe0e7f0a69b53363a2b0bed0d30ea74f889271c6a0fd8b\"" May 27 02:48:26.513395 containerd[1879]: time="2025-05-27T02:48:26.513267098Z" level=info msg="CreateContainer within sandbox \"59a2a6ba8e970245adb2f01267b97f7f7371859e901a8625d8364f9595d2c186\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"7f243aff205ee5fdadefa412f98edd5255163c5d8813e52453c079eca16ba358\"" May 27 02:48:26.516201 containerd[1879]: time="2025-05-27T02:48:26.515364621Z" level=info msg="StartContainer for \"7f243aff205ee5fdadefa412f98edd5255163c5d8813e52453c079eca16ba358\"" May 27 02:48:26.516201 containerd[1879]: time="2025-05-27T02:48:26.516082689Z" level=info msg="connecting to shim 5f29d4aee5f95cc786fe0e7f0a69b53363a2b0bed0d30ea74f889271c6a0fd8b" address="unix:///run/containerd/s/297a926ca13f6abaab8961df0c7fd9b32592bbf712e2bd62c9a396a540451693" protocol=ttrpc version=3 May 27 02:48:26.517172 containerd[1879]: time="2025-05-27T02:48:26.517137599Z" level=info msg="connecting to shim 7f243aff205ee5fdadefa412f98edd5255163c5d8813e52453c079eca16ba358" address="unix:///run/containerd/s/a908b9cab1c841e603bb19d484533400a0d0ccc9850a2df9389ec5e0caec7300" protocol=ttrpc version=3 May 27 02:48:26.529374 containerd[1879]: time="2025-05-27T02:48:26.529288766Z" level=info msg="Container 7bc12eec1e9092d001b64f89ee1adb3f5e13264ac35dffbe160ccceb9ae51e89: CDI devices from CRI Config.CDIDevices: []" May 27 02:48:26.538675 systemd[1]: Started cri-containerd-5f29d4aee5f95cc786fe0e7f0a69b53363a2b0bed0d30ea74f889271c6a0fd8b.scope - libcontainer container 5f29d4aee5f95cc786fe0e7f0a69b53363a2b0bed0d30ea74f889271c6a0fd8b. May 27 02:48:26.540369 systemd[1]: Started cri-containerd-7f243aff205ee5fdadefa412f98edd5255163c5d8813e52453c079eca16ba358.scope - libcontainer container 7f243aff205ee5fdadefa412f98edd5255163c5d8813e52453c079eca16ba358. 
May 27 02:48:26.558918 containerd[1879]: time="2025-05-27T02:48:26.558869721Z" level=info msg="CreateContainer within sandbox \"a407506d360f732e8bbc3f98ac909d06d2fe17dece17e748655713c11b5cbfe2\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"7bc12eec1e9092d001b64f89ee1adb3f5e13264ac35dffbe160ccceb9ae51e89\"" May 27 02:48:26.560123 containerd[1879]: time="2025-05-27T02:48:26.560054754Z" level=info msg="StartContainer for \"7bc12eec1e9092d001b64f89ee1adb3f5e13264ac35dffbe160ccceb9ae51e89\"" May 27 02:48:26.561350 containerd[1879]: time="2025-05-27T02:48:26.561289645Z" level=info msg="connecting to shim 7bc12eec1e9092d001b64f89ee1adb3f5e13264ac35dffbe160ccceb9ae51e89" address="unix:///run/containerd/s/16d4c36cb7fa5b0b99dc1b95a213136999d0183f1547e8fca334248fadf639df" protocol=ttrpc version=3 May 27 02:48:26.583837 systemd[1]: Started cri-containerd-7bc12eec1e9092d001b64f89ee1adb3f5e13264ac35dffbe160ccceb9ae51e89.scope - libcontainer container 7bc12eec1e9092d001b64f89ee1adb3f5e13264ac35dffbe160ccceb9ae51e89. May 27 02:48:26.611890 containerd[1879]: time="2025-05-27T02:48:26.611849688Z" level=info msg="StartContainer for \"5f29d4aee5f95cc786fe0e7f0a69b53363a2b0bed0d30ea74f889271c6a0fd8b\" returns successfully" May 27 02:48:26.613595 containerd[1879]: time="2025-05-27T02:48:26.613498078Z" level=info msg="StartContainer for \"7f243aff205ee5fdadefa412f98edd5255163c5d8813e52453c079eca16ba358\" returns successfully" May 27 02:48:26.670153 containerd[1879]: time="2025-05-27T02:48:26.670111652Z" level=info msg="StartContainer for \"7bc12eec1e9092d001b64f89ee1adb3f5e13264ac35dffbe160ccceb9ae51e89\" returns successfully" May 27 02:48:26.861782 kubelet[2910]: E0527 02:48:26.861662 2910 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-583de22c75\" not found" node="ci-4344.0.0-a-583de22c75" May 27 02:48:26.871866 kubelet[2910]: E0527 02:48:26.871836 2910 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-583de22c75\" not found" node="ci-4344.0.0-a-583de22c75" May 27 02:48:26.873850 kubelet[2910]: E0527 02:48:26.873817 2910 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-583de22c75\" not found" node="ci-4344.0.0-a-583de22c75" May 27 02:48:27.817521 kubelet[2910]: E0527 02:48:27.817463 2910 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4344.0.0-a-583de22c75\" not found" node="ci-4344.0.0-a-583de22c75" May 27 02:48:27.875668 kubelet[2910]: E0527 02:48:27.875490 2910 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-583de22c75\" not found" node="ci-4344.0.0-a-583de22c75" May 27 02:48:27.875668 kubelet[2910]: E0527 02:48:27.875511 2910 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-583de22c75\" not found" node="ci-4344.0.0-a-583de22c75" May 27 02:48:27.999864 kubelet[2910]: I0527 02:48:27.999794 2910 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-583de22c75" May 27 02:48:28.012548 kubelet[2910]: I0527 02:48:28.012507 2910 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344.0.0-a-583de22c75" May 27 02:48:28.012548 kubelet[2910]: E0527 02:48:28.012547 2910 kubelet_node_status.go:548] "Error updating node 
status, will retry" err="error getting node \"ci-4344.0.0-a-583de22c75\": node \"ci-4344.0.0-a-583de22c75\" not found" May 27 02:48:28.026227 kubelet[2910]: E0527 02:48:28.026183 2910 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344.0.0-a-583de22c75\" not found" May 27 02:48:28.127293 kubelet[2910]: E0527 02:48:28.127148 2910 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344.0.0-a-583de22c75\" not found" May 27 02:48:28.228064 kubelet[2910]: E0527 02:48:28.228012 2910 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344.0.0-a-583de22c75\" not found" May 27 02:48:28.309686 kubelet[2910]: I0527 02:48:28.309640 2910 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344.0.0-a-583de22c75" May 27 02:48:28.315203 kubelet[2910]: E0527 02:48:28.315019 2910 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344.0.0-a-583de22c75\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4344.0.0-a-583de22c75" May 27 02:48:28.315203 kubelet[2910]: I0527 02:48:28.315056 2910 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344.0.0-a-583de22c75" May 27 02:48:28.317206 kubelet[2910]: E0527 02:48:28.317159 2910 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344.0.0-a-583de22c75\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4344.0.0-a-583de22c75" May 27 02:48:28.317206 kubelet[2910]: I0527 02:48:28.317194 2910 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" May 27 02:48:28.319038 kubelet[2910]: E0527 02:48:28.318997 2910 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4344.0.0-a-583de22c75\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" May 27 02:48:28.796856 kubelet[2910]: I0527 02:48:28.796815 2910 apiserver.go:52] "Watching apiserver" May 27 02:48:28.807390 kubelet[2910]: I0527 02:48:28.807322 2910 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 02:48:30.253530 systemd[1]: Reload requested from client PID 3184 ('systemctl') (unit session-9.scope)... May 27 02:48:30.253882 systemd[1]: Reloading... May 27 02:48:30.358539 zram_generator::config[3229]: No configuration found. May 27 02:48:30.435134 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 02:48:30.528905 systemd[1]: Reloading finished in 274 ms. May 27 02:48:30.559600 kubelet[2910]: I0527 02:48:30.559540 2910 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 02:48:30.560137 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:48:30.571852 systemd[1]: kubelet.service: Deactivated successfully. May 27 02:48:30.572074 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:48:30.572133 systemd[1]: kubelet.service: Consumed 791ms CPU time, 125.7M memory peak. 
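Whenever a unit stops, systemd logs its accounting, as in "kubelet.service: Consumed 791ms CPU time, 125.7M memory peak." above. A sketch that extracts those figures from journal text on stdin and normalises them to seconds and MiB; the ms/s/min and K/M/G suffixes handled here are just the forms seen in this log:

    #!/usr/bin/env python3
    """Extract per-unit CPU and memory accounting from systemd stop messages.
    Only the unit suffixes that appear in this log are handled."""

    import re
    import sys

    CONSUMED_RE = re.compile(
        r"(?P<unit>\S+\.(?:service|scope)): Consumed "
        r"(?P<cpu>[\d.]+)(?P<cpu_u>ms|s|min) CPU time, "
        r"(?P<mem>[\d.]+)(?P<mem_u>[KMG]) memory peak"
    )
    CPU_SCALE = {"ms": 1e-3, "s": 1.0, "min": 60.0}
    MEM_SCALE = {"K": 1 / 1024, "M": 1.0, "G": 1024.0}

    def main() -> None:
        for line in sys.stdin:
            m = CONSUMED_RE.search(line)
            if m:
                cpu_s = float(m["cpu"]) * CPU_SCALE[m["cpu_u"]]
                mem_mib = float(m["mem"]) * MEM_SCALE[m["mem_u"]]
                print(f"{m['unit']:<20} cpu={cpu_s:.3f}s  mem_peak={mem_mib:.1f}MiB")

    if __name__ == "__main__":
        main()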
May 27 02:48:30.574966 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 02:48:30.697402 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 02:48:30.703893 (kubelet)[3293]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 02:48:30.780938 kubelet[3293]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 02:48:30.780938 kubelet[3293]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 27 02:48:30.780938 kubelet[3293]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 02:48:30.780938 kubelet[3293]: I0527 02:48:30.780884 3293 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 02:48:30.790439 kubelet[3293]: I0527 02:48:30.789802 3293 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" May 27 02:48:30.790439 kubelet[3293]: I0527 02:48:30.789835 3293 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 02:48:30.791592 kubelet[3293]: I0527 02:48:30.791562 3293 server.go:954] "Client rotation is on, will bootstrap in background" May 27 02:48:30.792766 kubelet[3293]: I0527 02:48:30.792736 3293 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 27 02:48:30.794760 kubelet[3293]: I0527 02:48:30.794586 3293 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 02:48:30.798734 kubelet[3293]: I0527 02:48:30.798709 3293 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 02:48:30.801776 kubelet[3293]: I0527 02:48:30.801740 3293 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 27 02:48:30.802112 kubelet[3293]: I0527 02:48:30.802041 3293 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 02:48:30.802298 kubelet[3293]: I0527 02:48:30.802113 3293 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344.0.0-a-583de22c75","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 02:48:30.802356 kubelet[3293]: I0527 02:48:30.802332 3293 topology_manager.go:138] "Creating topology manager with none policy" May 27 02:48:30.802446 kubelet[3293]: I0527 02:48:30.802430 3293 container_manager_linux.go:304] "Creating device plugin manager" May 27 02:48:30.802567 kubelet[3293]: I0527 02:48:30.802555 3293 state_mem.go:36] "Initialized new in-memory state store" May 27 02:48:30.802786 kubelet[3293]: I0527 02:48:30.802770 3293 kubelet.go:446] "Attempting to sync node with API server" May 27 02:48:30.802814 kubelet[3293]: I0527 02:48:30.802790 3293 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 02:48:30.802831 kubelet[3293]: I0527 02:48:30.802814 3293 kubelet.go:352] "Adding apiserver pod source" May 27 02:48:30.802831 kubelet[3293]: I0527 02:48:30.802824 3293 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 02:48:30.805405 kubelet[3293]: I0527 02:48:30.805335 3293 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 02:48:30.805720 kubelet[3293]: I0527 02:48:30.805698 3293 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 27 02:48:30.806129 kubelet[3293]: I0527 02:48:30.806097 3293 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 02:48:30.806129 kubelet[3293]: I0527 02:48:30.806131 3293 server.go:1287] "Started kubelet" May 27 02:48:30.809515 kubelet[3293]: I0527 02:48:30.809110 3293 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 02:48:30.819941 kubelet[3293]: I0527 02:48:30.819810 3293 server.go:169] 
"Starting to listen" address="0.0.0.0" port=10250 May 27 02:48:30.821188 kubelet[3293]: I0527 02:48:30.820726 3293 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 02:48:30.822277 kubelet[3293]: I0527 02:48:30.821824 3293 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 02:48:30.822277 kubelet[3293]: I0527 02:48:30.821839 3293 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 02:48:30.822277 kubelet[3293]: I0527 02:48:30.822136 3293 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 02:48:30.823870 kubelet[3293]: I0527 02:48:30.823749 3293 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 02:48:30.824164 kubelet[3293]: I0527 02:48:30.824149 3293 reconciler.go:26] "Reconciler: start to sync state" May 27 02:48:30.828921 kubelet[3293]: I0527 02:48:30.827751 3293 server.go:479] "Adding debug handlers to kubelet server" May 27 02:48:30.834261 kubelet[3293]: I0527 02:48:30.834215 3293 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 27 02:48:30.835425 kubelet[3293]: I0527 02:48:30.835398 3293 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 27 02:48:30.835603 kubelet[3293]: I0527 02:48:30.835592 3293 status_manager.go:227] "Starting to sync pod status with apiserver" May 27 02:48:30.835683 kubelet[3293]: I0527 02:48:30.835673 3293 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." May 27 02:48:30.835727 kubelet[3293]: I0527 02:48:30.835719 3293 kubelet.go:2382] "Starting kubelet main sync loop" May 27 02:48:30.836118 kubelet[3293]: E0527 02:48:30.836003 3293 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 02:48:30.836235 kubelet[3293]: E0527 02:48:30.836214 3293 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 02:48:30.836277 kubelet[3293]: I0527 02:48:30.836255 3293 factory.go:221] Registration of the systemd container factory successfully May 27 02:48:30.836442 kubelet[3293]: I0527 02:48:30.836423 3293 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 02:48:30.841500 kubelet[3293]: I0527 02:48:30.841460 3293 factory.go:221] Registration of the containerd container factory successfully May 27 02:48:30.894971 kubelet[3293]: I0527 02:48:30.894943 3293 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 02:48:30.894971 kubelet[3293]: I0527 02:48:30.894959 3293 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 02:48:30.894971 kubelet[3293]: I0527 02:48:30.894980 3293 state_mem.go:36] "Initialized new in-memory state store" May 27 02:48:30.895228 kubelet[3293]: I0527 02:48:30.895129 3293 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 27 02:48:30.895228 kubelet[3293]: I0527 02:48:30.895138 3293 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 27 02:48:30.895228 kubelet[3293]: I0527 02:48:30.895152 3293 policy_none.go:49] "None policy: Start" May 27 02:48:30.895228 kubelet[3293]: I0527 02:48:30.895160 3293 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 02:48:30.895228 kubelet[3293]: I0527 02:48:30.895167 3293 state_mem.go:35] "Initializing new in-memory state store" May 27 02:48:30.895330 kubelet[3293]: I0527 02:48:30.895234 3293 state_mem.go:75] "Updated machine memory state" May 27 02:48:30.900008 kubelet[3293]: I0527 02:48:30.899466 3293 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 27 02:48:30.900008 kubelet[3293]: I0527 02:48:30.899695 3293 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 02:48:30.900008 kubelet[3293]: I0527 02:48:30.899708 3293 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 02:48:30.900008 kubelet[3293]: I0527 02:48:30.899947 3293 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 02:48:30.905073 kubelet[3293]: E0527 02:48:30.905049 3293 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" May 27 02:48:30.937102 kubelet[3293]: I0527 02:48:30.937053 3293 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344.0.0-a-583de22c75" May 27 02:48:30.937341 kubelet[3293]: I0527 02:48:30.937051 3293 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" May 27 02:48:30.937671 kubelet[3293]: I0527 02:48:30.937188 3293 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344.0.0-a-583de22c75" May 27 02:48:30.945165 kubelet[3293]: W0527 02:48:30.945122 3293 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 02:48:30.945439 kubelet[3293]: W0527 02:48:30.945423 3293 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 02:48:30.948036 kubelet[3293]: W0527 02:48:30.948000 3293 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 02:48:31.009292 kubelet[3293]: I0527 02:48:31.008954 3293 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-583de22c75" May 27 02:48:31.023509 kubelet[3293]: I0527 02:48:31.023455 3293 kubelet_node_status.go:124] "Node was previously registered" node="ci-4344.0.0-a-583de22c75" May 27 02:48:31.023649 kubelet[3293]: I0527 02:48:31.023576 3293 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344.0.0-a-583de22c75" May 27 02:48:31.024643 kubelet[3293]: I0527 02:48:31.024616 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b820e7e747c743f008459849cb959226-kubeconfig\") pod \"kube-scheduler-ci-4344.0.0-a-583de22c75\" (UID: \"b820e7e747c743f008459849cb959226\") " pod="kube-system/kube-scheduler-ci-4344.0.0-a-583de22c75" May 27 02:48:31.024685 kubelet[3293]: I0527 02:48:31.024644 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/81dd4481c0f6609d3c5d04c3b3ce8709-ca-certs\") pod \"kube-apiserver-ci-4344.0.0-a-583de22c75\" (UID: \"81dd4481c0f6609d3c5d04c3b3ce8709\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-583de22c75" May 27 02:48:31.024685 kubelet[3293]: I0527 02:48:31.024655 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/81dd4481c0f6609d3c5d04c3b3ce8709-k8s-certs\") pod \"kube-apiserver-ci-4344.0.0-a-583de22c75\" (UID: \"81dd4481c0f6609d3c5d04c3b3ce8709\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-583de22c75" May 27 02:48:31.024685 kubelet[3293]: I0527 02:48:31.024669 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/81dd4481c0f6609d3c5d04c3b3ce8709-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344.0.0-a-583de22c75\" (UID: \"81dd4481c0f6609d3c5d04c3b3ce8709\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-583de22c75" May 27 02:48:31.024685 kubelet[3293]: I0527 02:48:31.024681 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f5b3c1c02744d6d8865ece708e61218b-ca-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-583de22c75\" (UID: \"f5b3c1c02744d6d8865ece708e61218b\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" May 27 02:48:31.024753 kubelet[3293]: I0527 02:48:31.024692 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f5b3c1c02744d6d8865ece708e61218b-flexvolume-dir\") pod \"kube-controller-manager-ci-4344.0.0-a-583de22c75\" (UID: \"f5b3c1c02744d6d8865ece708e61218b\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" May 27 02:48:31.024753 kubelet[3293]: I0527 02:48:31.024701 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f5b3c1c02744d6d8865ece708e61218b-k8s-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-583de22c75\" (UID: \"f5b3c1c02744d6d8865ece708e61218b\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" May 27 02:48:31.024753 kubelet[3293]: I0527 02:48:31.024711 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f5b3c1c02744d6d8865ece708e61218b-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344.0.0-a-583de22c75\" (UID: \"f5b3c1c02744d6d8865ece708e61218b\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" May 27 02:48:31.024753 kubelet[3293]: I0527 02:48:31.024724 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f5b3c1c02744d6d8865ece708e61218b-kubeconfig\") pod \"kube-controller-manager-ci-4344.0.0-a-583de22c75\" (UID: \"f5b3c1c02744d6d8865ece708e61218b\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" May 27 02:48:31.514527 kernel: hv_balloon: Max. 
dynamic memory size: 4096 MB May 27 02:48:31.804011 kubelet[3293]: I0527 02:48:31.803604 3293 apiserver.go:52] "Watching apiserver" May 27 02:48:31.824506 kubelet[3293]: I0527 02:48:31.824375 3293 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 02:48:31.835291 kubelet[3293]: I0527 02:48:31.834897 3293 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4344.0.0-a-583de22c75" podStartSLOduration=1.8348759970000001 podStartE2EDuration="1.834875997s" podCreationTimestamp="2025-05-27 02:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 02:48:31.824437525 +0000 UTC m=+1.113754519" watchObservedRunningTime="2025-05-27 02:48:31.834875997 +0000 UTC m=+1.124192999" May 27 02:48:31.845270 kubelet[3293]: I0527 02:48:31.845207 3293 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-583de22c75" podStartSLOduration=1.845176594 podStartE2EDuration="1.845176594s" podCreationTimestamp="2025-05-27 02:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 02:48:31.83541109 +0000 UTC m=+1.124728084" watchObservedRunningTime="2025-05-27 02:48:31.845176594 +0000 UTC m=+1.134493588" May 27 02:48:31.845677 kubelet[3293]: I0527 02:48:31.845285 3293 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4344.0.0-a-583de22c75" podStartSLOduration=1.845280996 podStartE2EDuration="1.845280996s" podCreationTimestamp="2025-05-27 02:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 02:48:31.845098288 +0000 UTC m=+1.134415282" watchObservedRunningTime="2025-05-27 02:48:31.845280996 +0000 UTC m=+1.134598006" May 27 02:48:31.876364 kubelet[3293]: I0527 02:48:31.876329 3293 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344.0.0-a-583de22c75" May 27 02:48:31.877745 kubelet[3293]: I0527 02:48:31.877713 3293 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344.0.0-a-583de22c75" May 27 02:48:31.888792 kubelet[3293]: W0527 02:48:31.888750 3293 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 02:48:31.889866 kubelet[3293]: E0527 02:48:31.888822 3293 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344.0.0-a-583de22c75\" already exists" pod="kube-system/kube-apiserver-ci-4344.0.0-a-583de22c75" May 27 02:48:31.889866 kubelet[3293]: W0527 02:48:31.890413 3293 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 02:48:31.889866 kubelet[3293]: E0527 02:48:31.890471 3293 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344.0.0-a-583de22c75\" already exists" pod="kube-system/kube-scheduler-ci-4344.0.0-a-583de22c75" May 27 02:48:33.242964 update_engine[1864]: I20250527 02:48:33.242851 1864 update_attempter.cc:509] Updating boot flags... 
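The repeated "metadata.name ... a DNS label is recommended: [must not contain dots]" warnings above come from the node name (ci-4344.0.0-a-583de22c75) being embedded in the static pod names: it is a valid DNS subdomain but not a single RFC 1123 DNS label because of the dots. A minimal, illustrative Go check of that property (the regex approximates the upstream apimachinery validation and is not the kubelet's own code):

```go
package main

import (
	"fmt"
	"regexp"
)

// RFC 1123 DNS label: lowercase alphanumerics and '-', must start and end with an
// alphanumeric, at most 63 characters, and no dots.
var dnsLabel = regexp.MustCompile(`^[a-z0-9]([-a-z0-9]*[a-z0-9])?$`)

func main() {
	name := "ci-4344.0.0-a-583de22c75" // node name embedded in the mirror pod names above
	ok := dnsLabel.MatchString(name) && len(name) <= 63
	fmt.Printf("%q is a single DNS label: %v\n", name, ok)
	// Prints false: the dots are what trigger the kubelet's warnings.
}
```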
May 27 02:48:36.201516 kubelet[3293]: I0527 02:48:36.201398 3293 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 27 02:48:36.203083 containerd[1879]: time="2025-05-27T02:48:36.202627517Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 27 02:48:36.203990 kubelet[3293]: I0527 02:48:36.203433 3293 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 27 02:48:37.105439 systemd[1]: Created slice kubepods-besteffort-podfd5540ad_273f_4b3b_b67c_cb8df10cb1d9.slice - libcontainer container kubepods-besteffort-podfd5540ad_273f_4b3b_b67c_cb8df10cb1d9.slice. May 27 02:48:37.163159 kubelet[3293]: I0527 02:48:37.163104 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6zx9\" (UniqueName: \"kubernetes.io/projected/fd5540ad-273f-4b3b-b67c-cb8df10cb1d9-kube-api-access-p6zx9\") pod \"kube-proxy-5shcn\" (UID: \"fd5540ad-273f-4b3b-b67c-cb8df10cb1d9\") " pod="kube-system/kube-proxy-5shcn" May 27 02:48:37.163339 kubelet[3293]: I0527 02:48:37.163192 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fd5540ad-273f-4b3b-b67c-cb8df10cb1d9-xtables-lock\") pod \"kube-proxy-5shcn\" (UID: \"fd5540ad-273f-4b3b-b67c-cb8df10cb1d9\") " pod="kube-system/kube-proxy-5shcn" May 27 02:48:37.163339 kubelet[3293]: I0527 02:48:37.163206 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/fd5540ad-273f-4b3b-b67c-cb8df10cb1d9-kube-proxy\") pod \"kube-proxy-5shcn\" (UID: \"fd5540ad-273f-4b3b-b67c-cb8df10cb1d9\") " pod="kube-system/kube-proxy-5shcn" May 27 02:48:37.163339 kubelet[3293]: I0527 02:48:37.163217 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd5540ad-273f-4b3b-b67c-cb8df10cb1d9-lib-modules\") pod \"kube-proxy-5shcn\" (UID: \"fd5540ad-273f-4b3b-b67c-cb8df10cb1d9\") " pod="kube-system/kube-proxy-5shcn" May 27 02:48:37.275423 systemd[1]: Created slice kubepods-besteffort-pode9878c16_0207_480b_86df_6090e2635c8b.slice - libcontainer container kubepods-besteffort-pode9878c16_0207_480b_86df_6090e2635c8b.slice. 
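For context on the PodCIDR update logged above (originalPodCIDR="" → 192.168.0.0/24), here is a small sketch, using only the Go standard library, of parsing that prefix and the pod address space it gives this node; the variable names are made up for the example:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// newPodCIDR from the kubelet_network entry above.
	prefix, err := netip.ParsePrefix("192.168.0.0/24")
	if err != nil {
		panic(err)
	}
	hostBits := prefix.Addr().BitLen() - prefix.Bits() // 32 - 24 = 8 host bits
	fmt.Printf("pod CIDR %s => %d addresses on this node\n", prefix, 1<<hostBits)
}
```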
May 27 02:48:37.364547 kubelet[3293]: I0527 02:48:37.364399 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e9878c16-0207-480b-86df-6090e2635c8b-var-lib-calico\") pod \"tigera-operator-844669ff44-q24z2\" (UID: \"e9878c16-0207-480b-86df-6090e2635c8b\") " pod="tigera-operator/tigera-operator-844669ff44-q24z2" May 27 02:48:37.364547 kubelet[3293]: I0527 02:48:37.364448 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc8hg\" (UniqueName: \"kubernetes.io/projected/e9878c16-0207-480b-86df-6090e2635c8b-kube-api-access-zc8hg\") pod \"tigera-operator-844669ff44-q24z2\" (UID: \"e9878c16-0207-480b-86df-6090e2635c8b\") " pod="tigera-operator/tigera-operator-844669ff44-q24z2" May 27 02:48:37.415495 containerd[1879]: time="2025-05-27T02:48:37.415299867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5shcn,Uid:fd5540ad-273f-4b3b-b67c-cb8df10cb1d9,Namespace:kube-system,Attempt:0,}" May 27 02:48:37.466348 containerd[1879]: time="2025-05-27T02:48:37.466209277Z" level=info msg="connecting to shim 65a0483747cc1d4d46d92cbc875e754d221b1aa9eeb187bed95c9c48bd8ba716" address="unix:///run/containerd/s/c3ceab9e828a3c540cae08e24b6acefc405135f29345f6feab67199464fa7e62" namespace=k8s.io protocol=ttrpc version=3 May 27 02:48:37.494701 systemd[1]: Started cri-containerd-65a0483747cc1d4d46d92cbc875e754d221b1aa9eeb187bed95c9c48bd8ba716.scope - libcontainer container 65a0483747cc1d4d46d92cbc875e754d221b1aa9eeb187bed95c9c48bd8ba716. May 27 02:48:37.519830 containerd[1879]: time="2025-05-27T02:48:37.519600133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5shcn,Uid:fd5540ad-273f-4b3b-b67c-cb8df10cb1d9,Namespace:kube-system,Attempt:0,} returns sandbox id \"65a0483747cc1d4d46d92cbc875e754d221b1aa9eeb187bed95c9c48bd8ba716\"" May 27 02:48:37.524822 containerd[1879]: time="2025-05-27T02:48:37.524638433Z" level=info msg="CreateContainer within sandbox \"65a0483747cc1d4d46d92cbc875e754d221b1aa9eeb187bed95c9c48bd8ba716\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 27 02:48:37.555309 containerd[1879]: time="2025-05-27T02:48:37.555260430Z" level=info msg="Container 10c4b992b631f0b7b9593d5bbe81406c69a9178e3da2454f3d60bad6dc1f9269: CDI devices from CRI Config.CDIDevices: []" May 27 02:48:37.572671 containerd[1879]: time="2025-05-27T02:48:37.572621745Z" level=info msg="CreateContainer within sandbox \"65a0483747cc1d4d46d92cbc875e754d221b1aa9eeb187bed95c9c48bd8ba716\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"10c4b992b631f0b7b9593d5bbe81406c69a9178e3da2454f3d60bad6dc1f9269\"" May 27 02:48:37.573271 containerd[1879]: time="2025-05-27T02:48:37.573231636Z" level=info msg="StartContainer for \"10c4b992b631f0b7b9593d5bbe81406c69a9178e3da2454f3d60bad6dc1f9269\"" May 27 02:48:37.577168 containerd[1879]: time="2025-05-27T02:48:37.577123356Z" level=info msg="connecting to shim 10c4b992b631f0b7b9593d5bbe81406c69a9178e3da2454f3d60bad6dc1f9269" address="unix:///run/containerd/s/c3ceab9e828a3c540cae08e24b6acefc405135f29345f6feab67199464fa7e62" protocol=ttrpc version=3 May 27 02:48:37.580785 containerd[1879]: time="2025-05-27T02:48:37.580736181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-q24z2,Uid:e9878c16-0207-480b-86df-6090e2635c8b,Namespace:tigera-operator,Attempt:0,}" May 27 02:48:37.607939 systemd[1]: Started 
cri-containerd-10c4b992b631f0b7b9593d5bbe81406c69a9178e3da2454f3d60bad6dc1f9269.scope - libcontainer container 10c4b992b631f0b7b9593d5bbe81406c69a9178e3da2454f3d60bad6dc1f9269. May 27 02:48:37.651086 containerd[1879]: time="2025-05-27T02:48:37.650975775Z" level=info msg="connecting to shim 57d78cd204716030f8db5acbd93f82cacdc61a6f64e7bdf3d0cbb272bf9de372" address="unix:///run/containerd/s/f826ea01bf2eefd863aa88ed22c20e04eed3a4514591a3ee71995f145491e364" namespace=k8s.io protocol=ttrpc version=3 May 27 02:48:37.657057 containerd[1879]: time="2025-05-27T02:48:37.656947096Z" level=info msg="StartContainer for \"10c4b992b631f0b7b9593d5bbe81406c69a9178e3da2454f3d60bad6dc1f9269\" returns successfully" May 27 02:48:37.678702 systemd[1]: Started cri-containerd-57d78cd204716030f8db5acbd93f82cacdc61a6f64e7bdf3d0cbb272bf9de372.scope - libcontainer container 57d78cd204716030f8db5acbd93f82cacdc61a6f64e7bdf3d0cbb272bf9de372. May 27 02:48:37.722822 containerd[1879]: time="2025-05-27T02:48:37.722775145Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-q24z2,Uid:e9878c16-0207-480b-86df-6090e2635c8b,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"57d78cd204716030f8db5acbd93f82cacdc61a6f64e7bdf3d0cbb272bf9de372\"" May 27 02:48:37.727049 containerd[1879]: time="2025-05-27T02:48:37.727003372Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 27 02:48:37.913199 kubelet[3293]: I0527 02:48:37.912876 3293 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-5shcn" podStartSLOduration=0.91285692 podStartE2EDuration="912.85692ms" podCreationTimestamp="2025-05-27 02:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 02:48:37.90035162 +0000 UTC m=+7.189668614" watchObservedRunningTime="2025-05-27 02:48:37.91285692 +0000 UTC m=+7.202173914" May 27 02:48:38.286657 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1344975789.mount: Deactivated successfully. May 27 02:48:39.422877 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2025429087.mount: Deactivated successfully. 
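The "connecting to shim ... address=unix:///run/containerd/s/..." entries above refer to per-sandbox shim sockets. As a rough illustration only (containerd actually speaks ttrpc over these sockets; the scheme-stripping helper below is an assumption, not containerd's code), this sketch shows how such an address maps to a filesystem socket that can be dialed:

```go
package main

import (
	"fmt"
	"net"
	"strings"
	"time"
)

// dialShim is a made-up helper: it strips the unix:// scheme from an address like the
// ones logged above and dials the socket directly. None of the ttrpc protocol is shown.
func dialShim(address string) (net.Conn, error) {
	path := strings.TrimPrefix(address, "unix://")
	return net.DialTimeout("unix", path, 2*time.Second)
}

func main() {
	addr := "unix:///run/containerd/s/c3ceab9e828a3c540cae08e24b6acefc405135f29345f6feab67199464fa7e62"
	conn, err := dialShim(addr)
	if err != nil {
		fmt.Println("socket not reachable (expected anywhere but on this node):", err)
		return
	}
	defer conn.Close()
	fmt.Println("connected to shim socket at", addr)
}
```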
May 27 02:48:40.019146 containerd[1879]: time="2025-05-27T02:48:40.018596724Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:40.021189 containerd[1879]: time="2025-05-27T02:48:40.021139268Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=22143480" May 27 02:48:40.028788 containerd[1879]: time="2025-05-27T02:48:40.028731992Z" level=info msg="ImageCreate event name:\"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:40.033756 containerd[1879]: time="2025-05-27T02:48:40.033686203Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:40.034466 containerd[1879]: time="2025-05-27T02:48:40.034195593Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"22139475\" in 2.307152228s" May 27 02:48:40.034466 containerd[1879]: time="2025-05-27T02:48:40.034229786Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\"" May 27 02:48:40.036640 containerd[1879]: time="2025-05-27T02:48:40.036604773Z" level=info msg="CreateContainer within sandbox \"57d78cd204716030f8db5acbd93f82cacdc61a6f64e7bdf3d0cbb272bf9de372\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 27 02:48:40.071522 containerd[1879]: time="2025-05-27T02:48:40.071052729Z" level=info msg="Container 2b880251dca76fdc54b8bcfa56306ac19cf9e4f59dd7ec6a135c26ed2da3cdaa: CDI devices from CRI Config.CDIDevices: []" May 27 02:48:40.072350 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1144121018.mount: Deactivated successfully. May 27 02:48:40.087122 containerd[1879]: time="2025-05-27T02:48:40.087077306Z" level=info msg="CreateContainer within sandbox \"57d78cd204716030f8db5acbd93f82cacdc61a6f64e7bdf3d0cbb272bf9de372\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"2b880251dca76fdc54b8bcfa56306ac19cf9e4f59dd7ec6a135c26ed2da3cdaa\"" May 27 02:48:40.088355 containerd[1879]: time="2025-05-27T02:48:40.088242795Z" level=info msg="StartContainer for \"2b880251dca76fdc54b8bcfa56306ac19cf9e4f59dd7ec6a135c26ed2da3cdaa\"" May 27 02:48:40.089497 containerd[1879]: time="2025-05-27T02:48:40.089411739Z" level=info msg="connecting to shim 2b880251dca76fdc54b8bcfa56306ac19cf9e4f59dd7ec6a135c26ed2da3cdaa" address="unix:///run/containerd/s/f826ea01bf2eefd863aa88ed22c20e04eed3a4514591a3ee71995f145491e364" protocol=ttrpc version=3 May 27 02:48:40.107670 systemd[1]: Started cri-containerd-2b880251dca76fdc54b8bcfa56306ac19cf9e4f59dd7ec6a135c26ed2da3cdaa.scope - libcontainer container 2b880251dca76fdc54b8bcfa56306ac19cf9e4f59dd7ec6a135c26ed2da3cdaa. 
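The pull above reports an image size of 22139475 bytes fetched in 2.307152228s; a quick back-of-the-envelope throughput calculation on those logged figures (illustrative arithmetic only):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Figures taken from the "Pulled image quay.io/tigera/operator:v1.38.0" entry above.
	const sizeBytes = 22139475
	pullTime := 2307152228 * time.Nanosecond // 2.307152228s

	mib := float64(sizeBytes) / (1 << 20)
	fmt.Printf("%.1f MiB in %s => %.1f MiB/s\n", mib, pullTime, mib/pullTime.Seconds())
}
```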
May 27 02:48:40.138291 containerd[1879]: time="2025-05-27T02:48:40.138255659Z" level=info msg="StartContainer for \"2b880251dca76fdc54b8bcfa56306ac19cf9e4f59dd7ec6a135c26ed2da3cdaa\" returns successfully" May 27 02:48:44.455829 kubelet[3293]: I0527 02:48:44.455747 3293 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-844669ff44-q24z2" podStartSLOduration=5.145477317 podStartE2EDuration="7.455733206s" podCreationTimestamp="2025-05-27 02:48:37 +0000 UTC" firstStartedPulling="2025-05-27 02:48:37.724920708 +0000 UTC m=+7.014237702" lastFinishedPulling="2025-05-27 02:48:40.035176597 +0000 UTC m=+9.324493591" observedRunningTime="2025-05-27 02:48:40.906418355 +0000 UTC m=+10.195735357" watchObservedRunningTime="2025-05-27 02:48:44.455733206 +0000 UTC m=+13.745050208" May 27 02:48:45.386428 sudo[2343]: pam_unix(sudo:session): session closed for user root May 27 02:48:45.458513 sshd[2342]: Connection closed by 10.200.16.10 port 58530 May 27 02:48:45.459125 sshd-session[2340]: pam_unix(sshd:session): session closed for user core May 27 02:48:45.465807 systemd-logind[1862]: Session 9 logged out. Waiting for processes to exit. May 27 02:48:45.467923 systemd[1]: sshd@6-10.200.20.22:22-10.200.16.10:58530.service: Deactivated successfully. May 27 02:48:45.472325 systemd[1]: session-9.scope: Deactivated successfully. May 27 02:48:45.472793 systemd[1]: session-9.scope: Consumed 3.411s CPU time, 228.1M memory peak. May 27 02:48:45.476115 systemd-logind[1862]: Removed session 9. May 27 02:48:49.057259 systemd[1]: Created slice kubepods-besteffort-podb14a9489_3402_4fdf_bf69_e3ad8e03db6d.slice - libcontainer container kubepods-besteffort-podb14a9489_3402_4fdf_bf69_e3ad8e03db6d.slice. May 27 02:48:49.146034 kubelet[3293]: I0527 02:48:49.145827 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b14a9489-3402-4fdf-bf69-e3ad8e03db6d-tigera-ca-bundle\") pod \"calico-typha-5b55fff4dd-9sh6j\" (UID: \"b14a9489-3402-4fdf-bf69-e3ad8e03db6d\") " pod="calico-system/calico-typha-5b55fff4dd-9sh6j" May 27 02:48:49.146034 kubelet[3293]: I0527 02:48:49.145877 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b14a9489-3402-4fdf-bf69-e3ad8e03db6d-typha-certs\") pod \"calico-typha-5b55fff4dd-9sh6j\" (UID: \"b14a9489-3402-4fdf-bf69-e3ad8e03db6d\") " pod="calico-system/calico-typha-5b55fff4dd-9sh6j" May 27 02:48:49.146034 kubelet[3293]: I0527 02:48:49.145897 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkdc5\" (UniqueName: \"kubernetes.io/projected/b14a9489-3402-4fdf-bf69-e3ad8e03db6d-kube-api-access-hkdc5\") pod \"calico-typha-5b55fff4dd-9sh6j\" (UID: \"b14a9489-3402-4fdf-bf69-e3ad8e03db6d\") " pod="calico-system/calico-typha-5b55fff4dd-9sh6j" May 27 02:48:49.224158 systemd[1]: Created slice kubepods-besteffort-pod1d15249c_c3ad_4103_926e_7f1ffc66ff29.slice - libcontainer container kubepods-besteffort-pod1d15249c_c3ad_4103_926e_7f1ffc66ff29.slice. 
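The tigera-operator startup figures at the start of this stretch are self-consistent: the logged SLO duration equals the end-to-end duration minus the image-pull window (lastFinishedPulling − firstStartedPulling). A small check of that arithmetic using the timestamps quoted in the log (this only reproduces the numbers, not the kubelet's latency-tracker implementation; the monotonic "m=+..." suffixes are dropped before parsing):

```go
package main

import (
	"fmt"
	"time"
)

// layout matches how the timestamps are rendered in the log (time.Time.String()).
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	firstStartedPulling := mustParse("2025-05-27 02:48:37.724920708 +0000 UTC")
	lastFinishedPulling := mustParse("2025-05-27 02:48:40.035176597 +0000 UTC")

	e2e := 7455733206 * time.Nanosecond                        // podStartE2EDuration = 7.455733206s
	pullWindow := lastFinishedPulling.Sub(firstStartedPulling) // 2.310255889s

	fmt.Println(e2e - pullWindow) // 5.145477317s, the logged podStartSLOduration
}
```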
May 27 02:48:49.247065 kubelet[3293]: I0527 02:48:49.247013 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d15249c-c3ad-4103-926e-7f1ffc66ff29-lib-modules\") pod \"calico-node-tfnsm\" (UID: \"1d15249c-c3ad-4103-926e-7f1ffc66ff29\") " pod="calico-system/calico-node-tfnsm" May 27 02:48:49.247522 kubelet[3293]: I0527 02:48:49.247284 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/1d15249c-c3ad-4103-926e-7f1ffc66ff29-flexvol-driver-host\") pod \"calico-node-tfnsm\" (UID: \"1d15249c-c3ad-4103-926e-7f1ffc66ff29\") " pod="calico-system/calico-node-tfnsm" May 27 02:48:49.247522 kubelet[3293]: I0527 02:48:49.247304 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cft6g\" (UniqueName: \"kubernetes.io/projected/1d15249c-c3ad-4103-926e-7f1ffc66ff29-kube-api-access-cft6g\") pod \"calico-node-tfnsm\" (UID: \"1d15249c-c3ad-4103-926e-7f1ffc66ff29\") " pod="calico-system/calico-node-tfnsm" May 27 02:48:49.247522 kubelet[3293]: I0527 02:48:49.247318 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1d15249c-c3ad-4103-926e-7f1ffc66ff29-xtables-lock\") pod \"calico-node-tfnsm\" (UID: \"1d15249c-c3ad-4103-926e-7f1ffc66ff29\") " pod="calico-system/calico-node-tfnsm" May 27 02:48:49.247522 kubelet[3293]: I0527 02:48:49.247339 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d15249c-c3ad-4103-926e-7f1ffc66ff29-tigera-ca-bundle\") pod \"calico-node-tfnsm\" (UID: \"1d15249c-c3ad-4103-926e-7f1ffc66ff29\") " pod="calico-system/calico-node-tfnsm" May 27 02:48:49.247522 kubelet[3293]: I0527 02:48:49.247352 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/1d15249c-c3ad-4103-926e-7f1ffc66ff29-cni-bin-dir\") pod \"calico-node-tfnsm\" (UID: \"1d15249c-c3ad-4103-926e-7f1ffc66ff29\") " pod="calico-system/calico-node-tfnsm" May 27 02:48:49.247825 kubelet[3293]: I0527 02:48:49.247360 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/1d15249c-c3ad-4103-926e-7f1ffc66ff29-cni-log-dir\") pod \"calico-node-tfnsm\" (UID: \"1d15249c-c3ad-4103-926e-7f1ffc66ff29\") " pod="calico-system/calico-node-tfnsm" May 27 02:48:49.247825 kubelet[3293]: I0527 02:48:49.247370 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/1d15249c-c3ad-4103-926e-7f1ffc66ff29-cni-net-dir\") pod \"calico-node-tfnsm\" (UID: \"1d15249c-c3ad-4103-926e-7f1ffc66ff29\") " pod="calico-system/calico-node-tfnsm" May 27 02:48:49.247825 kubelet[3293]: I0527 02:48:49.247380 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/1d15249c-c3ad-4103-926e-7f1ffc66ff29-node-certs\") pod \"calico-node-tfnsm\" (UID: \"1d15249c-c3ad-4103-926e-7f1ffc66ff29\") " pod="calico-system/calico-node-tfnsm" May 27 02:48:49.247825 kubelet[3293]: I0527 02:48:49.247388 3293 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/1d15249c-c3ad-4103-926e-7f1ffc66ff29-policysync\") pod \"calico-node-tfnsm\" (UID: \"1d15249c-c3ad-4103-926e-7f1ffc66ff29\") " pod="calico-system/calico-node-tfnsm" May 27 02:48:49.247825 kubelet[3293]: I0527 02:48:49.247397 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/1d15249c-c3ad-4103-926e-7f1ffc66ff29-var-run-calico\") pod \"calico-node-tfnsm\" (UID: \"1d15249c-c3ad-4103-926e-7f1ffc66ff29\") " pod="calico-system/calico-node-tfnsm" May 27 02:48:49.247908 kubelet[3293]: I0527 02:48:49.247408 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1d15249c-c3ad-4103-926e-7f1ffc66ff29-var-lib-calico\") pod \"calico-node-tfnsm\" (UID: \"1d15249c-c3ad-4103-926e-7f1ffc66ff29\") " pod="calico-system/calico-node-tfnsm" May 27 02:48:49.330046 kubelet[3293]: E0527 02:48:49.329904 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ds6zz" podUID="c823d13d-65d4-46eb-8fbc-bfdffdd173a7" May 27 02:48:49.348802 kubelet[3293]: I0527 02:48:49.347592 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/c823d13d-65d4-46eb-8fbc-bfdffdd173a7-varrun\") pod \"csi-node-driver-ds6zz\" (UID: \"c823d13d-65d4-46eb-8fbc-bfdffdd173a7\") " pod="calico-system/csi-node-driver-ds6zz" May 27 02:48:49.348802 kubelet[3293]: I0527 02:48:49.347676 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c823d13d-65d4-46eb-8fbc-bfdffdd173a7-kubelet-dir\") pod \"csi-node-driver-ds6zz\" (UID: \"c823d13d-65d4-46eb-8fbc-bfdffdd173a7\") " pod="calico-system/csi-node-driver-ds6zz" May 27 02:48:49.348802 kubelet[3293]: I0527 02:48:49.347729 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xj27\" (UniqueName: \"kubernetes.io/projected/c823d13d-65d4-46eb-8fbc-bfdffdd173a7-kube-api-access-2xj27\") pod \"csi-node-driver-ds6zz\" (UID: \"c823d13d-65d4-46eb-8fbc-bfdffdd173a7\") " pod="calico-system/csi-node-driver-ds6zz" May 27 02:48:49.348802 kubelet[3293]: I0527 02:48:49.347745 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c823d13d-65d4-46eb-8fbc-bfdffdd173a7-registration-dir\") pod \"csi-node-driver-ds6zz\" (UID: \"c823d13d-65d4-46eb-8fbc-bfdffdd173a7\") " pod="calico-system/csi-node-driver-ds6zz" May 27 02:48:49.348802 kubelet[3293]: I0527 02:48:49.347760 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c823d13d-65d4-46eb-8fbc-bfdffdd173a7-socket-dir\") pod \"csi-node-driver-ds6zz\" (UID: \"c823d13d-65d4-46eb-8fbc-bfdffdd173a7\") " pod="calico-system/csi-node-driver-ds6zz" May 27 02:48:49.349633 kubelet[3293]: E0527 02:48:49.349599 3293 driver-call.go:262] Failed to unmarshal output for 
command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.349633 kubelet[3293]: W0527 02:48:49.349627 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.349916 kubelet[3293]: E0527 02:48:49.349654 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.349962 kubelet[3293]: E0527 02:48:49.349940 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.349983 kubelet[3293]: W0527 02:48:49.349960 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.349983 kubelet[3293]: E0527 02:48:49.349974 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.350162 kubelet[3293]: E0527 02:48:49.350147 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.350162 kubelet[3293]: W0527 02:48:49.350159 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.350323 kubelet[3293]: E0527 02:48:49.350206 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.350452 kubelet[3293]: E0527 02:48:49.350437 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.350452 kubelet[3293]: W0527 02:48:49.350449 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.350637 kubelet[3293]: E0527 02:48:49.350621 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.351074 kubelet[3293]: E0527 02:48:49.351051 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.351074 kubelet[3293]: W0527 02:48:49.351071 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.351360 kubelet[3293]: E0527 02:48:49.351247 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:48:49.351533 kubelet[3293]: E0527 02:48:49.351505 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.351604 kubelet[3293]: W0527 02:48:49.351584 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.351658 kubelet[3293]: E0527 02:48:49.351642 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.352168 kubelet[3293]: E0527 02:48:49.352144 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.352168 kubelet[3293]: W0527 02:48:49.352163 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.352314 kubelet[3293]: E0527 02:48:49.352289 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.353184 kubelet[3293]: E0527 02:48:49.352783 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.353184 kubelet[3293]: W0527 02:48:49.352799 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.353184 kubelet[3293]: E0527 02:48:49.352932 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.353184 kubelet[3293]: W0527 02:48:49.352939 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.354973 kubelet[3293]: E0527 02:48:49.354326 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.355518 kubelet[3293]: W0527 02:48:49.355491 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.355586 kubelet[3293]: E0527 02:48:49.354804 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.355605 kubelet[3293]: E0527 02:48:49.355060 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.355605 kubelet[3293]: E0527 02:48:49.355597 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:48:49.358361 kubelet[3293]: E0527 02:48:49.358208 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.358361 kubelet[3293]: W0527 02:48:49.358237 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.359232 kubelet[3293]: E0527 02:48:49.359011 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.361136 kubelet[3293]: E0527 02:48:49.360823 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.361136 kubelet[3293]: W0527 02:48:49.360848 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.361719 kubelet[3293]: E0527 02:48:49.361695 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.362080 kubelet[3293]: E0527 02:48:49.362056 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.362080 kubelet[3293]: W0527 02:48:49.362077 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.362443 kubelet[3293]: E0527 02:48:49.362403 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.362443 kubelet[3293]: W0527 02:48:49.362427 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.364636 kubelet[3293]: E0527 02:48:49.364577 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.364636 kubelet[3293]: W0527 02:48:49.364606 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.364835 kubelet[3293]: E0527 02:48:49.364818 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.364835 kubelet[3293]: W0527 02:48:49.364826 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.365196 kubelet[3293]: E0527 02:48:49.364983 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.365196 kubelet[3293]: W0527 02:48:49.365184 3293 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.365444 kubelet[3293]: E0527 02:48:49.365312 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.365588 kubelet[3293]: E0527 02:48:49.365519 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.365588 kubelet[3293]: E0527 02:48:49.365528 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.365588 kubelet[3293]: W0527 02:48:49.365534 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.365588 kubelet[3293]: E0527 02:48:49.365562 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.367427 kubelet[3293]: E0527 02:48:49.365519 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.368555 kubelet[3293]: E0527 02:48:49.367436 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.368555 kubelet[3293]: E0527 02:48:49.367448 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.370321 kubelet[3293]: E0527 02:48:49.367399 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.370928 kubelet[3293]: W0527 02:48:49.370763 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.370928 kubelet[3293]: E0527 02:48:49.370805 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.374701 kubelet[3293]: E0527 02:48:49.374379 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.374701 kubelet[3293]: W0527 02:48:49.374403 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.376068 kubelet[3293]: E0527 02:48:49.375510 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:48:49.377533 kubelet[3293]: E0527 02:48:49.377274 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.377889 kubelet[3293]: W0527 02:48:49.377654 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.378282 kubelet[3293]: E0527 02:48:49.378123 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.379557 kubelet[3293]: E0527 02:48:49.378965 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.379684 kubelet[3293]: W0527 02:48:49.379652 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.379829 kubelet[3293]: E0527 02:48:49.379803 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.380375 kubelet[3293]: E0527 02:48:49.380350 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.380375 kubelet[3293]: W0527 02:48:49.380365 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.380469 kubelet[3293]: E0527 02:48:49.380430 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.381148 kubelet[3293]: E0527 02:48:49.381118 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.381148 kubelet[3293]: W0527 02:48:49.381134 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.381456 kubelet[3293]: E0527 02:48:49.381441 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.381950 kubelet[3293]: E0527 02:48:49.381918 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.381950 kubelet[3293]: W0527 02:48:49.381939 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.382142 kubelet[3293]: E0527 02:48:49.382121 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:48:49.382531 kubelet[3293]: E0527 02:48:49.382423 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.382531 kubelet[3293]: W0527 02:48:49.382527 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.382700 kubelet[3293]: E0527 02:48:49.382676 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:49.383751 containerd[1879]: time="2025-05-27T02:48:49.383653799Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5b55fff4dd-9sh6j,Uid:b14a9489-3402-4fdf-bf69-e3ad8e03db6d,Namespace:calico-system,Attempt:0,}" May 27 02:48:49.467807 containerd[1879]: time="2025-05-27T02:48:49.467616755Z" level=info msg="connecting to shim a60b4ba2612ec9ecd66895fa899bd113c0e7435a8ac4baf74074ca5c45020437" address="unix:///run/containerd/s/85fe3569c9359d751257ff812723d0bda78cb3fafc1c39395b67609954600a05" namespace=k8s.io protocol=ttrpc version=3 May 27 02:48:49.501780 systemd[1]: Started cri-containerd-a60b4ba2612ec9ecd66895fa899bd113c0e7435a8ac4baf74074ca5c45020437.scope - libcontainer container a60b4ba2612ec9ecd66895fa899bd113c0e7435a8ac4baf74074ca5c45020437. May 27 02:48:49.502399 kubelet[3293]: E0527 02:48:49.502296 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:49.502399 kubelet[3293]: W0527 02:48:49.502322 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:49.502399 kubelet[3293]: E0527 02:48:49.502345 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:48:49.530869 containerd[1879]: time="2025-05-27T02:48:49.530293447Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-tfnsm,Uid:1d15249c-c3ad-4103-926e-7f1ffc66ff29,Namespace:calico-system,Attempt:0,}" May 27 02:48:49.588048 containerd[1879]: time="2025-05-27T02:48:49.587402697Z" level=info msg="connecting to shim 93ddd9d9a2389cbd3e2c7c7b6aacaa2d19b04cee50b9b836f6f40877f19e894e" address="unix:///run/containerd/s/4b489aa0d577229127af0ff201c591fbb9c8ea9b35bbb64987b4b65be6445196" namespace=k8s.io protocol=ttrpc version=3 May 27 02:48:49.621924 systemd[1]: Started cri-containerd-93ddd9d9a2389cbd3e2c7c7b6aacaa2d19b04cee50b9b836f6f40877f19e894e.scope - libcontainer container 93ddd9d9a2389cbd3e2c7c7b6aacaa2d19b04cee50b9b836f6f40877f19e894e. May 27 02:48:49.659184 containerd[1879]: time="2025-05-27T02:48:49.659041326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5b55fff4dd-9sh6j,Uid:b14a9489-3402-4fdf-bf69-e3ad8e03db6d,Namespace:calico-system,Attempt:0,} returns sandbox id \"a60b4ba2612ec9ecd66895fa899bd113c0e7435a8ac4baf74074ca5c45020437\"" May 27 02:48:49.664605 containerd[1879]: time="2025-05-27T02:48:49.664552087Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 27 02:48:49.675312 containerd[1879]: time="2025-05-27T02:48:49.675235864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-tfnsm,Uid:1d15249c-c3ad-4103-926e-7f1ffc66ff29,Namespace:calico-system,Attempt:0,} returns sandbox id \"93ddd9d9a2389cbd3e2c7c7b6aacaa2d19b04cee50b9b836f6f40877f19e894e\"" May 27 02:48:50.837526 kubelet[3293]: E0527 02:48:50.836962 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ds6zz" podUID="c823d13d-65d4-46eb-8fbc-bfdffdd173a7" May 27 02:48:50.969223 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3097738218.mount: Deactivated successfully. 
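The kubelet entries above repeat because its FlexVolume prober re-scans /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ on every plugin probe: for the nodeagent~uds directory it tries to execute the uds driver binary with the single argument init and to parse the driver's stdout as a JSON status object. The binary is not installed yet, so the call fails, and the resulting empty output is exactly what JSON decoding reports as "unexpected end of JSON input". The Go sketch below is a hypothetical illustration of that init handshake, not the kubelet's actual driver-call.go; the driverStatus type is a minimal subset of the FlexVolume output convention.

```go
// Hypothetical sketch of the FlexVolume "init" handshake the kubelet log is
// complaining about: exec the driver with "init" and unmarshal its stdout.
// Not the kubelet's real implementation; paths and types are illustrative.
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// Minimal subset of the JSON a FlexVolume driver is expected to print,
// e.g. {"status": "Success", "capabilities": {"attach": false}}.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func probeDriver(driverPath string) (*driverStatus, error) {
	out, err := exec.Command(driverPath, "init").CombinedOutput()
	if err != nil {
		// A missing or non-executable driver binary fails here; the kubelet
		// logs the same condition as the W "driver call failed" line.
		return nil, fmt.Errorf("driver call failed: %w, output: %q", err, out)
	}
	var st driverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		// Empty stdout lands here: "unexpected end of JSON input".
		return nil, fmt.Errorf("failed to unmarshal output %q: %w", out, err)
	}
	return &st, nil
}

func main() {
	st, err := probeDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds")
	if err != nil {
		fmt.Println("probe failed:", err)
		return
	}
	fmt.Printf("driver initialised: %+v\n", st)
}
```

The errors stop once a conforming driver is installed in that directory, which is what the flexvol-driver container created from the pod2daemon-flexvol image later in this log is responsible for.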
May 27 02:48:51.351681 containerd[1879]: time="2025-05-27T02:48:51.351633264Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:51.355644 containerd[1879]: time="2025-05-27T02:48:51.355586246Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=33020269" May 27 02:48:51.359533 containerd[1879]: time="2025-05-27T02:48:51.359458337Z" level=info msg="ImageCreate event name:\"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:51.362979 containerd[1879]: time="2025-05-27T02:48:51.362927977Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:51.363531 containerd[1879]: time="2025-05-27T02:48:51.363504361Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"33020123\" in 1.698713843s" May 27 02:48:51.363623 containerd[1879]: time="2025-05-27T02:48:51.363611740Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference \"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\"" May 27 02:48:51.365851 containerd[1879]: time="2025-05-27T02:48:51.365824170Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\"" May 27 02:48:51.378693 containerd[1879]: time="2025-05-27T02:48:51.378651022Z" level=info msg="CreateContainer within sandbox \"a60b4ba2612ec9ecd66895fa899bd113c0e7435a8ac4baf74074ca5c45020437\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 27 02:48:51.408906 containerd[1879]: time="2025-05-27T02:48:51.408849349Z" level=info msg="Container 07b5181547f200e8511cba2759791e502e2b4cf16bef5cf49e757422cd5a78aa: CDI devices from CRI Config.CDIDevices: []" May 27 02:48:51.432526 containerd[1879]: time="2025-05-27T02:48:51.432453396Z" level=info msg="CreateContainer within sandbox \"a60b4ba2612ec9ecd66895fa899bd113c0e7435a8ac4baf74074ca5c45020437\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"07b5181547f200e8511cba2759791e502e2b4cf16bef5cf49e757422cd5a78aa\"" May 27 02:48:51.433267 containerd[1879]: time="2025-05-27T02:48:51.433062469Z" level=info msg="StartContainer for \"07b5181547f200e8511cba2759791e502e2b4cf16bef5cf49e757422cd5a78aa\"" May 27 02:48:51.435027 containerd[1879]: time="2025-05-27T02:48:51.434917656Z" level=info msg="connecting to shim 07b5181547f200e8511cba2759791e502e2b4cf16bef5cf49e757422cd5a78aa" address="unix:///run/containerd/s/85fe3569c9359d751257ff812723d0bda78cb3fafc1c39395b67609954600a05" protocol=ttrpc version=3 May 27 02:48:51.459711 systemd[1]: Started cri-containerd-07b5181547f200e8511cba2759791e502e2b4cf16bef5cf49e757422cd5a78aa.scope - libcontainer container 07b5181547f200e8511cba2759791e502e2b4cf16bef5cf49e757422cd5a78aa. 
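The PullImage, ImageCreate and "Pulled image ... in 1.698713843s" entries above show containerd resolving and unpacking ghcr.io/flatcar/calico/typha:v3.30.0 in response to the kubelet's CRI request; the ImageCreate events carry the io.cri-containerd.image=managed label that containerd's CRI plugin puts on images it owns. As a rough, hypothetical equivalent (not how the kubelet itself drives containerd, which goes through the CRI gRPC API), the same pull can be issued with the containerd Go client against the socket and k8s.io namespace seen in this log:

```go
// Hypothetical sketch: pull the same image directly with the containerd Go
// client, in the k8s.io namespace this log uses. Illustrative only.
package main

import (
	"context"
	"fmt"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Match the namespace=k8s.io seen on the "connecting to shim" entries.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/typha:v3.30.0", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	// Corresponds to the "PullImage ... returns image reference" entry.
	fmt.Println("pulled", img.Name())
}
```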
May 27 02:48:51.502770 containerd[1879]: time="2025-05-27T02:48:51.502731260Z" level=info msg="StartContainer for \"07b5181547f200e8511cba2759791e502e2b4cf16bef5cf49e757422cd5a78aa\" returns successfully" May 27 02:48:51.950391 kubelet[3293]: E0527 02:48:51.950355 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:51.950391 kubelet[3293]: W0527 02:48:51.950378 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:51.950391 kubelet[3293]: E0527 02:48:51.950400 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:51.985024 kubelet[3293]: E0527 02:48:51.985005 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:51.985024 kubelet[3293]: W0527 02:48:51.985020 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:51.985082 kubelet[3293]: E0527 02:48:51.985030 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:48:52.836741 kubelet[3293]: E0527 02:48:52.836691 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ds6zz" podUID="c823d13d-65d4-46eb-8fbc-bfdffdd173a7" May 27 02:48:52.874121 containerd[1879]: time="2025-05-27T02:48:52.874061169Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:52.878031 containerd[1879]: time="2025-05-27T02:48:52.877857471Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4264304" May 27 02:48:52.882553 containerd[1879]: time="2025-05-27T02:48:52.882518877Z" level=info msg="ImageCreate event name:\"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:52.888743 containerd[1879]: time="2025-05-27T02:48:52.888459408Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:52.889115 containerd[1879]: time="2025-05-27T02:48:52.889075738Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5633505\" in 1.523090212s" May 27 02:48:52.889115 containerd[1879]: time="2025-05-27T02:48:52.889113643Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\"" May 27 02:48:52.892351 containerd[1879]: time="2025-05-27T02:48:52.892312591Z" level=info msg="CreateContainer within sandbox \"93ddd9d9a2389cbd3e2c7c7b6aacaa2d19b04cee50b9b836f6f40877f19e894e\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 27 02:48:52.923668 containerd[1879]: time="2025-05-27T02:48:52.923605061Z" level=info msg="Container c4854a0a9653f5924f2366d55a73f5ba97b1f3dfdc0a576cf5e7722edfde8a80: CDI devices from CRI Config.CDIDevices: []" May 27 02:48:52.924181 kubelet[3293]: I0527 02:48:52.923735 3293 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 02:48:52.947609 containerd[1879]: time="2025-05-27T02:48:52.947560767Z" level=info msg="CreateContainer within sandbox \"93ddd9d9a2389cbd3e2c7c7b6aacaa2d19b04cee50b9b836f6f40877f19e894e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"c4854a0a9653f5924f2366d55a73f5ba97b1f3dfdc0a576cf5e7722edfde8a80\"" May 27 02:48:52.948694 containerd[1879]: time="2025-05-27T02:48:52.948658647Z" level=info msg="StartContainer for \"c4854a0a9653f5924f2366d55a73f5ba97b1f3dfdc0a576cf5e7722edfde8a80\"" May 27 02:48:52.951352 containerd[1879]: time="2025-05-27T02:48:52.951314588Z" level=info msg="connecting to shim c4854a0a9653f5924f2366d55a73f5ba97b1f3dfdc0a576cf5e7722edfde8a80" 
address="unix:///run/containerd/s/4b489aa0d577229127af0ff201c591fbb9c8ea9b35bbb64987b4b65be6445196" protocol=ttrpc version=3 May 27 02:48:52.960670 kubelet[3293]: E0527 02:48:52.960380 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.960670 kubelet[3293]: W0527 02:48:52.960407 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.960670 kubelet[3293]: E0527 02:48:52.960429 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.962517 kubelet[3293]: E0527 02:48:52.961270 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.962517 kubelet[3293]: W0527 02:48:52.961297 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.962517 kubelet[3293]: E0527 02:48:52.961552 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.963191 kubelet[3293]: E0527 02:48:52.962658 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.963191 kubelet[3293]: W0527 02:48:52.962673 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.963191 kubelet[3293]: E0527 02:48:52.962690 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.963191 kubelet[3293]: E0527 02:48:52.962954 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.963191 kubelet[3293]: W0527 02:48:52.962975 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.963191 kubelet[3293]: E0527 02:48:52.962987 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.965148 kubelet[3293]: E0527 02:48:52.963241 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.965148 kubelet[3293]: W0527 02:48:52.963251 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.965148 kubelet[3293]: E0527 02:48:52.963261 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:48:52.965148 kubelet[3293]: E0527 02:48:52.963381 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.965148 kubelet[3293]: W0527 02:48:52.963388 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.965148 kubelet[3293]: E0527 02:48:52.963396 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.965148 kubelet[3293]: E0527 02:48:52.963559 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.965148 kubelet[3293]: W0527 02:48:52.963565 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.965148 kubelet[3293]: E0527 02:48:52.963573 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.965148 kubelet[3293]: E0527 02:48:52.963695 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.965305 kubelet[3293]: W0527 02:48:52.963701 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.965305 kubelet[3293]: E0527 02:48:52.963708 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.965305 kubelet[3293]: E0527 02:48:52.963837 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.965305 kubelet[3293]: W0527 02:48:52.963845 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.965305 kubelet[3293]: E0527 02:48:52.963852 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.965305 kubelet[3293]: E0527 02:48:52.963942 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.965305 kubelet[3293]: W0527 02:48:52.963948 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.965305 kubelet[3293]: E0527 02:48:52.963954 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:48:52.965305 kubelet[3293]: E0527 02:48:52.964045 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.965305 kubelet[3293]: W0527 02:48:52.964051 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.965444 kubelet[3293]: E0527 02:48:52.964068 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.965444 kubelet[3293]: E0527 02:48:52.964151 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.965444 kubelet[3293]: W0527 02:48:52.964158 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.965444 kubelet[3293]: E0527 02:48:52.964164 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.965444 kubelet[3293]: E0527 02:48:52.964262 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.965444 kubelet[3293]: W0527 02:48:52.964269 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.965444 kubelet[3293]: E0527 02:48:52.964275 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.965444 kubelet[3293]: E0527 02:48:52.964376 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.965444 kubelet[3293]: W0527 02:48:52.964382 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.965444 kubelet[3293]: E0527 02:48:52.964387 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.965606 kubelet[3293]: E0527 02:48:52.964481 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.965606 kubelet[3293]: W0527 02:48:52.964486 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.965606 kubelet[3293]: E0527 02:48:52.964492 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:48:52.977681 systemd[1]: Started cri-containerd-c4854a0a9653f5924f2366d55a73f5ba97b1f3dfdc0a576cf5e7722edfde8a80.scope - libcontainer container c4854a0a9653f5924f2366d55a73f5ba97b1f3dfdc0a576cf5e7722edfde8a80. May 27 02:48:52.986051 kubelet[3293]: E0527 02:48:52.986008 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.986051 kubelet[3293]: W0527 02:48:52.986036 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.986051 kubelet[3293]: E0527 02:48:52.986058 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.986511 kubelet[3293]: E0527 02:48:52.986291 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.986511 kubelet[3293]: W0527 02:48:52.986304 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.986511 kubelet[3293]: E0527 02:48:52.986316 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.986511 kubelet[3293]: E0527 02:48:52.986463 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.986511 kubelet[3293]: W0527 02:48:52.986469 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.986653 kubelet[3293]: E0527 02:48:52.986522 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.986696 kubelet[3293]: E0527 02:48:52.986682 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.986696 kubelet[3293]: W0527 02:48:52.986692 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.986737 kubelet[3293]: E0527 02:48:52.986705 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:48:52.986836 kubelet[3293]: E0527 02:48:52.986821 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.986836 kubelet[3293]: W0527 02:48:52.986829 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.986880 kubelet[3293]: E0527 02:48:52.986840 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.987360 kubelet[3293]: E0527 02:48:52.987332 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.987360 kubelet[3293]: W0527 02:48:52.987348 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.987438 kubelet[3293]: E0527 02:48:52.987396 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.987621 kubelet[3293]: E0527 02:48:52.987527 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.987621 kubelet[3293]: W0527 02:48:52.987537 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.987621 kubelet[3293]: E0527 02:48:52.987591 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.987789 kubelet[3293]: E0527 02:48:52.987706 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.987789 kubelet[3293]: W0527 02:48:52.987715 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.987789 kubelet[3293]: E0527 02:48:52.987737 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.987880 kubelet[3293]: E0527 02:48:52.987867 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.987880 kubelet[3293]: W0527 02:48:52.987875 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.987923 kubelet[3293]: E0527 02:48:52.987893 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:48:52.988245 kubelet[3293]: E0527 02:48:52.988228 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.988245 kubelet[3293]: W0527 02:48:52.988241 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.988399 kubelet[3293]: E0527 02:48:52.988255 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.988399 kubelet[3293]: E0527 02:48:52.988359 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.988399 kubelet[3293]: W0527 02:48:52.988364 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.988399 kubelet[3293]: E0527 02:48:52.988369 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.988581 kubelet[3293]: E0527 02:48:52.988516 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.988581 kubelet[3293]: W0527 02:48:52.988524 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.988581 kubelet[3293]: E0527 02:48:52.988534 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.989091 kubelet[3293]: E0527 02:48:52.988841 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.989091 kubelet[3293]: W0527 02:48:52.988854 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.989091 kubelet[3293]: E0527 02:48:52.988864 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.989203 kubelet[3293]: E0527 02:48:52.989101 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.989203 kubelet[3293]: W0527 02:48:52.989108 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.989203 kubelet[3293]: E0527 02:48:52.989116 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 02:48:52.989267 kubelet[3293]: E0527 02:48:52.989252 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.989267 kubelet[3293]: W0527 02:48:52.989262 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.989297 kubelet[3293]: E0527 02:48:52.989276 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.989536 kubelet[3293]: E0527 02:48:52.989520 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.989536 kubelet[3293]: W0527 02:48:52.989532 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.989705 kubelet[3293]: E0527 02:48:52.989543 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.990084 kubelet[3293]: E0527 02:48:52.990035 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.990084 kubelet[3293]: W0527 02:48:52.990050 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.990084 kubelet[3293]: E0527 02:48:52.990062 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:52.990259 kubelet[3293]: E0527 02:48:52.990204 3293 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 02:48:52.990259 kubelet[3293]: W0527 02:48:52.990213 3293 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 02:48:52.990259 kubelet[3293]: E0527 02:48:52.990219 3293 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 02:48:53.019494 containerd[1879]: time="2025-05-27T02:48:53.019367469Z" level=info msg="StartContainer for \"c4854a0a9653f5924f2366d55a73f5ba97b1f3dfdc0a576cf5e7722edfde8a80\" returns successfully" May 27 02:48:53.021880 systemd[1]: cri-containerd-c4854a0a9653f5924f2366d55a73f5ba97b1f3dfdc0a576cf5e7722edfde8a80.scope: Deactivated successfully. 
May 27 02:48:53.026136 containerd[1879]: time="2025-05-27T02:48:53.026089502Z" level=info msg="received exit event container_id:\"c4854a0a9653f5924f2366d55a73f5ba97b1f3dfdc0a576cf5e7722edfde8a80\" id:\"c4854a0a9653f5924f2366d55a73f5ba97b1f3dfdc0a576cf5e7722edfde8a80\" pid:4038 exited_at:{seconds:1748314133 nanos:25698163}" May 27 02:48:53.026350 containerd[1879]: time="2025-05-27T02:48:53.026197049Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c4854a0a9653f5924f2366d55a73f5ba97b1f3dfdc0a576cf5e7722edfde8a80\" id:\"c4854a0a9653f5924f2366d55a73f5ba97b1f3dfdc0a576cf5e7722edfde8a80\" pid:4038 exited_at:{seconds:1748314133 nanos:25698163}" May 27 02:48:53.050672 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c4854a0a9653f5924f2366d55a73f5ba97b1f3dfdc0a576cf5e7722edfde8a80-rootfs.mount: Deactivated successfully. May 27 02:48:53.942826 kubelet[3293]: I0527 02:48:53.942543 3293 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5b55fff4dd-9sh6j" podStartSLOduration=3.240294762 podStartE2EDuration="4.942525199s" podCreationTimestamp="2025-05-27 02:48:49 +0000 UTC" firstStartedPulling="2025-05-27 02:48:49.662151437 +0000 UTC m=+18.951468431" lastFinishedPulling="2025-05-27 02:48:51.364381866 +0000 UTC m=+20.653698868" observedRunningTime="2025-05-27 02:48:51.945071183 +0000 UTC m=+21.234388177" watchObservedRunningTime="2025-05-27 02:48:53.942525199 +0000 UTC m=+23.231842193" May 27 02:48:54.837368 kubelet[3293]: E0527 02:48:54.836705 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ds6zz" podUID="c823d13d-65d4-46eb-8fbc-bfdffdd173a7" May 27 02:48:54.934069 containerd[1879]: time="2025-05-27T02:48:54.933287504Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 27 02:48:56.836129 kubelet[3293]: E0527 02:48:56.836066 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ds6zz" podUID="c823d13d-65d4-46eb-8fbc-bfdffdd173a7" May 27 02:48:57.214024 containerd[1879]: time="2025-05-27T02:48:57.213484694Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:57.216531 containerd[1879]: time="2025-05-27T02:48:57.216492468Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=65748976" May 27 02:48:57.219668 containerd[1879]: time="2025-05-27T02:48:57.219635783Z" level=info msg="ImageCreate event name:\"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:57.223795 containerd[1879]: time="2025-05-27T02:48:57.223689380Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:48:57.224173 containerd[1879]: time="2025-05-27T02:48:57.224034798Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id 
\"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"67118217\" in 2.290708636s" May 27 02:48:57.224173 containerd[1879]: time="2025-05-27T02:48:57.224066070Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\"" May 27 02:48:57.226537 containerd[1879]: time="2025-05-27T02:48:57.226495932Z" level=info msg="CreateContainer within sandbox \"93ddd9d9a2389cbd3e2c7c7b6aacaa2d19b04cee50b9b836f6f40877f19e894e\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 27 02:48:57.262495 containerd[1879]: time="2025-05-27T02:48:57.262426368Z" level=info msg="Container 36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3: CDI devices from CRI Config.CDIDevices: []" May 27 02:48:57.289795 containerd[1879]: time="2025-05-27T02:48:57.289672369Z" level=info msg="CreateContainer within sandbox \"93ddd9d9a2389cbd3e2c7c7b6aacaa2d19b04cee50b9b836f6f40877f19e894e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3\"" May 27 02:48:57.291600 containerd[1879]: time="2025-05-27T02:48:57.290723359Z" level=info msg="StartContainer for \"36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3\"" May 27 02:48:57.292742 containerd[1879]: time="2025-05-27T02:48:57.292710384Z" level=info msg="connecting to shim 36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3" address="unix:///run/containerd/s/4b489aa0d577229127af0ff201c591fbb9c8ea9b35bbb64987b4b65be6445196" protocol=ttrpc version=3 May 27 02:48:57.316686 systemd[1]: Started cri-containerd-36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3.scope - libcontainer container 36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3. May 27 02:48:57.357748 containerd[1879]: time="2025-05-27T02:48:57.357702001Z" level=info msg="StartContainer for \"36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3\" returns successfully" May 27 02:48:58.566786 containerd[1879]: time="2025-05-27T02:48:58.566577518Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 02:48:58.571023 systemd[1]: cri-containerd-36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3.scope: Deactivated successfully. May 27 02:48:58.571767 systemd[1]: cri-containerd-36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3.scope: Consumed 371ms CPU time, 185.3M memory peak, 165.5M written to disk. 
May 27 02:48:58.572884 containerd[1879]: time="2025-05-27T02:48:58.572786625Z" level=info msg="received exit event container_id:\"36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3\" id:\"36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3\" pid:4116 exited_at:{seconds:1748314138 nanos:572552290}" May 27 02:48:58.573056 containerd[1879]: time="2025-05-27T02:48:58.573022143Z" level=info msg="TaskExit event in podsandbox handler container_id:\"36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3\" id:\"36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3\" pid:4116 exited_at:{seconds:1748314138 nanos:572552290}" May 27 02:48:58.597615 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-36d08f17277510833030d8b0b51344884cde03273372aa0423b4d9dfcbc8cdc3-rootfs.mount: Deactivated successfully. May 27 02:48:58.635136 kubelet[3293]: I0527 02:48:58.635093 3293 kubelet_node_status.go:501] "Fast updating node status as it just became ready" May 27 02:48:59.002814 kubelet[3293]: I0527 02:48:58.726912 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv4zv\" (UniqueName: \"kubernetes.io/projected/5594cdcd-4432-410c-abaa-5e2764692cf8-kube-api-access-wv4zv\") pod \"calico-apiserver-fcf964649-r2t2l\" (UID: \"5594cdcd-4432-410c-abaa-5e2764692cf8\") " pod="calico-apiserver/calico-apiserver-fcf964649-r2t2l" May 27 02:48:59.002814 kubelet[3293]: I0527 02:48:58.727254 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr4xb\" (UniqueName: \"kubernetes.io/projected/9611f34a-6a9a-4745-8c18-b491e412177c-kube-api-access-vr4xb\") pod \"calico-apiserver-fcf964649-js6pq\" (UID: \"9611f34a-6a9a-4745-8c18-b491e412177c\") " pod="calico-apiserver/calico-apiserver-fcf964649-js6pq" May 27 02:48:59.002814 kubelet[3293]: I0527 02:48:58.727283 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a22698a-34a8-450c-a9f9-8a53669e60b4-config\") pod \"goldmane-78d55f7ddc-ppbfc\" (UID: \"1a22698a-34a8-450c-a9f9-8a53669e60b4\") " pod="calico-system/goldmane-78d55f7ddc-ppbfc" May 27 02:48:59.002814 kubelet[3293]: I0527 02:48:58.727297 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/1a22698a-34a8-450c-a9f9-8a53669e60b4-goldmane-key-pair\") pod \"goldmane-78d55f7ddc-ppbfc\" (UID: \"1a22698a-34a8-450c-a9f9-8a53669e60b4\") " pod="calico-system/goldmane-78d55f7ddc-ppbfc" May 27 02:48:59.002814 kubelet[3293]: I0527 02:48:58.728730 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a22698a-34a8-450c-a9f9-8a53669e60b4-goldmane-ca-bundle\") pod \"goldmane-78d55f7ddc-ppbfc\" (UID: \"1a22698a-34a8-450c-a9f9-8a53669e60b4\") " pod="calico-system/goldmane-78d55f7ddc-ppbfc" May 27 02:48:58.678311 systemd[1]: Created slice kubepods-burstable-pod7ecc9ba8_534a_4499_b17a_21b491931340.slice - libcontainer container kubepods-burstable-pod7ecc9ba8_534a_4499_b17a_21b491931340.slice. 
May 27 02:48:59.003491 kubelet[3293]: I0527 02:48:58.728774 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5594cdcd-4432-410c-abaa-5e2764692cf8-calico-apiserver-certs\") pod \"calico-apiserver-fcf964649-r2t2l\" (UID: \"5594cdcd-4432-410c-abaa-5e2764692cf8\") " pod="calico-apiserver/calico-apiserver-fcf964649-r2t2l" May 27 02:48:59.003491 kubelet[3293]: I0527 02:48:58.728792 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ecc9ba8-534a-4499-b17a-21b491931340-config-volume\") pod \"coredns-668d6bf9bc-z5jn2\" (UID: \"7ecc9ba8-534a-4499-b17a-21b491931340\") " pod="kube-system/coredns-668d6bf9bc-z5jn2" May 27 02:48:59.003491 kubelet[3293]: I0527 02:48:58.728805 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvtl2\" (UniqueName: \"kubernetes.io/projected/7ecc9ba8-534a-4499-b17a-21b491931340-kube-api-access-bvtl2\") pod \"coredns-668d6bf9bc-z5jn2\" (UID: \"7ecc9ba8-534a-4499-b17a-21b491931340\") " pod="kube-system/coredns-668d6bf9bc-z5jn2" May 27 02:48:59.003491 kubelet[3293]: I0527 02:48:58.728817 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b-whisker-ca-bundle\") pod \"whisker-7d548f959d-8ckkv\" (UID: \"ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b\") " pod="calico-system/whisker-7d548f959d-8ckkv" May 27 02:48:59.003491 kubelet[3293]: I0527 02:48:58.728837 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e44b0335-44d6-403f-9889-bba3e30a3868-tigera-ca-bundle\") pod \"calico-kube-controllers-5879b6cfc8-5r62c\" (UID: \"e44b0335-44d6-403f-9889-bba3e30a3868\") " pod="calico-system/calico-kube-controllers-5879b6cfc8-5r62c" May 27 02:48:58.693761 systemd[1]: Created slice kubepods-burstable-podeadeaa32_d265_475c_ae9e_bb3b9b20cff9.slice - libcontainer container kubepods-burstable-podeadeaa32_d265_475c_ae9e_bb3b9b20cff9.slice. 
May 27 02:48:59.004675 kubelet[3293]: I0527 02:48:58.728852 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b-whisker-backend-key-pair\") pod \"whisker-7d548f959d-8ckkv\" (UID: \"ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b\") " pod="calico-system/whisker-7d548f959d-8ckkv" May 27 02:48:59.004675 kubelet[3293]: I0527 02:48:58.728862 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eadeaa32-d265-475c-ae9e-bb3b9b20cff9-config-volume\") pod \"coredns-668d6bf9bc-g45cr\" (UID: \"eadeaa32-d265-475c-ae9e-bb3b9b20cff9\") " pod="kube-system/coredns-668d6bf9bc-g45cr" May 27 02:48:59.004675 kubelet[3293]: I0527 02:48:58.728878 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzhh7\" (UniqueName: \"kubernetes.io/projected/e44b0335-44d6-403f-9889-bba3e30a3868-kube-api-access-zzhh7\") pod \"calico-kube-controllers-5879b6cfc8-5r62c\" (UID: \"e44b0335-44d6-403f-9889-bba3e30a3868\") " pod="calico-system/calico-kube-controllers-5879b6cfc8-5r62c" May 27 02:48:59.004675 kubelet[3293]: I0527 02:48:58.729145 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qll8k\" (UniqueName: \"kubernetes.io/projected/eadeaa32-d265-475c-ae9e-bb3b9b20cff9-kube-api-access-qll8k\") pod \"coredns-668d6bf9bc-g45cr\" (UID: \"eadeaa32-d265-475c-ae9e-bb3b9b20cff9\") " pod="kube-system/coredns-668d6bf9bc-g45cr" May 27 02:48:59.004675 kubelet[3293]: I0527 02:48:58.730427 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9611f34a-6a9a-4745-8c18-b491e412177c-calico-apiserver-certs\") pod \"calico-apiserver-fcf964649-js6pq\" (UID: \"9611f34a-6a9a-4745-8c18-b491e412177c\") " pod="calico-apiserver/calico-apiserver-fcf964649-js6pq" May 27 02:48:58.704415 systemd[1]: Created slice kubepods-besteffort-podebe41591_be14_40ec_ad4a_c0eaa1a0fe9b.slice - libcontainer container kubepods-besteffort-podebe41591_be14_40ec_ad4a_c0eaa1a0fe9b.slice. May 27 02:48:59.004803 kubelet[3293]: I0527 02:48:58.730454 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnvzt\" (UniqueName: \"kubernetes.io/projected/1a22698a-34a8-450c-a9f9-8a53669e60b4-kube-api-access-nnvzt\") pod \"goldmane-78d55f7ddc-ppbfc\" (UID: \"1a22698a-34a8-450c-a9f9-8a53669e60b4\") " pod="calico-system/goldmane-78d55f7ddc-ppbfc" May 27 02:48:59.004803 kubelet[3293]: I0527 02:48:58.730467 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6qhj\" (UniqueName: \"kubernetes.io/projected/ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b-kube-api-access-k6qhj\") pod \"whisker-7d548f959d-8ckkv\" (UID: \"ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b\") " pod="calico-system/whisker-7d548f959d-8ckkv" May 27 02:48:58.714571 systemd[1]: Created slice kubepods-besteffort-pod5594cdcd_4432_410c_abaa_5e2764692cf8.slice - libcontainer container kubepods-besteffort-pod5594cdcd_4432_410c_abaa_5e2764692cf8.slice. 
May 27 02:48:58.719923 systemd[1]: Created slice kubepods-besteffort-pode44b0335_44d6_403f_9889_bba3e30a3868.slice - libcontainer container kubepods-besteffort-pode44b0335_44d6_403f_9889_bba3e30a3868.slice. May 27 02:48:58.728189 systemd[1]: Created slice kubepods-besteffort-pod1a22698a_34a8_450c_a9f9_8a53669e60b4.slice - libcontainer container kubepods-besteffort-pod1a22698a_34a8_450c_a9f9_8a53669e60b4.slice. May 27 02:48:58.738603 systemd[1]: Created slice kubepods-besteffort-pod9611f34a_6a9a_4745_8c18_b491e412177c.slice - libcontainer container kubepods-besteffort-pod9611f34a_6a9a_4745_8c18_b491e412177c.slice. May 27 02:48:58.847391 systemd[1]: Created slice kubepods-besteffort-podc823d13d_65d4_46eb_8fbc_bfdffdd173a7.slice - libcontainer container kubepods-besteffort-podc823d13d_65d4_46eb_8fbc_bfdffdd173a7.slice. May 27 02:48:59.007684 containerd[1879]: time="2025-05-27T02:48:59.007411869Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ds6zz,Uid:c823d13d-65d4-46eb-8fbc-bfdffdd173a7,Namespace:calico-system,Attempt:0,}" May 27 02:48:59.307807 containerd[1879]: time="2025-05-27T02:48:59.307606624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-z5jn2,Uid:7ecc9ba8-534a-4499-b17a-21b491931340,Namespace:kube-system,Attempt:0,}" May 27 02:48:59.314662 containerd[1879]: time="2025-05-27T02:48:59.314603762Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fcf964649-js6pq,Uid:9611f34a-6a9a-4745-8c18-b491e412177c,Namespace:calico-apiserver,Attempt:0,}" May 27 02:48:59.314805 containerd[1879]: time="2025-05-27T02:48:59.314605026Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-ppbfc,Uid:1a22698a-34a8-450c-a9f9-8a53669e60b4,Namespace:calico-system,Attempt:0,}" May 27 02:48:59.330975 containerd[1879]: time="2025-05-27T02:48:59.330717514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fcf964649-r2t2l,Uid:5594cdcd-4432-410c-abaa-5e2764692cf8,Namespace:calico-apiserver,Attempt:0,}" May 27 02:48:59.330975 containerd[1879]: time="2025-05-27T02:48:59.330835781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-g45cr,Uid:eadeaa32-d265-475c-ae9e-bb3b9b20cff9,Namespace:kube-system,Attempt:0,}" May 27 02:48:59.331386 containerd[1879]: time="2025-05-27T02:48:59.331207944Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7d548f959d-8ckkv,Uid:ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b,Namespace:calico-system,Attempt:0,}" May 27 02:48:59.339312 containerd[1879]: time="2025-05-27T02:48:59.339207190Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5879b6cfc8-5r62c,Uid:e44b0335-44d6-403f-9889-bba3e30a3868,Namespace:calico-system,Attempt:0,}" May 27 02:48:59.529818 containerd[1879]: time="2025-05-27T02:48:59.529696992Z" level=error msg="Failed to destroy network for sandbox \"122b0903a4c6dee0c1a1824e805222f02f826a888324dd35480791cff31a51df\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.622419 containerd[1879]: time="2025-05-27T02:48:59.622286644Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fcf964649-r2t2l,Uid:5594cdcd-4432-410c-abaa-5e2764692cf8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"122b0903a4c6dee0c1a1824e805222f02f826a888324dd35480791cff31a51df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.623186 kubelet[3293]: E0527 02:48:59.623123 3293 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"122b0903a4c6dee0c1a1824e805222f02f826a888324dd35480791cff31a51df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.623252 kubelet[3293]: E0527 02:48:59.623210 3293 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"122b0903a4c6dee0c1a1824e805222f02f826a888324dd35480791cff31a51df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-fcf964649-r2t2l" May 27 02:48:59.623252 kubelet[3293]: E0527 02:48:59.623232 3293 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"122b0903a4c6dee0c1a1824e805222f02f826a888324dd35480791cff31a51df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-fcf964649-r2t2l" May 27 02:48:59.623644 kubelet[3293]: E0527 02:48:59.623269 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-fcf964649-r2t2l_calico-apiserver(5594cdcd-4432-410c-abaa-5e2764692cf8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-fcf964649-r2t2l_calico-apiserver(5594cdcd-4432-410c-abaa-5e2764692cf8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"122b0903a4c6dee0c1a1824e805222f02f826a888324dd35480791cff31a51df\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-fcf964649-r2t2l" podUID="5594cdcd-4432-410c-abaa-5e2764692cf8" May 27 02:48:59.684118 containerd[1879]: time="2025-05-27T02:48:59.684034296Z" level=error msg="Failed to destroy network for sandbox \"3466616c78eaf74b42d9e46bf04fc92da682bda9d4d509a720435eb661d28bdb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.687328 systemd[1]: run-netns-cni\x2de05e5aac\x2d88f5\x2da565\x2d0acb\x2d71528609ca16.mount: Deactivated successfully. 
May 27 02:48:59.694538 containerd[1879]: time="2025-05-27T02:48:59.694151443Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ds6zz,Uid:c823d13d-65d4-46eb-8fbc-bfdffdd173a7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3466616c78eaf74b42d9e46bf04fc92da682bda9d4d509a720435eb661d28bdb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.694838 kubelet[3293]: E0527 02:48:59.694551 3293 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3466616c78eaf74b42d9e46bf04fc92da682bda9d4d509a720435eb661d28bdb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.694838 kubelet[3293]: E0527 02:48:59.694641 3293 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3466616c78eaf74b42d9e46bf04fc92da682bda9d4d509a720435eb661d28bdb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ds6zz" May 27 02:48:59.694838 kubelet[3293]: E0527 02:48:59.694668 3293 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3466616c78eaf74b42d9e46bf04fc92da682bda9d4d509a720435eb661d28bdb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ds6zz" May 27 02:48:59.696534 kubelet[3293]: E0527 02:48:59.694727 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ds6zz_calico-system(c823d13d-65d4-46eb-8fbc-bfdffdd173a7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ds6zz_calico-system(c823d13d-65d4-46eb-8fbc-bfdffdd173a7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3466616c78eaf74b42d9e46bf04fc92da682bda9d4d509a720435eb661d28bdb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ds6zz" podUID="c823d13d-65d4-46eb-8fbc-bfdffdd173a7" May 27 02:48:59.722675 containerd[1879]: time="2025-05-27T02:48:59.721640387Z" level=error msg="Failed to destroy network for sandbox \"680fffc9896b8a65904536aa4b712f2b67b47053b977eb8172b6c86c3dae36ec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.725472 systemd[1]: run-netns-cni\x2d4267fd02\x2debab\x2d226a\x2d7e43\x2d526a3f6815b4.mount: Deactivated successfully. 
May 27 02:48:59.737389 containerd[1879]: time="2025-05-27T02:48:59.737275598Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-z5jn2,Uid:7ecc9ba8-534a-4499-b17a-21b491931340,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"680fffc9896b8a65904536aa4b712f2b67b47053b977eb8172b6c86c3dae36ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.738017 kubelet[3293]: E0527 02:48:59.737874 3293 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"680fffc9896b8a65904536aa4b712f2b67b47053b977eb8172b6c86c3dae36ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.738017 kubelet[3293]: E0527 02:48:59.737971 3293 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"680fffc9896b8a65904536aa4b712f2b67b47053b977eb8172b6c86c3dae36ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-z5jn2" May 27 02:48:59.738017 kubelet[3293]: E0527 02:48:59.737995 3293 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"680fffc9896b8a65904536aa4b712f2b67b47053b977eb8172b6c86c3dae36ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-z5jn2" May 27 02:48:59.738139 kubelet[3293]: E0527 02:48:59.738071 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-z5jn2_kube-system(7ecc9ba8-534a-4499-b17a-21b491931340)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-z5jn2_kube-system(7ecc9ba8-534a-4499-b17a-21b491931340)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"680fffc9896b8a65904536aa4b712f2b67b47053b977eb8172b6c86c3dae36ec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-z5jn2" podUID="7ecc9ba8-534a-4499-b17a-21b491931340" May 27 02:48:59.758890 containerd[1879]: time="2025-05-27T02:48:59.758793722Z" level=error msg="Failed to destroy network for sandbox \"6acd5e36072a38ec23abf27da22c9b80f9109125193d769b927290688ac24f77\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.763384 systemd[1]: run-netns-cni\x2d92149bcb\x2dc745\x2de560\x2de91a\x2d880ce5d3887f.mount: Deactivated successfully. 
May 27 02:48:59.765886 containerd[1879]: time="2025-05-27T02:48:59.765623599Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-g45cr,Uid:eadeaa32-d265-475c-ae9e-bb3b9b20cff9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6acd5e36072a38ec23abf27da22c9b80f9109125193d769b927290688ac24f77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.766253 kubelet[3293]: E0527 02:48:59.766200 3293 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6acd5e36072a38ec23abf27da22c9b80f9109125193d769b927290688ac24f77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.766526 kubelet[3293]: E0527 02:48:59.766378 3293 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6acd5e36072a38ec23abf27da22c9b80f9109125193d769b927290688ac24f77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-g45cr" May 27 02:48:59.766526 kubelet[3293]: E0527 02:48:59.766404 3293 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6acd5e36072a38ec23abf27da22c9b80f9109125193d769b927290688ac24f77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-g45cr" May 27 02:48:59.766526 kubelet[3293]: E0527 02:48:59.766471 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-g45cr_kube-system(eadeaa32-d265-475c-ae9e-bb3b9b20cff9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-g45cr_kube-system(eadeaa32-d265-475c-ae9e-bb3b9b20cff9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6acd5e36072a38ec23abf27da22c9b80f9109125193d769b927290688ac24f77\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-g45cr" podUID="eadeaa32-d265-475c-ae9e-bb3b9b20cff9" May 27 02:48:59.789897 containerd[1879]: time="2025-05-27T02:48:59.789811928Z" level=error msg="Failed to destroy network for sandbox \"716524a78df54dd03285680c0ca04d219959dc9a5fca7988d2fc99b373fe653c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.793365 systemd[1]: run-netns-cni\x2d21a9ada0\x2d54be\x2dc9c4\x2ddd50\x2d34ce1e0a7f9f.mount: Deactivated successfully. 
May 27 02:48:59.800941 containerd[1879]: time="2025-05-27T02:48:59.800740683Z" level=error msg="Failed to destroy network for sandbox \"72df461b5068a751009e491790ef59304bfcee7b0a3fb18c41875aa83aa4f837\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.803818 containerd[1879]: time="2025-05-27T02:48:59.803364542Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-ppbfc,Uid:1a22698a-34a8-450c-a9f9-8a53669e60b4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"716524a78df54dd03285680c0ca04d219959dc9a5fca7988d2fc99b373fe653c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.804923 kubelet[3293]: E0527 02:48:59.804546 3293 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"716524a78df54dd03285680c0ca04d219959dc9a5fca7988d2fc99b373fe653c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.806135 kubelet[3293]: E0527 02:48:59.805181 3293 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"716524a78df54dd03285680c0ca04d219959dc9a5fca7988d2fc99b373fe653c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-ppbfc" May 27 02:48:59.806135 kubelet[3293]: E0527 02:48:59.805213 3293 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"716524a78df54dd03285680c0ca04d219959dc9a5fca7988d2fc99b373fe653c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-ppbfc" May 27 02:48:59.806135 kubelet[3293]: E0527 02:48:59.805276 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-78d55f7ddc-ppbfc_calico-system(1a22698a-34a8-450c-a9f9-8a53669e60b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-78d55f7ddc-ppbfc_calico-system(1a22698a-34a8-450c-a9f9-8a53669e60b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"716524a78df54dd03285680c0ca04d219959dc9a5fca7988d2fc99b373fe653c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:48:59.807368 containerd[1879]: time="2025-05-27T02:48:59.806901996Z" level=error msg="Failed to destroy network for sandbox \"aec8e09b193da8a6855c8336e7315c57305997bb11f51e5c9e432279dd57a477\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" May 27 02:48:59.808056 containerd[1879]: time="2025-05-27T02:48:59.807759789Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fcf964649-js6pq,Uid:9611f34a-6a9a-4745-8c18-b491e412177c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"72df461b5068a751009e491790ef59304bfcee7b0a3fb18c41875aa83aa4f837\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.808413 kubelet[3293]: E0527 02:48:59.808216 3293 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72df461b5068a751009e491790ef59304bfcee7b0a3fb18c41875aa83aa4f837\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.808469 kubelet[3293]: E0527 02:48:59.808417 3293 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72df461b5068a751009e491790ef59304bfcee7b0a3fb18c41875aa83aa4f837\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-fcf964649-js6pq" May 27 02:48:59.808469 kubelet[3293]: E0527 02:48:59.808446 3293 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72df461b5068a751009e491790ef59304bfcee7b0a3fb18c41875aa83aa4f837\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-fcf964649-js6pq" May 27 02:48:59.808971 kubelet[3293]: E0527 02:48:59.808624 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-fcf964649-js6pq_calico-apiserver(9611f34a-6a9a-4745-8c18-b491e412177c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-fcf964649-js6pq_calico-apiserver(9611f34a-6a9a-4745-8c18-b491e412177c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"72df461b5068a751009e491790ef59304bfcee7b0a3fb18c41875aa83aa4f837\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-fcf964649-js6pq" podUID="9611f34a-6a9a-4745-8c18-b491e412177c" May 27 02:48:59.815901 containerd[1879]: time="2025-05-27T02:48:59.815798021Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7d548f959d-8ckkv,Uid:ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aec8e09b193da8a6855c8336e7315c57305997bb11f51e5c9e432279dd57a477\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.817311 kubelet[3293]: E0527 
02:48:59.816595 3293 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aec8e09b193da8a6855c8336e7315c57305997bb11f51e5c9e432279dd57a477\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.817696 kubelet[3293]: E0527 02:48:59.817628 3293 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aec8e09b193da8a6855c8336e7315c57305997bb11f51e5c9e432279dd57a477\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7d548f959d-8ckkv" May 27 02:48:59.817696 kubelet[3293]: E0527 02:48:59.817693 3293 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aec8e09b193da8a6855c8336e7315c57305997bb11f51e5c9e432279dd57a477\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7d548f959d-8ckkv" May 27 02:48:59.817832 kubelet[3293]: E0527 02:48:59.817754 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7d548f959d-8ckkv_calico-system(ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7d548f959d-8ckkv_calico-system(ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aec8e09b193da8a6855c8336e7315c57305997bb11f51e5c9e432279dd57a477\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7d548f959d-8ckkv" podUID="ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b" May 27 02:48:59.821059 containerd[1879]: time="2025-05-27T02:48:59.820686626Z" level=error msg="Failed to destroy network for sandbox \"73a418941f4bb26758d2d43d5c65d22e27f3117c46958526b26d3a50cd8ea647\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.831376 containerd[1879]: time="2025-05-27T02:48:59.831278179Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5879b6cfc8-5r62c,Uid:e44b0335-44d6-403f-9889-bba3e30a3868,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"73a418941f4bb26758d2d43d5c65d22e27f3117c46958526b26d3a50cd8ea647\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.831732 kubelet[3293]: E0527 02:48:59.831670 3293 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73a418941f4bb26758d2d43d5c65d22e27f3117c46958526b26d3a50cd8ea647\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" May 27 02:48:59.831868 kubelet[3293]: E0527 02:48:59.831842 3293 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73a418941f4bb26758d2d43d5c65d22e27f3117c46958526b26d3a50cd8ea647\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5879b6cfc8-5r62c" May 27 02:48:59.831912 kubelet[3293]: E0527 02:48:59.831876 3293 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73a418941f4bb26758d2d43d5c65d22e27f3117c46958526b26d3a50cd8ea647\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5879b6cfc8-5r62c" May 27 02:48:59.832006 kubelet[3293]: E0527 02:48:59.831972 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5879b6cfc8-5r62c_calico-system(e44b0335-44d6-403f-9889-bba3e30a3868)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5879b6cfc8-5r62c_calico-system(e44b0335-44d6-403f-9889-bba3e30a3868)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"73a418941f4bb26758d2d43d5c65d22e27f3117c46958526b26d3a50cd8ea647\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5879b6cfc8-5r62c" podUID="e44b0335-44d6-403f-9889-bba3e30a3868" May 27 02:48:59.952027 containerd[1879]: time="2025-05-27T02:48:59.951982655Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 27 02:49:00.597993 systemd[1]: run-netns-cni\x2dad38c170\x2df86d\x2d83c8\x2df8b2\x2d13da4768312a.mount: Deactivated successfully. May 27 02:49:00.598217 systemd[1]: run-netns-cni\x2dab0d087a\x2dfbb4\x2d3893\x2d8f2a\x2d7714a54ff907.mount: Deactivated successfully. May 27 02:49:00.598252 systemd[1]: run-netns-cni\x2d7d18477c\x2ddb31\x2d9e11\x2d7906\x2d7ddb62fff4cf.mount: Deactivated successfully. May 27 02:49:04.016836 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount997379309.mount: Deactivated successfully. 
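The repeated sandbox failures above all trace back to the same stat on /var/lib/calico/nodename, the file that calico/node writes once it is running with /var/lib/calico mounted; until it exists, every CNI add and delete fails with the message quoted in the log. A minimal Go sketch of that kind of pre-flight check (the path and the guidance text are taken from the log; the function name and structure are illustrative, not Calico's actual plugin code):

    package main

    import (
        "fmt"
        "os"
    )

    // nodenameFile is the path the errors above refer to; calico/node is expected
    // to write the node's name here once it has started and mounted /var/lib/calico.
    const nodenameFile = "/var/lib/calico/nodename"

    // checkNodename is an illustrative stand-in for the stat performed before
    // pod networking can be set up.
    func checkNodename() error {
        if _, err := os.Stat(nodenameFile); err != nil {
            return fmt.Errorf("stat %s: %w: check that the calico/node container is running and has mounted /var/lib/calico/", nodenameFile, err)
        }
        return nil
    }

    func main() {
        if err := checkNodename(); err != nil {
            fmt.Println("CNI add/delete would fail:", err)
            return
        }
        fmt.Println("nodename file present; CNI calls can proceed")
    }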
May 27 02:49:04.335612 containerd[1879]: time="2025-05-27T02:49:04.335062157Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:04.339206 containerd[1879]: time="2025-05-27T02:49:04.339156069Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=150465379" May 27 02:49:04.349042 containerd[1879]: time="2025-05-27T02:49:04.348960868Z" level=info msg="ImageCreate event name:\"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:04.353790 containerd[1879]: time="2025-05-27T02:49:04.353727066Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:04.354238 containerd[1879]: time="2025-05-27T02:49:04.354049653Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"150465241\" in 4.401824493s" May 27 02:49:04.354238 containerd[1879]: time="2025-05-27T02:49:04.354094686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\"" May 27 02:49:04.367783 containerd[1879]: time="2025-05-27T02:49:04.367737740Z" level=info msg="CreateContainer within sandbox \"93ddd9d9a2389cbd3e2c7c7b6aacaa2d19b04cee50b9b836f6f40877f19e894e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 27 02:49:04.396175 containerd[1879]: time="2025-05-27T02:49:04.395248159Z" level=info msg="Container ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:04.439958 containerd[1879]: time="2025-05-27T02:49:04.439905284Z" level=info msg="CreateContainer within sandbox \"93ddd9d9a2389cbd3e2c7c7b6aacaa2d19b04cee50b9b836f6f40877f19e894e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636\"" May 27 02:49:04.441448 containerd[1879]: time="2025-05-27T02:49:04.441405142Z" level=info msg="StartContainer for \"ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636\"" May 27 02:49:04.443832 containerd[1879]: time="2025-05-27T02:49:04.443670449Z" level=info msg="connecting to shim ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636" address="unix:///run/containerd/s/4b489aa0d577229127af0ff201c591fbb9c8ea9b35bbb64987b4b65be6445196" protocol=ttrpc version=3 May 27 02:49:04.461663 systemd[1]: Started cri-containerd-ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636.scope - libcontainer container ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636. May 27 02:49:04.499591 containerd[1879]: time="2025-05-27T02:49:04.498694831Z" level=info msg="StartContainer for \"ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636\" returns successfully" May 27 02:49:04.874977 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 27 02:49:04.875120 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. May 27 02:49:05.076934 kubelet[3293]: I0527 02:49:05.076885 3293 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b-whisker-ca-bundle\") pod \"ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b\" (UID: \"ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b\") " May 27 02:49:05.076934 kubelet[3293]: I0527 02:49:05.076944 3293 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6qhj\" (UniqueName: \"kubernetes.io/projected/ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b-kube-api-access-k6qhj\") pod \"ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b\" (UID: \"ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b\") " May 27 02:49:05.077385 kubelet[3293]: I0527 02:49:05.076968 3293 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b-whisker-backend-key-pair\") pod \"ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b\" (UID: \"ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b\") " May 27 02:49:05.081041 kubelet[3293]: I0527 02:49:05.080877 3293 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b" (UID: "ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 27 02:49:05.083312 systemd[1]: var-lib-kubelet-pods-ebe41591\x2dbe14\x2d40ec\x2dad4a\x2dc0eaa1a0fe9b-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. May 27 02:49:05.087704 kubelet[3293]: I0527 02:49:05.083614 3293 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b" (UID: "ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 27 02:49:05.088019 systemd[1]: var-lib-kubelet-pods-ebe41591\x2dbe14\x2d40ec\x2dad4a\x2dc0eaa1a0fe9b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dk6qhj.mount: Deactivated successfully. May 27 02:49:05.095438 kubelet[3293]: I0527 02:49:05.094952 3293 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b-kube-api-access-k6qhj" (OuterVolumeSpecName: "kube-api-access-k6qhj") pod "ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b" (UID: "ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b"). InnerVolumeSpecName "kube-api-access-k6qhj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 27 02:49:05.146447 containerd[1879]: time="2025-05-27T02:49:05.146163634Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636\" id:\"7078de35f46efd7b4bf896e41837f230164281874e46f2b899db58e0f8d70aa0\" pid:4457 exit_status:1 exited_at:{seconds:1748314145 nanos:145805206}" May 27 02:49:05.178437 kubelet[3293]: I0527 02:49:05.177706 3293 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b-whisker-ca-bundle\") on node \"ci-4344.0.0-a-583de22c75\" DevicePath \"\"" May 27 02:49:05.178712 kubelet[3293]: I0527 02:49:05.178547 3293 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k6qhj\" (UniqueName: \"kubernetes.io/projected/ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b-kube-api-access-k6qhj\") on node \"ci-4344.0.0-a-583de22c75\" DevicePath \"\"" May 27 02:49:05.178712 kubelet[3293]: I0527 02:49:05.178567 3293 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b-whisker-backend-key-pair\") on node \"ci-4344.0.0-a-583de22c75\" DevicePath \"\"" May 27 02:49:05.980491 systemd[1]: Removed slice kubepods-besteffort-podebe41591_be14_40ec_ad4a_c0eaa1a0fe9b.slice - libcontainer container kubepods-besteffort-podebe41591_be14_40ec_ad4a_c0eaa1a0fe9b.slice. May 27 02:49:06.001501 kubelet[3293]: I0527 02:49:06.000844 3293 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-tfnsm" podStartSLOduration=2.323293842 podStartE2EDuration="17.000824816s" podCreationTimestamp="2025-05-27 02:48:49 +0000 UTC" firstStartedPulling="2025-05-27 02:48:49.677531568 +0000 UTC m=+18.966848570" lastFinishedPulling="2025-05-27 02:49:04.355062526 +0000 UTC m=+33.644379544" observedRunningTime="2025-05-27 02:49:05.009363941 +0000 UTC m=+34.298680935" watchObservedRunningTime="2025-05-27 02:49:06.000824816 +0000 UTC m=+35.290141818" May 27 02:49:06.055142 systemd[1]: Created slice kubepods-besteffort-poded0f6931_9afa_4d7f_96af_71936ce08ef4.slice - libcontainer container kubepods-besteffort-poded0f6931_9afa_4d7f_96af_71936ce08ef4.slice. 
May 27 02:49:06.103167 containerd[1879]: time="2025-05-27T02:49:06.103097089Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636\" id:\"66844880a578db67a5ba7ccba7d4bdc502a7e73e588bb0f819cfa75766aafd12\" pid:4504 exit_status:1 exited_at:{seconds:1748314146 nanos:101956539}" May 27 02:49:06.186030 kubelet[3293]: I0527 02:49:06.185973 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnxrq\" (UniqueName: \"kubernetes.io/projected/ed0f6931-9afa-4d7f-96af-71936ce08ef4-kube-api-access-dnxrq\") pod \"whisker-fc8cbdb96-5hpj2\" (UID: \"ed0f6931-9afa-4d7f-96af-71936ce08ef4\") " pod="calico-system/whisker-fc8cbdb96-5hpj2" May 27 02:49:06.186030 kubelet[3293]: I0527 02:49:06.186037 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed0f6931-9afa-4d7f-96af-71936ce08ef4-whisker-ca-bundle\") pod \"whisker-fc8cbdb96-5hpj2\" (UID: \"ed0f6931-9afa-4d7f-96af-71936ce08ef4\") " pod="calico-system/whisker-fc8cbdb96-5hpj2" May 27 02:49:06.186588 kubelet[3293]: I0527 02:49:06.186050 3293 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ed0f6931-9afa-4d7f-96af-71936ce08ef4-whisker-backend-key-pair\") pod \"whisker-fc8cbdb96-5hpj2\" (UID: \"ed0f6931-9afa-4d7f-96af-71936ce08ef4\") " pod="calico-system/whisker-fc8cbdb96-5hpj2" May 27 02:49:06.363286 containerd[1879]: time="2025-05-27T02:49:06.361514218Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fc8cbdb96-5hpj2,Uid:ed0f6931-9afa-4d7f-96af-71936ce08ef4,Namespace:calico-system,Attempt:0,}" May 27 02:49:06.551314 systemd-networkd[1487]: cali2967478fd1a: Link UP May 27 02:49:06.552152 systemd-networkd[1487]: cali2967478fd1a: Gained carrier May 27 02:49:06.580722 containerd[1879]: 2025-05-27 02:49:06.411 [INFO][4558] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 02:49:06.580722 containerd[1879]: 2025-05-27 02:49:06.438 [INFO][4558] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-eth0 whisker-fc8cbdb96- calico-system ed0f6931-9afa-4d7f-96af-71936ce08ef4 855 0 2025-05-27 02:49:06 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:fc8cbdb96 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4344.0.0-a-583de22c75 whisker-fc8cbdb96-5hpj2 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali2967478fd1a [] [] }} ContainerID="0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" Namespace="calico-system" Pod="whisker-fc8cbdb96-5hpj2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-" May 27 02:49:06.580722 containerd[1879]: 2025-05-27 02:49:06.438 [INFO][4558] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" Namespace="calico-system" Pod="whisker-fc8cbdb96-5hpj2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-eth0" May 27 02:49:06.580722 containerd[1879]: 2025-05-27 02:49:06.495 [INFO][4614] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" HandleID="k8s-pod-network.0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" Workload="ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-eth0" May 27 02:49:06.581220 containerd[1879]: 2025-05-27 02:49:06.495 [INFO][4614] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" HandleID="k8s-pod-network.0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" Workload="ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400022f670), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-583de22c75", "pod":"whisker-fc8cbdb96-5hpj2", "timestamp":"2025-05-27 02:49:06.495468857 +0000 UTC"}, Hostname:"ci-4344.0.0-a-583de22c75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:06.581220 containerd[1879]: 2025-05-27 02:49:06.495 [INFO][4614] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:06.581220 containerd[1879]: 2025-05-27 02:49:06.495 [INFO][4614] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 02:49:06.581220 containerd[1879]: 2025-05-27 02:49:06.495 [INFO][4614] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-583de22c75' May 27 02:49:06.581220 containerd[1879]: 2025-05-27 02:49:06.502 [INFO][4614] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" host="ci-4344.0.0-a-583de22c75" May 27 02:49:06.581220 containerd[1879]: 2025-05-27 02:49:06.509 [INFO][4614] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-583de22c75" May 27 02:49:06.581220 containerd[1879]: 2025-05-27 02:49:06.515 [INFO][4614] ipam/ipam.go 511: Trying affinity for 192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:06.581220 containerd[1879]: 2025-05-27 02:49:06.517 [INFO][4614] ipam/ipam.go 158: Attempting to load block cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:06.581220 containerd[1879]: 2025-05-27 02:49:06.519 [INFO][4614] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:06.581655 containerd[1879]: 2025-05-27 02:49:06.519 [INFO][4614] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.26.128/26 handle="k8s-pod-network.0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" host="ci-4344.0.0-a-583de22c75" May 27 02:49:06.581655 containerd[1879]: 2025-05-27 02:49:06.521 [INFO][4614] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1 May 27 02:49:06.581655 containerd[1879]: 2025-05-27 02:49:06.526 [INFO][4614] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.26.128/26 handle="k8s-pod-network.0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" host="ci-4344.0.0-a-583de22c75" May 27 02:49:06.581655 containerd[1879]: 2025-05-27 02:49:06.535 [INFO][4614] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.26.129/26] block=192.168.26.128/26 handle="k8s-pod-network.0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" 
host="ci-4344.0.0-a-583de22c75" May 27 02:49:06.581655 containerd[1879]: 2025-05-27 02:49:06.536 [INFO][4614] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.26.129/26] handle="k8s-pod-network.0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" host="ci-4344.0.0-a-583de22c75" May 27 02:49:06.581655 containerd[1879]: 2025-05-27 02:49:06.536 [INFO][4614] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 02:49:06.581655 containerd[1879]: 2025-05-27 02:49:06.536 [INFO][4614] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.26.129/26] IPv6=[] ContainerID="0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" HandleID="k8s-pod-network.0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" Workload="ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-eth0" May 27 02:49:06.581892 containerd[1879]: 2025-05-27 02:49:06.538 [INFO][4558] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" Namespace="calico-system" Pod="whisker-fc8cbdb96-5hpj2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-eth0", GenerateName:"whisker-fc8cbdb96-", Namespace:"calico-system", SelfLink:"", UID:"ed0f6931-9afa-4d7f-96af-71936ce08ef4", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 49, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"fc8cbdb96", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"", Pod:"whisker-fc8cbdb96-5hpj2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.26.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2967478fd1a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:06.581892 containerd[1879]: 2025-05-27 02:49:06.538 [INFO][4558] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.129/32] ContainerID="0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" Namespace="calico-system" Pod="whisker-fc8cbdb96-5hpj2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-eth0" May 27 02:49:06.582085 containerd[1879]: 2025-05-27 02:49:06.538 [INFO][4558] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2967478fd1a ContainerID="0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" Namespace="calico-system" Pod="whisker-fc8cbdb96-5hpj2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-eth0" May 27 02:49:06.582085 containerd[1879]: 2025-05-27 02:49:06.553 [INFO][4558] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" Namespace="calico-system" Pod="whisker-fc8cbdb96-5hpj2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-eth0" May 27 02:49:06.582135 containerd[1879]: 2025-05-27 02:49:06.556 [INFO][4558] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" Namespace="calico-system" Pod="whisker-fc8cbdb96-5hpj2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-eth0", GenerateName:"whisker-fc8cbdb96-", Namespace:"calico-system", SelfLink:"", UID:"ed0f6931-9afa-4d7f-96af-71936ce08ef4", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 49, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"fc8cbdb96", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1", Pod:"whisker-fc8cbdb96-5hpj2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.26.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2967478fd1a", MAC:"1a:72:80:0f:65:bf", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:06.582175 containerd[1879]: 2025-05-27 02:49:06.576 [INFO][4558] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" Namespace="calico-system" Pod="whisker-fc8cbdb96-5hpj2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-whisker--fc8cbdb96--5hpj2-eth0" May 27 02:49:06.646906 containerd[1879]: time="2025-05-27T02:49:06.646637204Z" level=info msg="connecting to shim 0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1" address="unix:///run/containerd/s/c392558ee64f80193d9b0e6b1e6be425e025932527f019867f555861ee0fee02" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:06.678687 systemd[1]: Started cri-containerd-0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1.scope - libcontainer container 0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1. 
May 27 02:49:06.714490 containerd[1879]: time="2025-05-27T02:49:06.714432378Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fc8cbdb96-5hpj2,Uid:ed0f6931-9afa-4d7f-96af-71936ce08ef4,Namespace:calico-system,Attempt:0,} returns sandbox id \"0b92ceab0ce51d88f84944649c035a3d3690309e1cee813480dff0eacb932ac1\"" May 27 02:49:06.717658 containerd[1879]: time="2025-05-27T02:49:06.717542610Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 02:49:06.838261 kubelet[3293]: I0527 02:49:06.838208 3293 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b" path="/var/lib/kubelet/pods/ebe41591-be14-40ec-ad4a-c0eaa1a0fe9b/volumes" May 27 02:49:06.889943 containerd[1879]: time="2025-05-27T02:49:06.889812042Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:06.893238 containerd[1879]: time="2025-05-27T02:49:06.893156442Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:06.893238 containerd[1879]: time="2025-05-27T02:49:06.893201411Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 02:49:06.893423 kubelet[3293]: E0527 02:49:06.893375 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:49:06.893518 kubelet[3293]: E0527 02:49:06.893426 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:49:06.896397 kubelet[3293]: E0527 02:49:06.896329 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:781684bcf162405c83f157eefd25f0ec,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dnxrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fc8cbdb96-5hpj2_calico-system(ed0f6931-9afa-4d7f-96af-71936ce08ef4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:06.898679 containerd[1879]: time="2025-05-27T02:49:06.898547853Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 02:49:07.061068 containerd[1879]: time="2025-05-27T02:49:07.061011384Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:07.064105 containerd[1879]: time="2025-05-27T02:49:07.064029556Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:07.064105 containerd[1879]: time="2025-05-27T02:49:07.064069893Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 02:49:07.064325 kubelet[3293]: E0527 02:49:07.064281 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:49:07.064383 kubelet[3293]: E0527 02:49:07.064336 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:49:07.065133 kubelet[3293]: E0527 02:49:07.064440 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dnxrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fc8cbdb96-5hpj2_calico-system(ed0f6931-9afa-4d7f-96af-71936ce08ef4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:07.070497 kubelet[3293]: E0527 02:49:07.070210 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed 
to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:49:07.695666 systemd-networkd[1487]: cali2967478fd1a: Gained IPv6LL May 27 02:49:07.979933 kubelet[3293]: E0527 02:49:07.979527 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:49:09.848738 kernel: hrtimer: interrupt took 1188234 ns May 27 02:49:10.838494 containerd[1879]: time="2025-05-27T02:49:10.837716488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-z5jn2,Uid:7ecc9ba8-534a-4499-b17a-21b491931340,Namespace:kube-system,Attempt:0,}" May 27 02:49:10.838494 containerd[1879]: time="2025-05-27T02:49:10.838410564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-g45cr,Uid:eadeaa32-d265-475c-ae9e-bb3b9b20cff9,Namespace:kube-system,Attempt:0,}" May 27 02:49:10.838877 containerd[1879]: time="2025-05-27T02:49:10.837921374Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5879b6cfc8-5r62c,Uid:e44b0335-44d6-403f-9889-bba3e30a3868,Namespace:calico-system,Attempt:0,}" May 27 02:49:11.047135 systemd-networkd[1487]: cali0a2a5c1d8ed: Link UP May 27 02:49:11.048926 systemd-networkd[1487]: cali0a2a5c1d8ed: Gained carrier May 27 02:49:11.080003 containerd[1879]: 2025-05-27 02:49:10.885 [INFO][4757] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 02:49:11.080003 containerd[1879]: 2025-05-27 02:49:10.895 [INFO][4757] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-eth0 coredns-668d6bf9bc- kube-system 7ecc9ba8-534a-4499-b17a-21b491931340 778 0 2025-05-27 02:48:37 +0000 UTC 
map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344.0.0-a-583de22c75 coredns-668d6bf9bc-z5jn2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0a2a5c1d8ed [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" Namespace="kube-system" Pod="coredns-668d6bf9bc-z5jn2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-" May 27 02:49:11.080003 containerd[1879]: 2025-05-27 02:49:10.896 [INFO][4757] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" Namespace="kube-system" Pod="coredns-668d6bf9bc-z5jn2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-eth0" May 27 02:49:11.080003 containerd[1879]: 2025-05-27 02:49:10.951 [INFO][4787] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" HandleID="k8s-pod-network.1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" Workload="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-eth0" May 27 02:49:11.080314 containerd[1879]: 2025-05-27 02:49:10.951 [INFO][4787] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" HandleID="k8s-pod-network.1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" Workload="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d7a00), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344.0.0-a-583de22c75", "pod":"coredns-668d6bf9bc-z5jn2", "timestamp":"2025-05-27 02:49:10.951659993 +0000 UTC"}, Hostname:"ci-4344.0.0-a-583de22c75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:11.080314 containerd[1879]: 2025-05-27 02:49:10.951 [INFO][4787] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:11.080314 containerd[1879]: 2025-05-27 02:49:10.951 [INFO][4787] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
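The whisker and whisker-backend pulls a few entries back fail before any image data moves: the anonymous token request to ghcr.io returns 403 Forbidden, so containerd cannot resolve the reference and kubelet reports ErrImagePull, followed by ImagePullBackOff. A small probe that issues the same token request (the URL, scope and service parameters are copied verbatim from the log; this is a diagnostic sketch, not containerd's resolver):

    package main

    import (
        "fmt"
        "net/http"
    )

    func main() {
        // Token endpoint and scope copied verbatim from the failed pulls above.
        url := "https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io"
        resp, err := http.Get(url)
        if err != nil {
            fmt.Println("request failed:", err)
            return
        }
        defer resp.Body.Close()
        // A 403 here reproduces the "failed to fetch anonymous token" error that
        // kubelet surfaces as ErrImagePull and later as ImagePullBackOff.
        fmt.Println("anonymous token request status:", resp.Status)
    }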
May 27 02:49:11.080314 containerd[1879]: 2025-05-27 02:49:10.951 [INFO][4787] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-583de22c75' May 27 02:49:11.080314 containerd[1879]: 2025-05-27 02:49:10.966 [INFO][4787] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.080314 containerd[1879]: 2025-05-27 02:49:10.974 [INFO][4787] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.080314 containerd[1879]: 2025-05-27 02:49:10.993 [INFO][4787] ipam/ipam.go 511: Trying affinity for 192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.080314 containerd[1879]: 2025-05-27 02:49:10.997 [INFO][4787] ipam/ipam.go 158: Attempting to load block cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.080314 containerd[1879]: 2025-05-27 02:49:11.002 [INFO][4787] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.080817 containerd[1879]: 2025-05-27 02:49:11.002 [INFO][4787] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.26.128/26 handle="k8s-pod-network.1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.080817 containerd[1879]: 2025-05-27 02:49:11.004 [INFO][4787] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f May 27 02:49:11.080817 containerd[1879]: 2025-05-27 02:49:11.013 [INFO][4787] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.26.128/26 handle="k8s-pod-network.1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.080817 containerd[1879]: 2025-05-27 02:49:11.029 [INFO][4787] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.26.130/26] block=192.168.26.128/26 handle="k8s-pod-network.1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.080817 containerd[1879]: 2025-05-27 02:49:11.029 [INFO][4787] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.26.130/26] handle="k8s-pod-network.1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.080817 containerd[1879]: 2025-05-27 02:49:11.030 [INFO][4787] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 02:49:11.080817 containerd[1879]: 2025-05-27 02:49:11.030 [INFO][4787] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.26.130/26] IPv6=[] ContainerID="1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" HandleID="k8s-pod-network.1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" Workload="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-eth0" May 27 02:49:11.080927 containerd[1879]: 2025-05-27 02:49:11.037 [INFO][4757] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" Namespace="kube-system" Pod="coredns-668d6bf9bc-z5jn2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7ecc9ba8-534a-4499-b17a-21b491931340", ResourceVersion:"778", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"", Pod:"coredns-668d6bf9bc-z5jn2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.26.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0a2a5c1d8ed", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:11.080927 containerd[1879]: 2025-05-27 02:49:11.038 [INFO][4757] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.130/32] ContainerID="1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" Namespace="kube-system" Pod="coredns-668d6bf9bc-z5jn2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-eth0" May 27 02:49:11.080927 containerd[1879]: 2025-05-27 02:49:11.038 [INFO][4757] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0a2a5c1d8ed ContainerID="1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" Namespace="kube-system" Pod="coredns-668d6bf9bc-z5jn2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-eth0" May 27 02:49:11.080927 containerd[1879]: 2025-05-27 02:49:11.050 [INFO][4757] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-z5jn2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-eth0" May 27 02:49:11.080927 containerd[1879]: 2025-05-27 02:49:11.051 [INFO][4757] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" Namespace="kube-system" Pod="coredns-668d6bf9bc-z5jn2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7ecc9ba8-534a-4499-b17a-21b491931340", ResourceVersion:"778", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f", Pod:"coredns-668d6bf9bc-z5jn2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.26.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0a2a5c1d8ed", MAC:"c6:13:bf:13:0e:e1", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:11.080927 containerd[1879]: 2025-05-27 02:49:11.076 [INFO][4757] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" Namespace="kube-system" Pod="coredns-668d6bf9bc-z5jn2" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--z5jn2-eth0" May 27 02:49:11.115187 systemd-networkd[1487]: calia86cd0362ac: Link UP May 27 02:49:11.117905 systemd-networkd[1487]: calia86cd0362ac: Gained carrier May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:10.944 [INFO][4767] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:10.975 [INFO][4767] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-eth0 coredns-668d6bf9bc- kube-system eadeaa32-d265-475c-ae9e-bb3b9b20cff9 789 0 2025-05-27 02:48:37 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} 
{k8s ci-4344.0.0-a-583de22c75 coredns-668d6bf9bc-g45cr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia86cd0362ac [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" Namespace="kube-system" Pod="coredns-668d6bf9bc-g45cr" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-" May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:10.975 [INFO][4767] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" Namespace="kube-system" Pod="coredns-668d6bf9bc-g45cr" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-eth0" May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.046 [INFO][4806] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" HandleID="k8s-pod-network.c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" Workload="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-eth0" May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.048 [INFO][4806] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" HandleID="k8s-pod-network.c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" Workload="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002a94d0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344.0.0-a-583de22c75", "pod":"coredns-668d6bf9bc-g45cr", "timestamp":"2025-05-27 02:49:11.046624296 +0000 UTC"}, Hostname:"ci-4344.0.0-a-583de22c75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.049 [INFO][4806] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.049 [INFO][4806] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.049 [INFO][4806] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-583de22c75' May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.067 [INFO][4806] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.076 [INFO][4806] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.085 [INFO][4806] ipam/ipam.go 511: Trying affinity for 192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.087 [INFO][4806] ipam/ipam.go 158: Attempting to load block cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.090 [INFO][4806] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.090 [INFO][4806] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.26.128/26 handle="k8s-pod-network.c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.092 [INFO][4806] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861 May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.096 [INFO][4806] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.26.128/26 handle="k8s-pod-network.c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.104 [INFO][4806] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.26.131/26] block=192.168.26.128/26 handle="k8s-pod-network.c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.106 [INFO][4806] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.26.131/26] handle="k8s-pod-network.c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.106 [INFO][4806] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
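Both coredns assignments above pass through the same pair of markers, "About to acquire host-wide IPAM lock" and "Released host-wide IPAM lock", so even though the two CNI invocations overlap they claim distinct addresses, .130 and .131, from the one block. A minimal sketch of that serialization pattern, assuming nothing more than a mutex around a per-block cursor (which pod ends up with which address depends on scheduling; the log happens to show z5jn2 with .130 and g45cr with .131):

    package main

    import (
        "fmt"
        "sync"
    )

    var (
        // ipamLock plays the role of the host-wide IPAM lock in the entries above:
        // only one invocation may claim an address from the block at a time.
        ipamLock sync.Mutex
        nextHost = 130 // .129 already went to the whisker pod earlier in the log
    )

    func assign(pod string, out chan<- string) {
        ipamLock.Lock()
        ip := fmt.Sprintf("192.168.26.%d/26", nextHost)
        nextHost++
        ipamLock.Unlock()
        out <- fmt.Sprintf("%s -> %s", pod, ip)
    }

    func main() {
        out := make(chan string, 2)
        var wg sync.WaitGroup
        for _, pod := range []string{"coredns-668d6bf9bc-z5jn2", "coredns-668d6bf9bc-g45cr"} {
            wg.Add(1)
            go func(p string) {
                defer wg.Done()
                assign(p, out)
            }(pod)
        }
        wg.Wait()
        close(out)
        for line := range out {
            fmt.Println(line) // two distinct addresses, .130 and .131
        }
    }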
May 27 02:49:11.133670 containerd[1879]: 2025-05-27 02:49:11.106 [INFO][4806] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.26.131/26] IPv6=[] ContainerID="c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" HandleID="k8s-pod-network.c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" Workload="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-eth0" May 27 02:49:11.134124 containerd[1879]: 2025-05-27 02:49:11.109 [INFO][4767] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" Namespace="kube-system" Pod="coredns-668d6bf9bc-g45cr" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"eadeaa32-d265-475c-ae9e-bb3b9b20cff9", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"", Pod:"coredns-668d6bf9bc-g45cr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.26.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia86cd0362ac", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:11.134124 containerd[1879]: 2025-05-27 02:49:11.109 [INFO][4767] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.131/32] ContainerID="c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" Namespace="kube-system" Pod="coredns-668d6bf9bc-g45cr" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-eth0" May 27 02:49:11.134124 containerd[1879]: 2025-05-27 02:49:11.109 [INFO][4767] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia86cd0362ac ContainerID="c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" Namespace="kube-system" Pod="coredns-668d6bf9bc-g45cr" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-eth0" May 27 02:49:11.134124 containerd[1879]: 2025-05-27 02:49:11.118 [INFO][4767] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-g45cr" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-eth0" May 27 02:49:11.134124 containerd[1879]: 2025-05-27 02:49:11.119 [INFO][4767] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" Namespace="kube-system" Pod="coredns-668d6bf9bc-g45cr" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"eadeaa32-d265-475c-ae9e-bb3b9b20cff9", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861", Pod:"coredns-668d6bf9bc-g45cr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.26.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia86cd0362ac", MAC:"62:bf:69:1d:f4:de", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:11.134124 containerd[1879]: 2025-05-27 02:49:11.130 [INFO][4767] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" Namespace="kube-system" Pod="coredns-668d6bf9bc-g45cr" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-coredns--668d6bf9bc--g45cr-eth0" May 27 02:49:11.169510 containerd[1879]: time="2025-05-27T02:49:11.169389928Z" level=info msg="connecting to shim 1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f" address="unix:///run/containerd/s/6dcb6d79baf511535424dc724f68c422e239f67ad46d6b56d7319712b9532d7d" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:11.198685 systemd[1]: Started cri-containerd-1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f.scope - libcontainer container 1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f. 
May 27 02:49:11.204659 containerd[1879]: time="2025-05-27T02:49:11.204564341Z" level=info msg="connecting to shim c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861" address="unix:///run/containerd/s/3fcad7f4eb687f0b18648772d2d4750630a3366cb491ac8ab05f81924a208d11" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:11.238242 systemd-networkd[1487]: cali702789b45fa: Link UP May 27 02:49:11.238734 systemd-networkd[1487]: cali702789b45fa: Gained carrier May 27 02:49:11.240711 systemd[1]: Started cri-containerd-c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861.scope - libcontainer container c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861. May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:10.956 [INFO][4784] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:10.977 [INFO][4784] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-eth0 calico-kube-controllers-5879b6cfc8- calico-system e44b0335-44d6-403f-9889-bba3e30a3868 791 0 2025-05-27 02:48:49 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5879b6cfc8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4344.0.0-a-583de22c75 calico-kube-controllers-5879b6cfc8-5r62c eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali702789b45fa [] [] }} ContainerID="4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" Namespace="calico-system" Pod="calico-kube-controllers-5879b6cfc8-5r62c" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-" May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:10.981 [INFO][4784] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" Namespace="calico-system" Pod="calico-kube-controllers-5879b6cfc8-5r62c" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-eth0" May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.074 [INFO][4808] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" HandleID="k8s-pod-network.4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" Workload="ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-eth0" May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.075 [INFO][4808] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" HandleID="k8s-pod-network.4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" Workload="ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000389380), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-583de22c75", "pod":"calico-kube-controllers-5879b6cfc8-5r62c", "timestamp":"2025-05-27 02:49:11.074876662 +0000 UTC"}, Hostname:"ci-4344.0.0-a-583de22c75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.075 [INFO][4808] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.106 [INFO][4808] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.106 [INFO][4808] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-583de22c75' May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.171 [INFO][4808] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.180 [INFO][4808] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.187 [INFO][4808] ipam/ipam.go 511: Trying affinity for 192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.193 [INFO][4808] ipam/ipam.go 158: Attempting to load block cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.196 [INFO][4808] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.196 [INFO][4808] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.26.128/26 handle="k8s-pod-network.4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.200 [INFO][4808] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2 May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.209 [INFO][4808] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.26.128/26 handle="k8s-pod-network.4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.222 [INFO][4808] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.26.132/26] block=192.168.26.128/26 handle="k8s-pod-network.4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.222 [INFO][4808] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.26.132/26] handle="k8s-pod-network.4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" host="ci-4344.0.0-a-583de22c75" May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.223 [INFO][4808] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
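[editor's note] Note how the two IPAM requests serialize on the host-wide lock: [4808] logs "About to acquire" at 02:49:11.075 but only "Acquired" at 02:49:11.106, the same instant [4806] logs "Released". A plain mutex reproduces that ordering; Calico's real lock involves more machinery than this sketch.

// ipam_lock_sketch.go — two concurrent assignments serialized by one lock,
// matching the acquire/release ordering of [4806] and [4808] above.
package main

import (
	"fmt"
	"sync"
	"time"
)

func assign(name string, lock *sync.Mutex, wg *sync.WaitGroup) {
	defer wg.Done()
	fmt.Println(name, "about to acquire host-wide IPAM lock")
	lock.Lock()
	fmt.Println(name, "acquired host-wide IPAM lock")
	time.Sleep(30 * time.Millisecond) // stand-in for block lookup and claim
	lock.Unlock()
	fmt.Println(name, "released host-wide IPAM lock")
}

func main() {
	var lock sync.Mutex
	var wg sync.WaitGroup
	wg.Add(2)
	go assign("[4806]", &lock, &wg)
	go assign("[4808]", &lock, &wg)
	wg.Wait()
}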
May 27 02:49:11.262890 containerd[1879]: 2025-05-27 02:49:11.223 [INFO][4808] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.26.132/26] IPv6=[] ContainerID="4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" HandleID="k8s-pod-network.4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" Workload="ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-eth0" May 27 02:49:11.263365 containerd[1879]: 2025-05-27 02:49:11.227 [INFO][4784] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" Namespace="calico-system" Pod="calico-kube-controllers-5879b6cfc8-5r62c" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-eth0", GenerateName:"calico-kube-controllers-5879b6cfc8-", Namespace:"calico-system", SelfLink:"", UID:"e44b0335-44d6-403f-9889-bba3e30a3868", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5879b6cfc8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"", Pod:"calico-kube-controllers-5879b6cfc8-5r62c", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.26.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali702789b45fa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:11.263365 containerd[1879]: 2025-05-27 02:49:11.228 [INFO][4784] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.132/32] ContainerID="4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" Namespace="calico-system" Pod="calico-kube-controllers-5879b6cfc8-5r62c" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-eth0" May 27 02:49:11.263365 containerd[1879]: 2025-05-27 02:49:11.228 [INFO][4784] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali702789b45fa ContainerID="4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" Namespace="calico-system" Pod="calico-kube-controllers-5879b6cfc8-5r62c" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-eth0" May 27 02:49:11.263365 containerd[1879]: 2025-05-27 02:49:11.239 [INFO][4784] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" Namespace="calico-system" Pod="calico-kube-controllers-5879b6cfc8-5r62c" 
WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-eth0" May 27 02:49:11.263365 containerd[1879]: 2025-05-27 02:49:11.240 [INFO][4784] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" Namespace="calico-system" Pod="calico-kube-controllers-5879b6cfc8-5r62c" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-eth0", GenerateName:"calico-kube-controllers-5879b6cfc8-", Namespace:"calico-system", SelfLink:"", UID:"e44b0335-44d6-403f-9889-bba3e30a3868", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5879b6cfc8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2", Pod:"calico-kube-controllers-5879b6cfc8-5r62c", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.26.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali702789b45fa", MAC:"12:9b:b6:4c:21:cc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:11.263365 containerd[1879]: 2025-05-27 02:49:11.259 [INFO][4784] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" Namespace="calico-system" Pod="calico-kube-controllers-5879b6cfc8-5r62c" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--kube--controllers--5879b6cfc8--5r62c-eth0" May 27 02:49:11.287900 containerd[1879]: time="2025-05-27T02:49:11.287840971Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-z5jn2,Uid:7ecc9ba8-534a-4499-b17a-21b491931340,Namespace:kube-system,Attempt:0,} returns sandbox id \"1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f\"" May 27 02:49:11.292004 containerd[1879]: time="2025-05-27T02:49:11.291966914Z" level=info msg="CreateContainer within sandbox \"1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 02:49:11.326945 containerd[1879]: time="2025-05-27T02:49:11.326887680Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-g45cr,Uid:eadeaa32-d265-475c-ae9e-bb3b9b20cff9,Namespace:kube-system,Attempt:0,} returns sandbox id \"c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861\"" May 27 02:49:11.330258 containerd[1879]: time="2025-05-27T02:49:11.330217256Z" level=info msg="CreateContainer 
within sandbox \"c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 02:49:11.337706 containerd[1879]: time="2025-05-27T02:49:11.337660758Z" level=info msg="Container 6b09482cba249a42ba1a9d68df038b5763c0ee51e1dab3411d249bb2687b1fd4: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:11.347159 containerd[1879]: time="2025-05-27T02:49:11.347098374Z" level=info msg="connecting to shim 4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2" address="unix:///run/containerd/s/3e99507d579dfa975eacebf455feb8603dc3e8757537e1e11a68f32e9f32d306" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:11.367899 systemd[1]: Started cri-containerd-4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2.scope - libcontainer container 4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2. May 27 02:49:11.379696 containerd[1879]: time="2025-05-27T02:49:11.379633951Z" level=info msg="CreateContainer within sandbox \"1a06bd0f0dcdc89833a376483884fc6f266d4610988b44b707bd48629715ae0f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"6b09482cba249a42ba1a9d68df038b5763c0ee51e1dab3411d249bb2687b1fd4\"" May 27 02:49:11.380978 containerd[1879]: time="2025-05-27T02:49:11.380659741Z" level=info msg="StartContainer for \"6b09482cba249a42ba1a9d68df038b5763c0ee51e1dab3411d249bb2687b1fd4\"" May 27 02:49:11.382668 containerd[1879]: time="2025-05-27T02:49:11.382632269Z" level=info msg="connecting to shim 6b09482cba249a42ba1a9d68df038b5763c0ee51e1dab3411d249bb2687b1fd4" address="unix:///run/containerd/s/6dcb6d79baf511535424dc724f68c422e239f67ad46d6b56d7319712b9532d7d" protocol=ttrpc version=3 May 27 02:49:11.387453 kubelet[3293]: I0527 02:49:11.387300 3293 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 02:49:11.390486 containerd[1879]: time="2025-05-27T02:49:11.390395365Z" level=info msg="Container 0615803c5ab9f45212dacdbc6e9f0ab538a90870a60d012a688c8b525cde3c10: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:11.412926 systemd[1]: Started cri-containerd-6b09482cba249a42ba1a9d68df038b5763c0ee51e1dab3411d249bb2687b1fd4.scope - libcontainer container 6b09482cba249a42ba1a9d68df038b5763c0ee51e1dab3411d249bb2687b1fd4. May 27 02:49:11.416154 containerd[1879]: time="2025-05-27T02:49:11.416101041Z" level=info msg="CreateContainer within sandbox \"c8949bb1cf5ab146a80f7780623c953d6678ad4d4008d8a23f3b30f22577c861\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0615803c5ab9f45212dacdbc6e9f0ab538a90870a60d012a688c8b525cde3c10\"" May 27 02:49:11.417918 containerd[1879]: time="2025-05-27T02:49:11.417866836Z" level=info msg="StartContainer for \"0615803c5ab9f45212dacdbc6e9f0ab538a90870a60d012a688c8b525cde3c10\"" May 27 02:49:11.420280 containerd[1879]: time="2025-05-27T02:49:11.420237312Z" level=info msg="connecting to shim 0615803c5ab9f45212dacdbc6e9f0ab538a90870a60d012a688c8b525cde3c10" address="unix:///run/containerd/s/3fcad7f4eb687f0b18648772d2d4750630a3366cb491ac8ab05f81924a208d11" protocol=ttrpc version=3 May 27 02:49:11.447744 systemd[1]: Started cri-containerd-0615803c5ab9f45212dacdbc6e9f0ab538a90870a60d012a688c8b525cde3c10.scope - libcontainer container 0615803c5ab9f45212dacdbc6e9f0ab538a90870a60d012a688c8b525cde3c10. 
May 27 02:49:11.484171 containerd[1879]: time="2025-05-27T02:49:11.484111504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5879b6cfc8-5r62c,Uid:e44b0335-44d6-403f-9889-bba3e30a3868,Namespace:calico-system,Attempt:0,} returns sandbox id \"4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2\"" May 27 02:49:11.489681 containerd[1879]: time="2025-05-27T02:49:11.489519228Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 27 02:49:11.509519 containerd[1879]: time="2025-05-27T02:49:11.509436985Z" level=info msg="StartContainer for \"0615803c5ab9f45212dacdbc6e9f0ab538a90870a60d012a688c8b525cde3c10\" returns successfully" May 27 02:49:11.510601 containerd[1879]: time="2025-05-27T02:49:11.509758675Z" level=info msg="StartContainer for \"6b09482cba249a42ba1a9d68df038b5763c0ee51e1dab3411d249bb2687b1fd4\" returns successfully" May 27 02:49:12.007496 kubelet[3293]: I0527 02:49:12.007194 3293 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-g45cr" podStartSLOduration=35.007176384 podStartE2EDuration="35.007176384s" podCreationTimestamp="2025-05-27 02:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 02:49:12.006953521 +0000 UTC m=+41.296270515" watchObservedRunningTime="2025-05-27 02:49:12.007176384 +0000 UTC m=+41.296493378" May 27 02:49:12.031595 kubelet[3293]: I0527 02:49:12.030144 3293 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-z5jn2" podStartSLOduration=35.030125021 podStartE2EDuration="35.030125021s" podCreationTimestamp="2025-05-27 02:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 02:49:12.029467618 +0000 UTC m=+41.318784612" watchObservedRunningTime="2025-05-27 02:49:12.030125021 +0000 UTC m=+41.319442015" May 27 02:49:12.176641 systemd-networkd[1487]: calia86cd0362ac: Gained IPv6LL May 27 02:49:12.190343 systemd-networkd[1487]: vxlan.calico: Link UP May 27 02:49:12.190351 systemd-networkd[1487]: vxlan.calico: Gained carrier May 27 02:49:12.304090 systemd-networkd[1487]: cali0a2a5c1d8ed: Gained IPv6LL May 27 02:49:12.431654 systemd-networkd[1487]: cali702789b45fa: Gained IPv6LL May 27 02:49:12.838137 containerd[1879]: time="2025-05-27T02:49:12.838016312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fcf964649-r2t2l,Uid:5594cdcd-4432-410c-abaa-5e2764692cf8,Namespace:calico-apiserver,Attempt:0,}" May 27 02:49:12.839650 containerd[1879]: time="2025-05-27T02:49:12.839470834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fcf964649-js6pq,Uid:9611f34a-6a9a-4745-8c18-b491e412177c,Namespace:calico-apiserver,Attempt:0,}" May 27 02:49:13.056300 systemd-networkd[1487]: cali68e57ae6a37: Link UP May 27 02:49:13.057927 systemd-networkd[1487]: cali68e57ae6a37: Gained carrier May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:12.921 [INFO][5181] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-eth0 calico-apiserver-fcf964649- calico-apiserver 5594cdcd-4432-410c-abaa-5e2764692cf8 790 0 2025-05-27 02:48:46 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver 
pod-template-hash:fcf964649 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344.0.0-a-583de22c75 calico-apiserver-fcf964649-r2t2l eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali68e57ae6a37 [] [] }} ContainerID="00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-r2t2l" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-" May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:12.922 [INFO][5181] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-r2t2l" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-eth0" May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:12.969 [INFO][5209] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" HandleID="k8s-pod-network.00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" Workload="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-eth0" May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:12.970 [INFO][5209] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" HandleID="k8s-pod-network.00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" Workload="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d7660), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344.0.0-a-583de22c75", "pod":"calico-apiserver-fcf964649-r2t2l", "timestamp":"2025-05-27 02:49:12.969805267 +0000 UTC"}, Hostname:"ci-4344.0.0-a-583de22c75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:12.970 [INFO][5209] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:12.970 [INFO][5209] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
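[editor's note] A few records back, the kubelet reports podStartSLOduration=35.007176384 for kube-system/coredns-668d6bf9bc-g45cr. That figure is simply observedRunningTime minus podCreationTimestamp; the sketch below reproduces the arithmetic with the timestamps copied from the log (the layout string matches their printed form).

// slo_duration_sketch.go — recompute the logged podStartSLOduration.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	created, err := time.Parse(layout, "2025-05-27 02:48:37 +0000 UTC")
	if err != nil {
		panic(err)
	}
	running, err := time.Parse(layout, "2025-05-27 02:49:12.007176384 +0000 UTC")
	if err != nil {
		panic(err)
	}

	fmt.Println(running.Sub(created)) // 35.007176384s — the logged podStartSLOduration
}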
May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:12.970 [INFO][5209] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-583de22c75' May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:12.983 [INFO][5209] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:12.991 [INFO][5209] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:13.004 [INFO][5209] ipam/ipam.go 511: Trying affinity for 192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:13.008 [INFO][5209] ipam/ipam.go 158: Attempting to load block cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:13.013 [INFO][5209] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:13.013 [INFO][5209] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.26.128/26 handle="k8s-pod-network.00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:13.015 [INFO][5209] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060 May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:13.029 [INFO][5209] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.26.128/26 handle="k8s-pod-network.00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:13.041 [INFO][5209] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.26.133/26] block=192.168.26.128/26 handle="k8s-pod-network.00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:13.041 [INFO][5209] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.26.133/26] handle="k8s-pod-network.00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:13.042 [INFO][5209] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 02:49:13.081164 containerd[1879]: 2025-05-27 02:49:13.042 [INFO][5209] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.26.133/26] IPv6=[] ContainerID="00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" HandleID="k8s-pod-network.00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" Workload="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-eth0" May 27 02:49:13.082364 containerd[1879]: 2025-05-27 02:49:13.050 [INFO][5181] cni-plugin/k8s.go 418: Populated endpoint ContainerID="00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-r2t2l" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-eth0", GenerateName:"calico-apiserver-fcf964649-", Namespace:"calico-apiserver", SelfLink:"", UID:"5594cdcd-4432-410c-abaa-5e2764692cf8", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fcf964649", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"", Pod:"calico-apiserver-fcf964649-r2t2l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.26.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali68e57ae6a37", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:13.082364 containerd[1879]: 2025-05-27 02:49:13.051 [INFO][5181] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.133/32] ContainerID="00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-r2t2l" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-eth0" May 27 02:49:13.082364 containerd[1879]: 2025-05-27 02:49:13.051 [INFO][5181] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali68e57ae6a37 ContainerID="00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-r2t2l" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-eth0" May 27 02:49:13.082364 containerd[1879]: 2025-05-27 02:49:13.056 [INFO][5181] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-r2t2l" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-eth0" May 27 02:49:13.082364 containerd[1879]: 2025-05-27 02:49:13.057 [INFO][5181] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-r2t2l" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-eth0", GenerateName:"calico-apiserver-fcf964649-", Namespace:"calico-apiserver", SelfLink:"", UID:"5594cdcd-4432-410c-abaa-5e2764692cf8", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fcf964649", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060", Pod:"calico-apiserver-fcf964649-r2t2l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.26.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali68e57ae6a37", MAC:"da:6a:56:94:a8:2f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:13.082364 containerd[1879]: 2025-05-27 02:49:13.074 [INFO][5181] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-r2t2l" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--r2t2l-eth0" May 27 02:49:13.160670 containerd[1879]: time="2025-05-27T02:49:13.160588562Z" level=info msg="connecting to shim 00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060" address="unix:///run/containerd/s/c31680cf8a0bbb49981392db9996ae8fbdba5e859b81e3be105235de52a70655" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:13.167757 systemd-networkd[1487]: cali954dfcd813d: Link UP May 27 02:49:13.171140 systemd-networkd[1487]: cali954dfcd813d: Gained carrier May 27 02:49:13.199080 systemd[1]: Started cri-containerd-00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060.scope - libcontainer container 00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060. 
May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:12.939 [INFO][5190] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-eth0 calico-apiserver-fcf964649- calico-apiserver 9611f34a-6a9a-4745-8c18-b491e412177c 783 0 2025-05-27 02:48:46 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:fcf964649 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344.0.0-a-583de22c75 calico-apiserver-fcf964649-js6pq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali954dfcd813d [] [] }} ContainerID="5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-js6pq" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-" May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:12.940 [INFO][5190] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-js6pq" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-eth0" May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.003 [INFO][5214] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" HandleID="k8s-pod-network.5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" Workload="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-eth0" May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.004 [INFO][5214] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" HandleID="k8s-pod-network.5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" Workload="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cf020), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344.0.0-a-583de22c75", "pod":"calico-apiserver-fcf964649-js6pq", "timestamp":"2025-05-27 02:49:13.003352737 +0000 UTC"}, Hostname:"ci-4344.0.0-a-583de22c75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.004 [INFO][5214] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.042 [INFO][5214] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.042 [INFO][5214] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-583de22c75' May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.084 [INFO][5214] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.099 [INFO][5214] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.111 [INFO][5214] ipam/ipam.go 511: Trying affinity for 192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.117 [INFO][5214] ipam/ipam.go 158: Attempting to load block cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.125 [INFO][5214] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.125 [INFO][5214] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.26.128/26 handle="k8s-pod-network.5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.127 [INFO][5214] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913 May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.138 [INFO][5214] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.26.128/26 handle="k8s-pod-network.5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.150 [INFO][5214] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.26.134/26] block=192.168.26.128/26 handle="k8s-pod-network.5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.150 [INFO][5214] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.26.134/26] handle="k8s-pod-network.5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" host="ci-4344.0.0-a-583de22c75" May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.150 [INFO][5214] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 02:49:13.210276 containerd[1879]: 2025-05-27 02:49:13.150 [INFO][5214] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.26.134/26] IPv6=[] ContainerID="5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" HandleID="k8s-pod-network.5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" Workload="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-eth0" May 27 02:49:13.210860 containerd[1879]: 2025-05-27 02:49:13.160 [INFO][5190] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-js6pq" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-eth0", GenerateName:"calico-apiserver-fcf964649-", Namespace:"calico-apiserver", SelfLink:"", UID:"9611f34a-6a9a-4745-8c18-b491e412177c", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fcf964649", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"", Pod:"calico-apiserver-fcf964649-js6pq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.26.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali954dfcd813d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:13.210860 containerd[1879]: 2025-05-27 02:49:13.160 [INFO][5190] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.134/32] ContainerID="5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-js6pq" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-eth0" May 27 02:49:13.210860 containerd[1879]: 2025-05-27 02:49:13.160 [INFO][5190] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali954dfcd813d ContainerID="5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-js6pq" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-eth0" May 27 02:49:13.210860 containerd[1879]: 2025-05-27 02:49:13.173 [INFO][5190] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-js6pq" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-eth0" May 27 02:49:13.210860 containerd[1879]: 2025-05-27 02:49:13.178 [INFO][5190] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-js6pq" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-eth0", GenerateName:"calico-apiserver-fcf964649-", Namespace:"calico-apiserver", SelfLink:"", UID:"9611f34a-6a9a-4745-8c18-b491e412177c", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fcf964649", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913", Pod:"calico-apiserver-fcf964649-js6pq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.26.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali954dfcd813d", MAC:"6e:f5:5b:b3:10:90", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:13.210860 containerd[1879]: 2025-05-27 02:49:13.202 [INFO][5190] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" Namespace="calico-apiserver" Pod="calico-apiserver-fcf964649-js6pq" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-calico--apiserver--fcf964649--js6pq-eth0" May 27 02:49:13.296391 containerd[1879]: time="2025-05-27T02:49:13.296320815Z" level=info msg="connecting to shim 5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913" address="unix:///run/containerd/s/360c3d8e7ace32751656d40ce53bd6fc66511110c0b6a22ab131992d88433342" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:13.300251 containerd[1879]: time="2025-05-27T02:49:13.300209455Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fcf964649-r2t2l,Uid:5594cdcd-4432-410c-abaa-5e2764692cf8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060\"" May 27 02:49:13.334666 systemd[1]: Started cri-containerd-5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913.scope - libcontainer container 5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913. 
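[editor's note] The endpoint MACs written in these records (62:bf:69:1d:f4:de, 12:9b:b6:4c:21:cc, da:6a:56:94:a8:2f, 6e:f5:5b:b3:10:90) all have the locally-administered bit set and the multicast bit clear in the first octet. The sketch below generates a random MAC with the same properties; this is a common pattern for virtual interfaces, not a claim about Calico's exact generator.

// mac_sketch.go — generate a random locally-administered unicast MAC.
package main

import (
	"crypto/rand"
	"fmt"
	"net"
)

func randomLocalMAC() (net.HardwareAddr, error) {
	mac := make(net.HardwareAddr, 6)
	if _, err := rand.Read(mac); err != nil {
		return nil, err
	}
	mac[0] = (mac[0] | 0x02) &^ 0x01 // set locally-administered bit, clear multicast bit
	return mac, nil
}

func main() {
	mac, err := randomLocalMAC()
	if err != nil {
		panic(err)
	}
	fmt.Println(mac) // e.g. 62:bf:69:1d:f4:de in the coredns record above
}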
May 27 02:49:13.385427 containerd[1879]: time="2025-05-27T02:49:13.385384972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fcf964649-js6pq,Uid:9611f34a-6a9a-4745-8c18-b491e412177c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913\"" May 27 02:49:13.816491 containerd[1879]: time="2025-05-27T02:49:13.816433490Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:13.820134 containerd[1879]: time="2025-05-27T02:49:13.820085459Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=48045219" May 27 02:49:13.826627 containerd[1879]: time="2025-05-27T02:49:13.826545341Z" level=info msg="ImageCreate event name:\"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:13.831528 containerd[1879]: time="2025-05-27T02:49:13.831443514Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:13.832185 containerd[1879]: time="2025-05-27T02:49:13.831881695Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"49414428\" in 2.341958504s" May 27 02:49:13.832185 containerd[1879]: time="2025-05-27T02:49:13.831913416Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\"" May 27 02:49:13.833496 containerd[1879]: time="2025-05-27T02:49:13.833459572Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 02:49:13.837383 containerd[1879]: time="2025-05-27T02:49:13.837188040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ds6zz,Uid:c823d13d-65d4-46eb-8fbc-bfdffdd173a7,Namespace:calico-system,Attempt:0,}" May 27 02:49:13.839534 containerd[1879]: time="2025-05-27T02:49:13.839493706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-ppbfc,Uid:1a22698a-34a8-450c-a9f9-8a53669e60b4,Namespace:calico-system,Attempt:0,}" May 27 02:49:13.847260 containerd[1879]: time="2025-05-27T02:49:13.847218385Z" level=info msg="CreateContainer within sandbox \"4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 27 02:49:13.907064 containerd[1879]: time="2025-05-27T02:49:13.906946793Z" level=info msg="Container 86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:13.911157 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4284542008.mount: Deactivated successfully. 
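[editor's note] The kube-controllers pull above reports 48045219 bytes read and a total time of 2.341958504s, which works out to roughly 20 MB/s. The sketch below does that back-of-the-envelope division with the values copied from the log; the rate is only approximate, since the elapsed time also covers unpacking and registry round-trips, not just raw transfer.

// pull_rate_sketch.go — approximate throughput of the image pull above.
package main

import (
	"fmt"
	"time"
)

func main() {
	bytesRead := 48045219.0 // "bytes read" from the containerd record
	elapsed, err := time.ParseDuration("2.341958504s")
	if err != nil {
		panic(err)
	}
	rate := bytesRead / elapsed.Seconds()
	fmt.Printf("~%.1f MB/s (%.1f MiB/s)\n", rate/1e6, rate/(1<<20))
}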
May 27 02:49:13.947569 containerd[1879]: time="2025-05-27T02:49:13.946364472Z" level=info msg="CreateContainer within sandbox \"4534adfeb7e9ddaee1a234043d0328814f5996fa3f3af8e5d191ac8797baade2\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\"" May 27 02:49:13.949821 containerd[1879]: time="2025-05-27T02:49:13.949707856Z" level=info msg="StartContainer for \"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\"" May 27 02:49:13.952883 containerd[1879]: time="2025-05-27T02:49:13.952835642Z" level=info msg="connecting to shim 86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c" address="unix:///run/containerd/s/3e99507d579dfa975eacebf455feb8603dc3e8757537e1e11a68f32e9f32d306" protocol=ttrpc version=3 May 27 02:49:13.999985 systemd[1]: Started cri-containerd-86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c.scope - libcontainer container 86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c. May 27 02:49:14.038100 systemd-networkd[1487]: cali96c02d9c38d: Link UP May 27 02:49:14.039789 systemd-networkd[1487]: cali96c02d9c38d: Gained carrier May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:13.924 [INFO][5343] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-eth0 csi-node-driver- calico-system c823d13d-65d4-46eb-8fbc-bfdffdd173a7 680 0 2025-05-27 02:48:49 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:78f6f74485 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4344.0.0-a-583de22c75 csi-node-driver-ds6zz eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali96c02d9c38d [] [] }} ContainerID="9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" Namespace="calico-system" Pod="csi-node-driver-ds6zz" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-" May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:13.925 [INFO][5343] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" Namespace="calico-system" Pod="csi-node-driver-ds6zz" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-eth0" May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:13.965 [INFO][5370] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" HandleID="k8s-pod-network.9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" Workload="ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-eth0" May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:13.966 [INFO][5370] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" HandleID="k8s-pod-network.9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" Workload="ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003307a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-583de22c75", "pod":"csi-node-driver-ds6zz", "timestamp":"2025-05-27 
02:49:13.965867874 +0000 UTC"}, Hostname:"ci-4344.0.0-a-583de22c75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:13.966 [INFO][5370] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:13.966 [INFO][5370] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:13.966 [INFO][5370] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-583de22c75' May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:13.979 [INFO][5370] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:13.985 [INFO][5370] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:13.992 [INFO][5370] ipam/ipam.go 511: Trying affinity for 192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:13.998 [INFO][5370] ipam/ipam.go 158: Attempting to load block cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:14.003 [INFO][5370] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:14.003 [INFO][5370] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.26.128/26 handle="k8s-pod-network.9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:14.007 [INFO][5370] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:14.016 [INFO][5370] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.26.128/26 handle="k8s-pod-network.9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:14.026 [INFO][5370] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.26.135/26] block=192.168.26.128/26 handle="k8s-pod-network.9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:14.026 [INFO][5370] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.26.135/26] handle="k8s-pod-network.9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:14.026 [INFO][5370] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
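The ipam/ipam.go entries above walk Calico's per-host assignment: node ci-4344.0.0-a-583de22c75 holds an affinity for block 192.168.26.128/26, and the plugin claims 192.168.26.135 from that block for the csi-node-driver pod. A small Go sketch (illustrative only, standard library) of the containment check and block capacity implied by those numbers:

    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        // Block and address taken from the IPAM log lines above.
        _, block, err := net.ParseCIDR("192.168.26.128/26")
        if err != nil {
            panic(err)
        }
        ip := net.ParseIP("192.168.26.135")

        ones, bits := block.Mask.Size()
        fmt.Printf("block %s holds %d addresses\n", block, 1<<uint(bits-ones)) // /26 -> 64
        fmt.Printf("%s in block: %v\n", ip, block.Contains(ip))                // true

        // The pod itself is announced as a /32 (a single host route) in the
        // WorkloadEndpoint, even though allocation happens out of the /26 block.
        _, podNet, _ := net.ParseCIDR("192.168.26.135/32")
        fmt.Printf("pod network: %s\n", podNet)
    }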
May 27 02:49:14.068779 containerd[1879]: 2025-05-27 02:49:14.026 [INFO][5370] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.26.135/26] IPv6=[] ContainerID="9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" HandleID="k8s-pod-network.9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" Workload="ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-eth0" May 27 02:49:14.069193 containerd[1879]: 2025-05-27 02:49:14.030 [INFO][5343] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" Namespace="calico-system" Pod="csi-node-driver-ds6zz" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c823d13d-65d4-46eb-8fbc-bfdffdd173a7", ResourceVersion:"680", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"", Pod:"csi-node-driver-ds6zz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.26.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali96c02d9c38d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:14.069193 containerd[1879]: 2025-05-27 02:49:14.031 [INFO][5343] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.135/32] ContainerID="9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" Namespace="calico-system" Pod="csi-node-driver-ds6zz" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-eth0" May 27 02:49:14.069193 containerd[1879]: 2025-05-27 02:49:14.031 [INFO][5343] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali96c02d9c38d ContainerID="9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" Namespace="calico-system" Pod="csi-node-driver-ds6zz" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-eth0" May 27 02:49:14.069193 containerd[1879]: 2025-05-27 02:49:14.042 [INFO][5343] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" Namespace="calico-system" Pod="csi-node-driver-ds6zz" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-eth0" May 27 02:49:14.069193 containerd[1879]: 2025-05-27 02:49:14.042 [INFO][5343] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" Namespace="calico-system" Pod="csi-node-driver-ds6zz" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c823d13d-65d4-46eb-8fbc-bfdffdd173a7", ResourceVersion:"680", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c", Pod:"csi-node-driver-ds6zz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.26.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali96c02d9c38d", MAC:"d2:b6:32:62:d7:ee", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:14.069193 containerd[1879]: 2025-05-27 02:49:14.060 [INFO][5343] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" Namespace="calico-system" Pod="csi-node-driver-ds6zz" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-csi--node--driver--ds6zz-eth0" May 27 02:49:14.086602 containerd[1879]: time="2025-05-27T02:49:14.086554013Z" level=info msg="StartContainer for \"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\" returns successfully" May 27 02:49:14.128506 systemd-networkd[1487]: cali406beafa62f: Link UP May 27 02:49:14.128718 systemd-networkd[1487]: cali406beafa62f: Gained carrier May 27 02:49:14.160704 systemd-networkd[1487]: cali68e57ae6a37: Gained IPv6LL May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:13.928 [INFO][5355] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-eth0 goldmane-78d55f7ddc- calico-system 1a22698a-34a8-450c-a9f9-8a53669e60b4 788 0 2025-05-27 02:48:49 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:78d55f7ddc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4344.0.0-a-583de22c75 goldmane-78d55f7ddc-ppbfc eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali406beafa62f [] [] }} ContainerID="a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-ppbfc" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-" May 27 02:49:14.162560 
containerd[1879]: 2025-05-27 02:49:13.928 [INFO][5355] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-ppbfc" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-eth0" May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:13.978 [INFO][5375] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" HandleID="k8s-pod-network.a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" Workload="ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-eth0" May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:13.978 [INFO][5375] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" HandleID="k8s-pod-network.a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" Workload="ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002a94c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-583de22c75", "pod":"goldmane-78d55f7ddc-ppbfc", "timestamp":"2025-05-27 02:49:13.978709715 +0000 UTC"}, Hostname:"ci-4344.0.0-a-583de22c75", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:13.979 [INFO][5375] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.026 [INFO][5375] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
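Both CNI ADDs above serialize on the same host-wide IPAM lock: the goldmane request ([5375]) logs "About to acquire host-wide IPAM lock" at 02:49:13.979 but only logs "Acquired" at 02:49:14.026, immediately after the csi-node-driver request ([5370]) releases the lock. A minimal Go sketch of that timestamp arithmetic (variable names are mine; the layout matches the plugin's printed timestamps):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05" // time.Parse also accepts fractional seconds in the input

        // Timestamps copied from the goldmane IPAM entries above (UTC, as logged).
        requested, _ := time.Parse(layout, "2025-05-27 02:49:13.979")
        acquired, _ := time.Parse(layout, "2025-05-27 02:49:14.026")

        // The csi-node-driver request released the same host-wide lock at 14.026,
        // so this is the time the goldmane request spent queued behind it.
        fmt.Printf("waited %v for the host-wide IPAM lock\n", acquired.Sub(requested)) // 47ms
    }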
May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.027 [INFO][5375] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-583de22c75' May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.080 [INFO][5375] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.085 [INFO][5375] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.095 [INFO][5375] ipam/ipam.go 511: Trying affinity for 192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.097 [INFO][5375] ipam/ipam.go 158: Attempting to load block cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.101 [INFO][5375] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.26.128/26 host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.101 [INFO][5375] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.26.128/26 handle="k8s-pod-network.a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.103 [INFO][5375] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1 May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.107 [INFO][5375] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.26.128/26 handle="k8s-pod-network.a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.118 [INFO][5375] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.26.136/26] block=192.168.26.128/26 handle="k8s-pod-network.a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.118 [INFO][5375] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.26.136/26] handle="k8s-pod-network.a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" host="ci-4344.0.0-a-583de22c75" May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.118 [INFO][5375] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 02:49:14.162560 containerd[1879]: 2025-05-27 02:49:14.119 [INFO][5375] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.26.136/26] IPv6=[] ContainerID="a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" HandleID="k8s-pod-network.a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" Workload="ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-eth0" May 27 02:49:14.164671 containerd[1879]: 2025-05-27 02:49:14.121 [INFO][5355] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-ppbfc" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"1a22698a-34a8-450c-a9f9-8a53669e60b4", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"", Pod:"goldmane-78d55f7ddc-ppbfc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.26.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali406beafa62f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:14.164671 containerd[1879]: 2025-05-27 02:49:14.123 [INFO][5355] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.136/32] ContainerID="a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-ppbfc" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-eth0" May 27 02:49:14.164671 containerd[1879]: 2025-05-27 02:49:14.123 [INFO][5355] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali406beafa62f ContainerID="a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-ppbfc" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-eth0" May 27 02:49:14.164671 containerd[1879]: 2025-05-27 02:49:14.129 [INFO][5355] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-ppbfc" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-eth0" May 27 02:49:14.164671 containerd[1879]: 2025-05-27 02:49:14.130 [INFO][5355] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" 
Namespace="calico-system" Pod="goldmane-78d55f7ddc-ppbfc" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"1a22698a-34a8-450c-a9f9-8a53669e60b4", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 2, 48, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-583de22c75", ContainerID:"a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1", Pod:"goldmane-78d55f7ddc-ppbfc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.26.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali406beafa62f", MAC:"0e:22:e7:34:e1:ca", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 02:49:14.164671 containerd[1879]: 2025-05-27 02:49:14.146 [INFO][5355] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" Namespace="calico-system" Pod="goldmane-78d55f7ddc-ppbfc" WorkloadEndpoint="ci--4344.0.0--a--583de22c75-k8s-goldmane--78d55f7ddc--ppbfc-eth0" May 27 02:49:14.165719 containerd[1879]: time="2025-05-27T02:49:14.165677956Z" level=info msg="connecting to shim 9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c" address="unix:///run/containerd/s/2caf364b986d582624791f3f7e2924d45b1e4ae49a240264e9d3c6f0e3f35ef3" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:14.188887 systemd[1]: Started cri-containerd-9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c.scope - libcontainer container 9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c. May 27 02:49:14.223859 systemd-networkd[1487]: vxlan.calico: Gained IPv6LL May 27 02:49:14.251505 containerd[1879]: time="2025-05-27T02:49:14.251101800Z" level=info msg="connecting to shim a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1" address="unix:///run/containerd/s/458e3d912a3d53cd23a127004a047f59629ba64d44188f9a73832e8ea6634d87" namespace=k8s.io protocol=ttrpc version=3 May 27 02:49:14.260494 containerd[1879]: time="2025-05-27T02:49:14.259888557Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ds6zz,Uid:c823d13d-65d4-46eb-8fbc-bfdffdd173a7,Namespace:calico-system,Attempt:0,} returns sandbox id \"9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c\"" May 27 02:49:14.281694 systemd[1]: Started cri-containerd-a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1.scope - libcontainer container a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1. 
May 27 02:49:14.316695 containerd[1879]: time="2025-05-27T02:49:14.316641696Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-ppbfc,Uid:1a22698a-34a8-450c-a9f9-8a53669e60b4,Namespace:calico-system,Attempt:0,} returns sandbox id \"a7d871ee6c48c86e83ab5eb9576cad3136c0796a76ded8126c4d5526777874f1\"" May 27 02:49:14.351816 systemd-networkd[1487]: cali954dfcd813d: Gained IPv6LL May 27 02:49:15.035102 kubelet[3293]: I0527 02:49:15.034696 3293 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5879b6cfc8-5r62c" podStartSLOduration=23.690352891 podStartE2EDuration="26.034674919s" podCreationTimestamp="2025-05-27 02:48:49 +0000 UTC" firstStartedPulling="2025-05-27 02:49:11.488415524 +0000 UTC m=+40.777732518" lastFinishedPulling="2025-05-27 02:49:13.832737552 +0000 UTC m=+43.122054546" observedRunningTime="2025-05-27 02:49:15.034328381 +0000 UTC m=+44.323645383" watchObservedRunningTime="2025-05-27 02:49:15.034674919 +0000 UTC m=+44.323991913" May 27 02:49:15.054769 containerd[1879]: time="2025-05-27T02:49:15.054676279Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\" id:\"3ef1f4b1b05b832cc5ec10ba5a34f229d1d4628750eb19f18278cd4e4e0609bd\" pid:5552 exited_at:{seconds:1748314155 nanos:54090798}" May 27 02:49:15.505096 systemd-networkd[1487]: cali96c02d9c38d: Gained IPv6LL May 27 02:49:16.079705 systemd-networkd[1487]: cali406beafa62f: Gained IPv6LL May 27 02:49:17.135166 containerd[1879]: time="2025-05-27T02:49:17.134636951Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:17.139505 containerd[1879]: time="2025-05-27T02:49:17.139454043Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=44453213" May 27 02:49:17.145163 containerd[1879]: time="2025-05-27T02:49:17.145104040Z" level=info msg="ImageCreate event name:\"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:17.150715 containerd[1879]: time="2025-05-27T02:49:17.150640953Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:17.151192 containerd[1879]: time="2025-05-27T02:49:17.151156832Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"45822470\" in 3.317657778s" May 27 02:49:17.151347 containerd[1879]: time="2025-05-27T02:49:17.151279555Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\"" May 27 02:49:17.153914 containerd[1879]: time="2025-05-27T02:49:17.153864710Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 02:49:17.156450 containerd[1879]: time="2025-05-27T02:49:17.156381575Z" level=info msg="CreateContainer within sandbox \"00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060\" 
for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 02:49:17.199426 containerd[1879]: time="2025-05-27T02:49:17.198850370Z" level=info msg="Container 2e6f232b76b81119b07790e14d3a2b991f97211e2fca4418b9e47b78320f9c64: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:17.216220 containerd[1879]: time="2025-05-27T02:49:17.216175642Z" level=info msg="CreateContainer within sandbox \"00bf9bbb0e1f61d603855fe8d98a3b0e1b9d5f7a643addd41fa2217b2f428060\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2e6f232b76b81119b07790e14d3a2b991f97211e2fca4418b9e47b78320f9c64\"" May 27 02:49:17.217231 containerd[1879]: time="2025-05-27T02:49:17.217192575Z" level=info msg="StartContainer for \"2e6f232b76b81119b07790e14d3a2b991f97211e2fca4418b9e47b78320f9c64\"" May 27 02:49:17.218349 containerd[1879]: time="2025-05-27T02:49:17.218315624Z" level=info msg="connecting to shim 2e6f232b76b81119b07790e14d3a2b991f97211e2fca4418b9e47b78320f9c64" address="unix:///run/containerd/s/c31680cf8a0bbb49981392db9996ae8fbdba5e859b81e3be105235de52a70655" protocol=ttrpc version=3 May 27 02:49:17.238768 systemd[1]: Started cri-containerd-2e6f232b76b81119b07790e14d3a2b991f97211e2fca4418b9e47b78320f9c64.scope - libcontainer container 2e6f232b76b81119b07790e14d3a2b991f97211e2fca4418b9e47b78320f9c64. May 27 02:49:17.286906 containerd[1879]: time="2025-05-27T02:49:17.286864241Z" level=info msg="StartContainer for \"2e6f232b76b81119b07790e14d3a2b991f97211e2fca4418b9e47b78320f9c64\" returns successfully" May 27 02:49:17.533494 containerd[1879]: time="2025-05-27T02:49:17.531833186Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:17.534928 containerd[1879]: time="2025-05-27T02:49:17.534896179Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77" May 27 02:49:17.536194 containerd[1879]: time="2025-05-27T02:49:17.536162047Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"45822470\" in 382.255016ms" May 27 02:49:17.536321 containerd[1879]: time="2025-05-27T02:49:17.536305660Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\"" May 27 02:49:17.537406 containerd[1879]: time="2025-05-27T02:49:17.537374651Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\"" May 27 02:49:17.538991 containerd[1879]: time="2025-05-27T02:49:17.538953393Z" level=info msg="CreateContainer within sandbox \"5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 02:49:17.577564 containerd[1879]: time="2025-05-27T02:49:17.577180168Z" level=info msg="Container f801d00f7d8ca8ed45b0da1e0593edb0edb662aa0ca57ca10fc47419291291a9: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:17.597151 containerd[1879]: time="2025-05-27T02:49:17.597102499Z" level=info msg="CreateContainer within sandbox \"5d13303858c33a94383ced38a6d29c78ab1d261b1710cc6939e8dfb1ac743913\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container 
id \"f801d00f7d8ca8ed45b0da1e0593edb0edb662aa0ca57ca10fc47419291291a9\"" May 27 02:49:17.597883 containerd[1879]: time="2025-05-27T02:49:17.597855833Z" level=info msg="StartContainer for \"f801d00f7d8ca8ed45b0da1e0593edb0edb662aa0ca57ca10fc47419291291a9\"" May 27 02:49:17.599109 containerd[1879]: time="2025-05-27T02:49:17.599048043Z" level=info msg="connecting to shim f801d00f7d8ca8ed45b0da1e0593edb0edb662aa0ca57ca10fc47419291291a9" address="unix:///run/containerd/s/360c3d8e7ace32751656d40ce53bd6fc66511110c0b6a22ab131992d88433342" protocol=ttrpc version=3 May 27 02:49:17.619702 systemd[1]: Started cri-containerd-f801d00f7d8ca8ed45b0da1e0593edb0edb662aa0ca57ca10fc47419291291a9.scope - libcontainer container f801d00f7d8ca8ed45b0da1e0593edb0edb662aa0ca57ca10fc47419291291a9. May 27 02:49:17.677790 containerd[1879]: time="2025-05-27T02:49:17.677746619Z" level=info msg="StartContainer for \"f801d00f7d8ca8ed45b0da1e0593edb0edb662aa0ca57ca10fc47419291291a9\" returns successfully" May 27 02:49:18.048673 kubelet[3293]: I0527 02:49:18.048560 3293 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-fcf964649-js6pq" podStartSLOduration=27.9000579 podStartE2EDuration="32.048534382s" podCreationTimestamp="2025-05-27 02:48:46 +0000 UTC" firstStartedPulling="2025-05-27 02:49:13.388566855 +0000 UTC m=+42.677883849" lastFinishedPulling="2025-05-27 02:49:17.537043313 +0000 UTC m=+46.826360331" observedRunningTime="2025-05-27 02:49:18.046649775 +0000 UTC m=+47.335966769" watchObservedRunningTime="2025-05-27 02:49:18.048534382 +0000 UTC m=+47.337851384" May 27 02:49:18.062622 kubelet[3293]: I0527 02:49:18.062540 3293 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-fcf964649-r2t2l" podStartSLOduration=28.213021997 podStartE2EDuration="32.06251814s" podCreationTimestamp="2025-05-27 02:48:46 +0000 UTC" firstStartedPulling="2025-05-27 02:49:13.302755784 +0000 UTC m=+42.592072778" lastFinishedPulling="2025-05-27 02:49:17.152251927 +0000 UTC m=+46.441568921" observedRunningTime="2025-05-27 02:49:18.062253148 +0000 UTC m=+47.351570158" watchObservedRunningTime="2025-05-27 02:49:18.06251814 +0000 UTC m=+47.351835134" May 27 02:49:18.832576 containerd[1879]: time="2025-05-27T02:49:18.831969211Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:18.839220 containerd[1879]: time="2025-05-27T02:49:18.839178853Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8226240" May 27 02:49:18.842468 containerd[1879]: time="2025-05-27T02:49:18.842125435Z" level=info msg="ImageCreate event name:\"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:18.852613 containerd[1879]: time="2025-05-27T02:49:18.851260972Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:18.852981 containerd[1879]: time="2025-05-27T02:49:18.852593027Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest 
\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"9595481\" in 1.31518656s" May 27 02:49:18.853078 containerd[1879]: time="2025-05-27T02:49:18.853066481Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\"" May 27 02:49:18.856796 containerd[1879]: time="2025-05-27T02:49:18.856596503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 02:49:18.857625 containerd[1879]: time="2025-05-27T02:49:18.857591076Z" level=info msg="CreateContainer within sandbox \"9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 27 02:49:18.922057 containerd[1879]: time="2025-05-27T02:49:18.921985940Z" level=info msg="Container d283a12173a6e033eb2d9c27436cc19406df8361c4b8b61c10d1b1532c068685: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:18.950229 containerd[1879]: time="2025-05-27T02:49:18.950087621Z" level=info msg="CreateContainer within sandbox \"9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"d283a12173a6e033eb2d9c27436cc19406df8361c4b8b61c10d1b1532c068685\"" May 27 02:49:18.951110 containerd[1879]: time="2025-05-27T02:49:18.951074442Z" level=info msg="StartContainer for \"d283a12173a6e033eb2d9c27436cc19406df8361c4b8b61c10d1b1532c068685\"" May 27 02:49:18.952769 containerd[1879]: time="2025-05-27T02:49:18.952740338Z" level=info msg="connecting to shim d283a12173a6e033eb2d9c27436cc19406df8361c4b8b61c10d1b1532c068685" address="unix:///run/containerd/s/2caf364b986d582624791f3f7e2924d45b1e4ae49a240264e9d3c6f0e3f35ef3" protocol=ttrpc version=3 May 27 02:49:18.975704 systemd[1]: Started cri-containerd-d283a12173a6e033eb2d9c27436cc19406df8361c4b8b61c10d1b1532c068685.scope - libcontainer container d283a12173a6e033eb2d9c27436cc19406df8361c4b8b61c10d1b1532c068685. 
May 27 02:49:19.013096 containerd[1879]: time="2025-05-27T02:49:19.013057780Z" level=info msg="StartContainer for \"d283a12173a6e033eb2d9c27436cc19406df8361c4b8b61c10d1b1532c068685\" returns successfully" May 27 02:49:19.042803 kubelet[3293]: I0527 02:49:19.042753 3293 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 02:49:19.043122 kubelet[3293]: I0527 02:49:19.042973 3293 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 02:49:19.054113 containerd[1879]: time="2025-05-27T02:49:19.054033611Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:19.059171 containerd[1879]: time="2025-05-27T02:49:19.059077069Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:19.059574 containerd[1879]: time="2025-05-27T02:49:19.059141079Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 02:49:19.059634 kubelet[3293]: E0527 02:49:19.059562 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:49:19.059634 kubelet[3293]: E0527 02:49:19.059627 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:49:19.061704 containerd[1879]: time="2025-05-27T02:49:19.061523780Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\"" May 27 02:49:19.062881 kubelet[3293]: E0527 02:49:19.062604 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nnvzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-ppbfc_calico-system(1a22698a-34a8-450c-a9f9-8a53669e60b4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:19.064104 kubelet[3293]: E0527 02:49:19.064052 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:49:20.044995 kubelet[3293]: E0527 02:49:20.044947 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:49:20.382613 containerd[1879]: time="2025-05-27T02:49:20.382217740Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:20.385051 containerd[1879]: time="2025-05-27T02:49:20.384993773Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=13749925" May 27 02:49:20.395050 containerd[1879]: time="2025-05-27T02:49:20.394979759Z" level=info msg="ImageCreate event name:\"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:20.405044 containerd[1879]: time="2025-05-27T02:49:20.404989994Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 02:49:20.405655 containerd[1879]: time="2025-05-27T02:49:20.405625117Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"15119118\" in 1.343735949s" May 27 02:49:20.405770 containerd[1879]: time="2025-05-27T02:49:20.405758352Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\"" May 27 02:49:20.409593 containerd[1879]: time="2025-05-27T02:49:20.409553999Z" level=info msg="CreateContainer within sandbox \"9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 27 02:49:20.438524 containerd[1879]: time="2025-05-27T02:49:20.437844341Z" level=info msg="Container 20d3741c474e73d486aafc4ad9c69615cd38f91c3393498c058b4dd9ef4b1239: CDI devices from CRI Config.CDIDevices: []" May 27 02:49:20.440356 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1796602482.mount: Deactivated successfully. 
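The ErrImagePull entries above (and the identical whisker and whisker-backend failures further down) all stop at the same first step: containerd asks ghcr.io for an anonymous bearer token scoped to the repository's pull action and receives 403 Forbidden, so image resolution never begins. A minimal Go sketch, run outside kubelet/containerd, that issues the same token request printed in the log so the registry's response can be checked directly from the node (a healthy anonymous pull would return 200 with a JSON body containing a "token" field):

    package main

    import (
        "fmt"
        "io"
        "net/http"
    )

    func main() {
        // Token endpoint and scope copied verbatim from the failing goldmane pull above.
        url := "https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io"

        resp, err := http.Get(url)
        if err != nil {
            fmt.Println("request failed:", err)
            return
        }
        defer resp.Body.Close()

        // The log shows 403 Forbidden here; print status and a bounded slice of the body.
        body, _ := io.ReadAll(io.LimitReader(resp.Body, 512))
        fmt.Println("status:", resp.Status)
        fmt.Println("body:  ", string(body))
    }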
May 27 02:49:20.457445 containerd[1879]: time="2025-05-27T02:49:20.457350364Z" level=info msg="CreateContainer within sandbox \"9f14a988caf15869338789605c5e1d229be48fe43c7272a42538126bfa41853c\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"20d3741c474e73d486aafc4ad9c69615cd38f91c3393498c058b4dd9ef4b1239\"" May 27 02:49:20.459432 containerd[1879]: time="2025-05-27T02:49:20.457893884Z" level=info msg="StartContainer for \"20d3741c474e73d486aafc4ad9c69615cd38f91c3393498c058b4dd9ef4b1239\"" May 27 02:49:20.459753 containerd[1879]: time="2025-05-27T02:49:20.459663215Z" level=info msg="connecting to shim 20d3741c474e73d486aafc4ad9c69615cd38f91c3393498c058b4dd9ef4b1239" address="unix:///run/containerd/s/2caf364b986d582624791f3f7e2924d45b1e4ae49a240264e9d3c6f0e3f35ef3" protocol=ttrpc version=3 May 27 02:49:20.497702 systemd[1]: Started cri-containerd-20d3741c474e73d486aafc4ad9c69615cd38f91c3393498c058b4dd9ef4b1239.scope - libcontainer container 20d3741c474e73d486aafc4ad9c69615cd38f91c3393498c058b4dd9ef4b1239. May 27 02:49:20.533294 containerd[1879]: time="2025-05-27T02:49:20.533235034Z" level=info msg="StartContainer for \"20d3741c474e73d486aafc4ad9c69615cd38f91c3393498c058b4dd9ef4b1239\" returns successfully" May 27 02:49:20.838485 containerd[1879]: time="2025-05-27T02:49:20.837750741Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 02:49:20.935517 kubelet[3293]: I0527 02:49:20.935463 3293 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 27 02:49:20.936312 kubelet[3293]: I0527 02:49:20.936028 3293 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 27 02:49:21.006438 containerd[1879]: time="2025-05-27T02:49:21.006244600Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:21.011440 containerd[1879]: time="2025-05-27T02:49:21.011378717Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:21.011788 containerd[1879]: time="2025-05-27T02:49:21.011625764Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 02:49:21.012126 kubelet[3293]: E0527 02:49:21.012037 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:49:21.012126 kubelet[3293]: E0527 02:49:21.012095 3293 
kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:49:21.012397 kubelet[3293]: E0527 02:49:21.012368 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:781684bcf162405c83f157eefd25f0ec,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dnxrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fc8cbdb96-5hpj2_calico-system(ed0f6931-9afa-4d7f-96af-71936ce08ef4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:21.015631 containerd[1879]: time="2025-05-27T02:49:21.015594200Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 02:49:21.193512 containerd[1879]: time="2025-05-27T02:49:21.193290869Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:21.197286 containerd[1879]: time="2025-05-27T02:49:21.197189471Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:21.197286 containerd[1879]: time="2025-05-27T02:49:21.197248040Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 02:49:21.197680 kubelet[3293]: E0527 02:49:21.197617 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:49:21.197680 kubelet[3293]: E0527 02:49:21.197677 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:49:21.197825 kubelet[3293]: E0527 02:49:21.197776 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dnxrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fc8cbdb96-5hpj2_calico-system(ed0f6931-9afa-4d7f-96af-71936ce08ef4): ErrImagePull: failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:21.199058 kubelet[3293]: E0527 02:49:21.198997 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:49:23.247321 kubelet[3293]: I0527 02:49:23.247111 3293 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 02:49:23.270219 kubelet[3293]: I0527 02:49:23.269837 3293 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-ds6zz" podStartSLOduration=28.126392155 podStartE2EDuration="34.269818328s" podCreationTimestamp="2025-05-27 02:48:49 +0000 UTC" firstStartedPulling="2025-05-27 02:49:14.263023208 +0000 UTC m=+43.552340210" lastFinishedPulling="2025-05-27 02:49:20.406449389 +0000 UTC m=+49.695766383" observedRunningTime="2025-05-27 02:49:21.07469699 +0000 UTC m=+50.364013984" watchObservedRunningTime="2025-05-27 02:49:23.269818328 +0000 UTC m=+52.559135322" May 27 02:49:28.312728 kubelet[3293]: I0527 02:49:28.312601 3293 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 02:49:31.838777 containerd[1879]: time="2025-05-27T02:49:31.838718356Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 02:49:32.039199 containerd[1879]: time="2025-05-27T02:49:32.039140012Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:32.042145 containerd[1879]: time="2025-05-27T02:49:32.042049518Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:32.042145 containerd[1879]: time="2025-05-27T02:49:32.042103703Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes 
read=86" May 27 02:49:32.042455 kubelet[3293]: E0527 02:49:32.042282 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:49:32.042455 kubelet[3293]: E0527 02:49:32.042336 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:49:32.042834 kubelet[3293]: E0527 02:49:32.042657 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nnvzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-ppbfc_calico-system(1a22698a-34a8-450c-a9f9-8a53669e60b4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:32.043986 kubelet[3293]: E0527 02:49:32.043851 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:49:34.840660 kubelet[3293]: E0527 02:49:34.840503 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:49:36.040851 containerd[1879]: time="2025-05-27T02:49:36.040790386Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636\" id:\"50b1949f3f203457682eb06c91b469a9b0a35b6ed67409177571c487b0b590d9\" pid:5763 exited_at:{seconds:1748314176 
nanos:40450976}" May 27 02:49:43.840829 kubelet[3293]: E0527 02:49:43.840755 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:49:45.048685 containerd[1879]: time="2025-05-27T02:49:45.048570534Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\" id:\"b435736c9dc7b3601dbe97871e79004ab3f49142da2d404963c194e2ecd9868a\" pid:5790 exited_at:{seconds:1748314185 nanos:48236933}" May 27 02:49:45.838563 containerd[1879]: time="2025-05-27T02:49:45.838485758Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 02:49:46.024976 containerd[1879]: time="2025-05-27T02:49:46.024736591Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:46.028604 containerd[1879]: time="2025-05-27T02:49:46.028424222Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:46.028604 containerd[1879]: time="2025-05-27T02:49:46.028567962Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 02:49:46.029035 kubelet[3293]: E0527 02:49:46.028948 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:49:46.029651 kubelet[3293]: E0527 02:49:46.029445 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:49:46.029651 kubelet[3293]: E0527 02:49:46.029603 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:781684bcf162405c83f157eefd25f0ec,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dnxrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fc8cbdb96-5hpj2_calico-system(ed0f6931-9afa-4d7f-96af-71936ce08ef4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:46.032029 containerd[1879]: time="2025-05-27T02:49:46.031930689Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 02:49:46.231243 containerd[1879]: time="2025-05-27T02:49:46.231182599Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:46.234560 containerd[1879]: time="2025-05-27T02:49:46.234453515Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:46.234949 containerd[1879]: time="2025-05-27T02:49:46.234515597Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 02:49:46.235120 kubelet[3293]: E0527 02:49:46.235071 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:49:46.235583 kubelet[3293]: E0527 02:49:46.235133 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:49:46.235583 kubelet[3293]: E0527 02:49:46.235245 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dnxrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fc8cbdb96-5hpj2_calico-system(ed0f6931-9afa-4d7f-96af-71936ce08ef4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:46.236400 kubelet[3293]: E0527 02:49:46.236318 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed 
to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:49:53.591124 containerd[1879]: time="2025-05-27T02:49:53.591046152Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\" id:\"5b631eba90f2c387f59bd52bc379d91b67143b7ec1b0aef2225e7d22084e05fb\" pid:5821 exited_at:{seconds:1748314193 nanos:590574683}" May 27 02:49:58.839603 containerd[1879]: time="2025-05-27T02:49:58.839560701Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 02:49:59.038774 containerd[1879]: time="2025-05-27T02:49:59.038700397Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:49:59.044532 containerd[1879]: time="2025-05-27T02:49:59.044455659Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:49:59.044807 containerd[1879]: time="2025-05-27T02:49:59.044511685Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 02:49:59.045169 kubelet[3293]: E0527 02:49:59.045104 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:49:59.045676 kubelet[3293]: E0527 02:49:59.045184 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:49:59.045676 kubelet[3293]: E0527 02:49:59.045312 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nnvzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-ppbfc_calico-system(1a22698a-34a8-450c-a9f9-8a53669e60b4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:49:59.046586 kubelet[3293]: E0527 02:49:59.046523 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:50:00.839914 kubelet[3293]: E0527 02:50:00.839638 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:50:06.057881 containerd[1879]: time="2025-05-27T02:50:06.057833656Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636\" id:\"7228c21b8c11598d8e03cc272f86aebf52431e1445b711488952cd4032da6230\" pid:5848 exited_at:{seconds:1748314206 nanos:56962236}" May 27 02:50:11.839570 kubelet[3293]: E0527 02:50:11.839514 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:50:15.071880 containerd[1879]: time="2025-05-27T02:50:15.071769722Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\" id:\"b9687515ea525c1d8dbc16fd497bc510895fe1a67799b0441da658d479fe7ca3\" pid:5876 exited_at:{seconds:1748314215 nanos:70686353}" May 27 02:50:15.838744 kubelet[3293]: E0527 02:50:15.838693 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to 
\"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:50:24.838838 kubelet[3293]: E0527 02:50:24.838786 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:50:30.840130 containerd[1879]: time="2025-05-27T02:50:30.839754453Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 02:50:31.027950 containerd[1879]: time="2025-05-27T02:50:31.027743337Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:50:31.030866 containerd[1879]: time="2025-05-27T02:50:31.030796418Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:50:31.031144 containerd[1879]: time="2025-05-27T02:50:31.030845907Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 02:50:31.031373 kubelet[3293]: E0527 02:50:31.031316 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:50:31.031734 kubelet[3293]: E0527 02:50:31.031380 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:50:31.031928 kubelet[3293]: E0527 02:50:31.031551 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:781684bcf162405c83f157eefd25f0ec,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dnxrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fc8cbdb96-5hpj2_calico-system(ed0f6931-9afa-4d7f-96af-71936ce08ef4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:50:31.035052 containerd[1879]: time="2025-05-27T02:50:31.034159773Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 02:50:31.199216 containerd[1879]: time="2025-05-27T02:50:31.199018719Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:50:31.202141 containerd[1879]: time="2025-05-27T02:50:31.202080520Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:50:31.202417 containerd[1879]: time="2025-05-27T02:50:31.202306072Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 02:50:31.202833 kubelet[3293]: E0527 02:50:31.202562 3293 
log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:50:31.202833 kubelet[3293]: E0527 02:50:31.202622 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:50:31.202833 kubelet[3293]: E0527 02:50:31.202734 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dnxrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fc8cbdb96-5hpj2_calico-system(ed0f6931-9afa-4d7f-96af-71936ce08ef4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:50:31.204235 kubelet[3293]: E0527 02:50:31.204185 
3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:50:36.033036 containerd[1879]: time="2025-05-27T02:50:36.032986502Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636\" id:\"4e121db008bd3c9114107a7ee18d17fd9674b57b8ea34d20c7061f09b5222180\" pid:5909 exited_at:{seconds:1748314236 nanos:32431491}" May 27 02:50:37.837445 kubelet[3293]: E0527 02:50:37.837392 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:50:45.048053 containerd[1879]: time="2025-05-27T02:50:45.047929390Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\" id:\"9255f1aa01dd2f61fcf42bd07b02b14ff8faefaadcc00f69cc2e107cd1821354\" pid:5935 exited_at:{seconds:1748314245 nanos:46927147}" May 27 02:50:45.841402 kubelet[3293]: E0527 02:50:45.841139 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:50:47.595342 systemd[1]: Started sshd@7-10.200.20.22:22-10.200.16.10:52558.service - OpenSSH per-connection server daemon (10.200.16.10:52558). May 27 02:50:48.088470 sshd[5962]: Accepted publickey for core from 10.200.16.10 port 52558 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:50:48.090338 sshd-session[5962]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:50:48.094928 systemd-logind[1862]: New session 10 of user core. May 27 02:50:48.100658 systemd[1]: Started session-10.scope - Session 10 of User core. May 27 02:50:48.548496 sshd[5964]: Connection closed by 10.200.16.10 port 52558 May 27 02:50:48.549115 sshd-session[5962]: pam_unix(sshd:session): session closed for user core May 27 02:50:48.553743 systemd[1]: sshd@7-10.200.20.22:22-10.200.16.10:52558.service: Deactivated successfully. May 27 02:50:48.556182 systemd[1]: session-10.scope: Deactivated successfully. May 27 02:50:48.557307 systemd-logind[1862]: Session 10 logged out. Waiting for processes to exit. May 27 02:50:48.560060 systemd-logind[1862]: Removed session 10. May 27 02:50:49.839527 containerd[1879]: time="2025-05-27T02:50:49.838631701Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 02:50:50.031319 containerd[1879]: time="2025-05-27T02:50:50.031115782Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:50:50.034281 containerd[1879]: time="2025-05-27T02:50:50.034129815Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:50:50.034281 containerd[1879]: time="2025-05-27T02:50:50.034174825Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 02:50:50.034677 kubelet[3293]: E0527 02:50:50.034633 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:50:50.035012 kubelet[3293]: E0527 02:50:50.034688 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 
Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:50:50.035012 kubelet[3293]: E0527 02:50:50.034812 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nnvzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-ppbfc_calico-system(1a22698a-34a8-450c-a9f9-8a53669e60b4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:50:50.036243 kubelet[3293]: E0527 02:50:50.036159 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to 
resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:50:53.634660 systemd[1]: Started sshd@8-10.200.20.22:22-10.200.16.10:56252.service - OpenSSH per-connection server daemon (10.200.16.10:56252). May 27 02:50:53.653487 containerd[1879]: time="2025-05-27T02:50:53.653438045Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\" id:\"76932276b35a6eac38b0febcd1772ccf8f6782cd93bebaa263fd5ddc24971310\" pid:5994 exited_at:{seconds:1748314253 nanos:652689451}" May 27 02:50:54.099894 sshd[6001]: Accepted publickey for core from 10.200.16.10 port 56252 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:50:54.101296 sshd-session[6001]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:50:54.105524 systemd-logind[1862]: New session 11 of user core. May 27 02:50:54.113867 systemd[1]: Started session-11.scope - Session 11 of User core. May 27 02:50:54.473342 sshd[6006]: Connection closed by 10.200.16.10 port 56252 May 27 02:50:54.472706 sshd-session[6001]: pam_unix(sshd:session): session closed for user core May 27 02:50:54.476677 systemd[1]: sshd@8-10.200.20.22:22-10.200.16.10:56252.service: Deactivated successfully. May 27 02:50:54.478862 systemd[1]: session-11.scope: Deactivated successfully. May 27 02:50:54.481204 systemd-logind[1862]: Session 11 logged out. Waiting for processes to exit. May 27 02:50:54.483583 systemd-logind[1862]: Removed session 11. May 27 02:50:56.839705 kubelet[3293]: E0527 02:50:56.839414 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:50:59.562893 systemd[1]: Started sshd@9-10.200.20.22:22-10.200.16.10:38362.service - OpenSSH per-connection server daemon (10.200.16.10:38362). 
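Annotation (not part of the original journal): the entries above keep repeating one failure mode — containerd cannot obtain an anonymous bearer token from ghcr.io, the /token endpoint answers 403 Forbidden, and every pull of the flatcar/calico goldmane, whisker and whisker-backend images therefore ends in ErrImagePull. The following is a minimal Go sketch that issues the same anonymous token request outside the kubelet so the 403 can be checked directly; the URL and query parameters are copied verbatim from the log, everything else (package layout, output format) is an assumption for illustration.

// Illustrative sketch only, not part of the journal above. Performs the same
// anonymous token GET that containerd logs before pulling
// ghcr.io/flatcar/calico/goldmane:v3.30.0, and prints the HTTP status.
package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Token endpoint exactly as it appears in the containerd/kubelet errors.
	url := "https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io"

	resp, err := http.Get(url)
	if err != nil {
		fmt.Println("request error:", err)
		return
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	// A working anonymous pull would return 200 with a JSON token; the journal
	// shows this endpoint returning 403 Forbidden instead.
	fmt.Println("status:", resp.Status)
	fmt.Println("body:", string(body))
}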
May 27 02:51:00.014322 sshd[6020]: Accepted publickey for core from 10.200.16.10 port 38362 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:51:00.016640 sshd-session[6020]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:51:00.024926 systemd-logind[1862]: New session 12 of user core. May 27 02:51:00.037414 systemd[1]: Started session-12.scope - Session 12 of User core. May 27 02:51:00.404577 sshd[6022]: Connection closed by 10.200.16.10 port 38362 May 27 02:51:00.405205 sshd-session[6020]: pam_unix(sshd:session): session closed for user core May 27 02:51:00.409110 systemd[1]: sshd@9-10.200.20.22:22-10.200.16.10:38362.service: Deactivated successfully. May 27 02:51:00.411437 systemd[1]: session-12.scope: Deactivated successfully. May 27 02:51:00.413571 systemd-logind[1862]: Session 12 logged out. Waiting for processes to exit. May 27 02:51:00.415697 systemd-logind[1862]: Removed session 12. May 27 02:51:01.839928 kubelet[3293]: E0527 02:51:01.839841 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:51:05.502468 systemd[1]: Started sshd@10-10.200.20.22:22-10.200.16.10:38370.service - OpenSSH per-connection server daemon (10.200.16.10:38370). May 27 02:51:05.990460 sshd[6034]: Accepted publickey for core from 10.200.16.10 port 38370 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:51:05.991906 sshd-session[6034]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:51:06.001055 systemd-logind[1862]: New session 13 of user core. May 27 02:51:06.007677 systemd[1]: Started session-13.scope - Session 13 of User core. May 27 02:51:06.048757 containerd[1879]: time="2025-05-27T02:51:06.048714193Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636\" id:\"a23073bb58374ceb9585d478b36cbf8744f08d7d08c4b4957a8a1e0e7f69a5da\" pid:6050 exited_at:{seconds:1748314266 nanos:48153102}" May 27 02:51:06.391850 sshd[6060]: Connection closed by 10.200.16.10 port 38370 May 27 02:51:06.395030 sshd-session[6034]: pam_unix(sshd:session): session closed for user core May 27 02:51:06.400047 systemd-logind[1862]: Session 13 logged out. Waiting for processes to exit. May 27 02:51:06.400321 systemd[1]: sshd@10-10.200.20.22:22-10.200.16.10:38370.service: Deactivated successfully. May 27 02:51:06.403722 systemd[1]: session-13.scope: Deactivated successfully. May 27 02:51:06.406084 systemd-logind[1862]: Removed session 13. 
May 27 02:51:08.839504 kubelet[3293]: E0527 02:51:08.839405 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:51:11.482903 systemd[1]: Started sshd@11-10.200.20.22:22-10.200.16.10:60990.service - OpenSSH per-connection server daemon (10.200.16.10:60990). May 27 02:51:11.975054 sshd[6078]: Accepted publickey for core from 10.200.16.10 port 60990 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:51:11.979679 sshd-session[6078]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:51:11.986438 systemd-logind[1862]: New session 14 of user core. May 27 02:51:11.993176 systemd[1]: Started session-14.scope - Session 14 of User core. May 27 02:51:12.376999 sshd[6080]: Connection closed by 10.200.16.10 port 60990 May 27 02:51:12.377787 sshd-session[6078]: pam_unix(sshd:session): session closed for user core May 27 02:51:12.380684 systemd[1]: sshd@11-10.200.20.22:22-10.200.16.10:60990.service: Deactivated successfully. May 27 02:51:12.382960 systemd[1]: session-14.scope: Deactivated successfully. May 27 02:51:12.383900 systemd-logind[1862]: Session 14 logged out. Waiting for processes to exit. May 27 02:51:12.386964 systemd-logind[1862]: Removed session 14. 
May 27 02:51:15.051438 containerd[1879]: time="2025-05-27T02:51:15.051392338Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\" id:\"08f78987cd5e0797a76d8dc1fc55a0d8052b0fd40441a427fc734cbb9ebef8d2\" pid:6104 exited_at:{seconds:1748314275 nanos:50538773}" May 27 02:51:16.838761 kubelet[3293]: E0527 02:51:16.838507 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:51:17.470314 systemd[1]: Started sshd@12-10.200.20.22:22-10.200.16.10:32768.service - OpenSSH per-connection server daemon (10.200.16.10:32768). May 27 02:51:17.956556 sshd[6115]: Accepted publickey for core from 10.200.16.10 port 32768 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:51:17.957896 sshd-session[6115]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:51:17.964751 systemd-logind[1862]: New session 15 of user core. May 27 02:51:17.968658 systemd[1]: Started session-15.scope - Session 15 of User core. May 27 02:51:18.355656 sshd[6117]: Connection closed by 10.200.16.10 port 32768 May 27 02:51:18.354679 sshd-session[6115]: pam_unix(sshd:session): session closed for user core May 27 02:51:18.360256 systemd[1]: sshd@12-10.200.20.22:22-10.200.16.10:32768.service: Deactivated successfully. May 27 02:51:18.362504 systemd[1]: session-15.scope: Deactivated successfully. May 27 02:51:18.363373 systemd-logind[1862]: Session 15 logged out. Waiting for processes to exit. May 27 02:51:18.364921 systemd-logind[1862]: Removed session 15. May 27 02:51:23.442973 systemd[1]: Started sshd@13-10.200.20.22:22-10.200.16.10:54242.service - OpenSSH per-connection server daemon (10.200.16.10:54242). 
May 27 02:51:23.838639 kubelet[3293]: E0527 02:51:23.838300 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:51:23.932249 sshd[6129]: Accepted publickey for core from 10.200.16.10 port 54242 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:51:23.934159 sshd-session[6129]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:51:23.938335 systemd-logind[1862]: New session 16 of user core. May 27 02:51:23.942778 systemd[1]: Started session-16.scope - Session 16 of User core. May 27 02:51:24.329193 sshd[6131]: Connection closed by 10.200.16.10 port 54242 May 27 02:51:24.328519 sshd-session[6129]: pam_unix(sshd:session): session closed for user core May 27 02:51:24.332399 systemd[1]: sshd@13-10.200.20.22:22-10.200.16.10:54242.service: Deactivated successfully. May 27 02:51:24.332446 systemd-logind[1862]: Session 16 logged out. Waiting for processes to exit. May 27 02:51:24.336203 systemd[1]: session-16.scope: Deactivated successfully. May 27 02:51:24.339454 systemd-logind[1862]: Removed session 16. May 27 02:51:29.411939 systemd[1]: Started sshd@14-10.200.20.22:22-10.200.16.10:33484.service - OpenSSH per-connection server daemon (10.200.16.10:33484). May 27 02:51:29.865512 sshd[6143]: Accepted publickey for core from 10.200.16.10 port 33484 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:51:29.866756 sshd-session[6143]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:51:29.871531 systemd-logind[1862]: New session 17 of user core. May 27 02:51:29.877746 systemd[1]: Started session-17.scope - Session 17 of User core. May 27 02:51:30.260447 sshd[6145]: Connection closed by 10.200.16.10 port 33484 May 27 02:51:30.259622 sshd-session[6143]: pam_unix(sshd:session): session closed for user core May 27 02:51:30.266047 systemd[1]: sshd@14-10.200.20.22:22-10.200.16.10:33484.service: Deactivated successfully. May 27 02:51:30.274074 systemd[1]: session-17.scope: Deactivated successfully. May 27 02:51:30.275780 systemd-logind[1862]: Session 17 logged out. Waiting for processes to exit. May 27 02:51:30.278728 systemd-logind[1862]: Removed session 17. May 27 02:51:30.348581 systemd[1]: Started sshd@15-10.200.20.22:22-10.200.16.10:33488.service - OpenSSH per-connection server daemon (10.200.16.10:33488). 
May 27 02:51:30.836562 sshd[6158]: Accepted publickey for core from 10.200.16.10 port 33488 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:51:30.838041 sshd-session[6158]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:51:30.845519 systemd-logind[1862]: New session 18 of user core. May 27 02:51:30.855695 systemd[1]: Started session-18.scope - Session 18 of User core. May 27 02:51:31.255577 sshd[6162]: Connection closed by 10.200.16.10 port 33488 May 27 02:51:31.257145 sshd-session[6158]: pam_unix(sshd:session): session closed for user core May 27 02:51:31.261084 systemd[1]: sshd@15-10.200.20.22:22-10.200.16.10:33488.service: Deactivated successfully. May 27 02:51:31.266143 systemd[1]: session-18.scope: Deactivated successfully. May 27 02:51:31.271679 systemd-logind[1862]: Session 18 logged out. Waiting for processes to exit. May 27 02:51:31.273297 systemd-logind[1862]: Removed session 18. May 27 02:51:31.343321 systemd[1]: Started sshd@16-10.200.20.22:22-10.200.16.10:33498.service - OpenSSH per-connection server daemon (10.200.16.10:33498). May 27 02:51:31.826582 sshd[6172]: Accepted publickey for core from 10.200.16.10 port 33498 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:51:31.827922 sshd-session[6172]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:51:31.832093 systemd-logind[1862]: New session 19 of user core. May 27 02:51:31.838633 kubelet[3293]: E0527 02:51:31.837451 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:51:31.837992 systemd[1]: Started session-19.scope - Session 19 of User core. May 27 02:51:32.218553 sshd[6174]: Connection closed by 10.200.16.10 port 33498 May 27 02:51:32.219160 sshd-session[6172]: pam_unix(sshd:session): session closed for user core May 27 02:51:32.222774 systemd[1]: sshd@16-10.200.20.22:22-10.200.16.10:33498.service: Deactivated successfully. May 27 02:51:32.226630 systemd[1]: session-19.scope: Deactivated successfully. May 27 02:51:32.228323 systemd-logind[1862]: Session 19 logged out. Waiting for processes to exit. May 27 02:51:32.230054 systemd-logind[1862]: Removed session 19. 
May 27 02:51:36.035505 containerd[1879]: time="2025-05-27T02:51:36.035428461Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636\" id:\"86ffcb4a77c8af4e3517e67aec8c56b6f891c91d5346b67a9ec95573f5a9ce06\" pid:6197 exited_at:{seconds:1748314296 nanos:35118882}" May 27 02:51:36.840236 kubelet[3293]: E0527 02:51:36.840172 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:51:37.312027 systemd[1]: Started sshd@17-10.200.20.22:22-10.200.16.10:33508.service - OpenSSH per-connection server daemon (10.200.16.10:33508). May 27 02:51:37.796835 sshd[6209]: Accepted publickey for core from 10.200.16.10 port 33508 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:51:37.798223 sshd-session[6209]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:51:37.805927 systemd-logind[1862]: New session 20 of user core. May 27 02:51:37.810674 systemd[1]: Started session-20.scope - Session 20 of User core. May 27 02:51:38.199174 sshd[6214]: Connection closed by 10.200.16.10 port 33508 May 27 02:51:38.199847 sshd-session[6209]: pam_unix(sshd:session): session closed for user core May 27 02:51:38.204378 systemd-logind[1862]: Session 20 logged out. Waiting for processes to exit. May 27 02:51:38.204960 systemd[1]: sshd@17-10.200.20.22:22-10.200.16.10:33508.service: Deactivated successfully. May 27 02:51:38.207892 systemd[1]: session-20.scope: Deactivated successfully. May 27 02:51:38.210504 systemd-logind[1862]: Removed session 20. May 27 02:51:43.294766 systemd[1]: Started sshd@18-10.200.20.22:22-10.200.16.10:38964.service - OpenSSH per-connection server daemon (10.200.16.10:38964). May 27 02:51:43.775531 sshd[6225]: Accepted publickey for core from 10.200.16.10 port 38964 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:51:43.776630 sshd-session[6225]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:51:43.781794 systemd-logind[1862]: New session 21 of user core. May 27 02:51:43.790683 systemd[1]: Started session-21.scope - Session 21 of User core. 
May 27 02:51:43.837902 kubelet[3293]: E0527 02:51:43.837857 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:51:44.193933 sshd[6227]: Connection closed by 10.200.16.10 port 38964 May 27 02:51:44.194536 sshd-session[6225]: pam_unix(sshd:session): session closed for user core May 27 02:51:44.202744 systemd-logind[1862]: Session 21 logged out. Waiting for processes to exit. May 27 02:51:44.204816 systemd[1]: sshd@18-10.200.20.22:22-10.200.16.10:38964.service: Deactivated successfully. May 27 02:51:44.209150 systemd[1]: session-21.scope: Deactivated successfully. May 27 02:51:44.211195 systemd-logind[1862]: Removed session 21. May 27 02:51:45.047729 containerd[1879]: time="2025-05-27T02:51:45.047668488Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\" id:\"a74377b4c46ccc18c94b4d203ac5275109051c116f308e95b9e28d1fd36bf2fe\" pid:6254 exited_at:{seconds:1748314305 nanos:47181968}" May 27 02:51:47.837987 kubelet[3293]: E0527 02:51:47.837927 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:51:49.278406 systemd[1]: Started sshd@19-10.200.20.22:22-10.200.16.10:37514.service - OpenSSH per-connection server daemon (10.200.16.10:37514). May 27 02:51:49.734375 sshd[6263]: Accepted publickey for core from 10.200.16.10 port 37514 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:51:49.735272 sshd-session[6263]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:51:49.740557 systemd-logind[1862]: New session 22 of user core. May 27 02:51:49.753703 systemd[1]: Started session-22.scope - Session 22 of User core. 
May 27 02:51:50.103566 sshd[6265]: Connection closed by 10.200.16.10 port 37514 May 27 02:51:50.103337 sshd-session[6263]: pam_unix(sshd:session): session closed for user core May 27 02:51:50.108597 systemd[1]: sshd@19-10.200.20.22:22-10.200.16.10:37514.service: Deactivated successfully. May 27 02:51:50.111702 systemd[1]: session-22.scope: Deactivated successfully. May 27 02:51:50.113573 systemd-logind[1862]: Session 22 logged out. Waiting for processes to exit. May 27 02:51:50.115350 systemd-logind[1862]: Removed session 22. May 27 02:51:53.584085 containerd[1879]: time="2025-05-27T02:51:53.584013164Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\" id:\"96d4e9b9cfb063c55a63013c13714a316432d03030a5905054da78529c41c736\" pid:6294 exited_at:{seconds:1748314313 nanos:583561109}" May 27 02:51:54.838175 kubelet[3293]: E0527 02:51:54.837892 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:51:55.187365 systemd[1]: Started sshd@20-10.200.20.22:22-10.200.16.10:37516.service - OpenSSH per-connection server daemon (10.200.16.10:37516). May 27 02:51:55.645067 sshd[6306]: Accepted publickey for core from 10.200.16.10 port 37516 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:51:55.646712 sshd-session[6306]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:51:55.651201 systemd-logind[1862]: New session 23 of user core. May 27 02:51:55.659693 systemd[1]: Started session-23.scope - Session 23 of User core. May 27 02:51:56.031854 sshd[6308]: Connection closed by 10.200.16.10 port 37516 May 27 02:51:56.032609 sshd-session[6306]: pam_unix(sshd:session): session closed for user core May 27 02:51:56.036658 systemd[1]: sshd@20-10.200.20.22:22-10.200.16.10:37516.service: Deactivated successfully. May 27 02:51:56.039135 systemd[1]: session-23.scope: Deactivated successfully. May 27 02:51:56.040777 systemd-logind[1862]: Session 23 logged out. Waiting for processes to exit. May 27 02:51:56.042440 systemd-logind[1862]: Removed session 23. May 27 02:52:01.121829 systemd[1]: Started sshd@21-10.200.20.22:22-10.200.16.10:45364.service - OpenSSH per-connection server daemon (10.200.16.10:45364). May 27 02:52:01.610004 sshd[6320]: Accepted publickey for core from 10.200.16.10 port 45364 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:01.611232 sshd-session[6320]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:01.615733 systemd-logind[1862]: New session 24 of user core. May 27 02:52:01.620693 systemd[1]: Started session-24.scope - Session 24 of User core. 
May 27 02:52:01.837916 containerd[1879]: time="2025-05-27T02:52:01.837861250Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 02:52:02.008929 sshd[6322]: Connection closed by 10.200.16.10 port 45364 May 27 02:52:02.007982 sshd-session[6320]: pam_unix(sshd:session): session closed for user core May 27 02:52:02.011609 systemd[1]: sshd@21-10.200.20.22:22-10.200.16.10:45364.service: Deactivated successfully. May 27 02:52:02.013527 systemd[1]: session-24.scope: Deactivated successfully. May 27 02:52:02.015106 systemd-logind[1862]: Session 24 logged out. Waiting for processes to exit. May 27 02:52:02.016640 systemd-logind[1862]: Removed session 24. May 27 02:52:02.037075 containerd[1879]: time="2025-05-27T02:52:02.036969842Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:52:02.040504 containerd[1879]: time="2025-05-27T02:52:02.040356066Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:52:02.040718 containerd[1879]: time="2025-05-27T02:52:02.040389515Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 02:52:02.040787 kubelet[3293]: E0527 02:52:02.040644 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:52:02.040787 kubelet[3293]: E0527 02:52:02.040708 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 02:52:02.041385 kubelet[3293]: E0527 02:52:02.040818 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:781684bcf162405c83f157eefd25f0ec,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dnxrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fc8cbdb96-5hpj2_calico-system(ed0f6931-9afa-4d7f-96af-71936ce08ef4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:52:02.043559 containerd[1879]: time="2025-05-27T02:52:02.043465001Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 02:52:02.226121 containerd[1879]: time="2025-05-27T02:52:02.226067766Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:52:02.228852 containerd[1879]: time="2025-05-27T02:52:02.228789584Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:52:02.229108 containerd[1879]: time="2025-05-27T02:52:02.228800544Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 02:52:02.229268 kubelet[3293]: E0527 02:52:02.229229 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:52:02.229555 kubelet[3293]: E0527 02:52:02.229378 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 02:52:02.229555 kubelet[3293]: E0527 02:52:02.229522 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dnxrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-fc8cbdb96-5hpj2_calico-system(ed0f6931-9afa-4d7f-96af-71936ce08ef4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:52:02.230880 kubelet[3293]: E0527 02:52:02.230827 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed 
to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:52:06.034895 containerd[1879]: time="2025-05-27T02:52:06.034849522Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636\" id:\"927b97813f13afdb950c7ad39ec9e913664fcaef03037516d5c3d76ea5a40a5a\" pid:6346 exited_at:{seconds:1748314326 nanos:34372354}" May 27 02:52:07.095052 systemd[1]: Started sshd@22-10.200.20.22:22-10.200.16.10:45380.service - OpenSSH per-connection server daemon (10.200.16.10:45380). May 27 02:52:07.549672 sshd[6359]: Accepted publickey for core from 10.200.16.10 port 45380 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:07.550942 sshd-session[6359]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:07.555638 systemd-logind[1862]: New session 25 of user core. May 27 02:52:07.563873 systemd[1]: Started session-25.scope - Session 25 of User core. May 27 02:52:07.927305 sshd[6361]: Connection closed by 10.200.16.10 port 45380 May 27 02:52:07.927913 sshd-session[6359]: pam_unix(sshd:session): session closed for user core May 27 02:52:07.932214 systemd[1]: sshd@22-10.200.20.22:22-10.200.16.10:45380.service: Deactivated successfully. May 27 02:52:07.935485 systemd[1]: session-25.scope: Deactivated successfully. May 27 02:52:07.936673 systemd-logind[1862]: Session 25 logged out. Waiting for processes to exit. May 27 02:52:07.938570 systemd-logind[1862]: Removed session 25. May 27 02:52:08.013119 systemd[1]: Started sshd@23-10.200.20.22:22-10.200.16.10:45390.service - OpenSSH per-connection server daemon (10.200.16.10:45390). May 27 02:52:08.465523 sshd[6374]: Accepted publickey for core from 10.200.16.10 port 45390 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:08.466576 sshd-session[6374]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:08.471513 systemd-logind[1862]: New session 26 of user core. May 27 02:52:08.480700 systemd[1]: Started session-26.scope - Session 26 of User core. 
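The kuberuntime_manager dump above carries the full container specs for whisker and whisker-backend together with the ErrImagePull reason. The same waiting state can be read back from the API server; below is a minimal sketch assuming the official kubernetes Python client is installed and a kubeconfig with read access to the calico-system namespace is available (the pod name is taken from the log).

```python
# Sketch: read the waiting reason/message for each container in the whisker pod
# named in the journal above. Assumes `pip install kubernetes` and a working
# kubeconfig (or in-cluster credentials when run inside a pod).
from kubernetes import client, config

def print_waiting_reasons(namespace: str, pod_name: str) -> None:
    config.load_kube_config()  # or config.load_incluster_config() inside a pod
    v1 = client.CoreV1Api()
    pod = v1.read_namespaced_pod(name=pod_name, namespace=namespace)
    for status in pod.status.container_statuses or []:
        waiting = status.state.waiting
        if waiting is not None:
            # Expect ImagePullBackOff / ErrImagePull here, matching the journal.
            print(f"{status.name}: {waiting.reason}: {waiting.message}")

if __name__ == "__main__":
    print_waiting_reasons("calico-system", "whisker-fc8cbdb96-5hpj2")
```

Both containers should report ImagePullBackOff or ErrImagePull with the same 403 anonymous-token message seen in the journal entries above.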
May 27 02:52:08.840193 kubelet[3293]: E0527 02:52:08.838889 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:52:08.942204 sshd[6376]: Connection closed by 10.200.16.10 port 45390 May 27 02:52:08.942812 sshd-session[6374]: pam_unix(sshd:session): session closed for user core May 27 02:52:08.946441 systemd[1]: sshd@23-10.200.20.22:22-10.200.16.10:45390.service: Deactivated successfully. May 27 02:52:08.949448 systemd[1]: session-26.scope: Deactivated successfully. May 27 02:52:08.951044 systemd-logind[1862]: Session 26 logged out. Waiting for processes to exit. May 27 02:52:08.953329 systemd-logind[1862]: Removed session 26. May 27 02:52:09.050370 systemd[1]: Started sshd@24-10.200.20.22:22-10.200.16.10:49130.service - OpenSSH per-connection server daemon (10.200.16.10:49130). May 27 02:52:09.538023 sshd[6385]: Accepted publickey for core from 10.200.16.10 port 49130 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:09.539968 sshd-session[6385]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:09.545904 systemd-logind[1862]: New session 27 of user core. May 27 02:52:09.553687 systemd[1]: Started session-27.scope - Session 27 of User core. May 27 02:52:10.650315 sshd[6389]: Connection closed by 10.200.16.10 port 49130 May 27 02:52:10.651069 sshd-session[6385]: pam_unix(sshd:session): session closed for user core May 27 02:52:10.656419 systemd[1]: sshd@24-10.200.20.22:22-10.200.16.10:49130.service: Deactivated successfully. May 27 02:52:10.658348 systemd[1]: session-27.scope: Deactivated successfully. May 27 02:52:10.659299 systemd-logind[1862]: Session 27 logged out. Waiting for processes to exit. May 27 02:52:10.661096 systemd-logind[1862]: Removed session 27. May 27 02:52:10.738933 systemd[1]: Started sshd@25-10.200.20.22:22-10.200.16.10:49146.service - OpenSSH per-connection server daemon (10.200.16.10:49146). May 27 02:52:11.198246 sshd[6407]: Accepted publickey for core from 10.200.16.10 port 49146 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:11.201404 sshd-session[6407]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:11.206552 systemd-logind[1862]: New session 28 of user core. May 27 02:52:11.213718 systemd[1]: Started session-28.scope - Session 28 of User core. May 27 02:52:11.682974 sshd[6409]: Connection closed by 10.200.16.10 port 49146 May 27 02:52:11.683615 sshd-session[6407]: pam_unix(sshd:session): session closed for user core May 27 02:52:11.688219 systemd[1]: sshd@25-10.200.20.22:22-10.200.16.10:49146.service: Deactivated successfully. May 27 02:52:11.690464 systemd[1]: session-28.scope: Deactivated successfully. May 27 02:52:11.693597 systemd-logind[1862]: Session 28 logged out. Waiting for processes to exit. May 27 02:52:11.695622 systemd-logind[1862]: Removed session 28. 
May 27 02:52:11.780830 systemd[1]: Started sshd@26-10.200.20.22:22-10.200.16.10:49154.service - OpenSSH per-connection server daemon (10.200.16.10:49154). May 27 02:52:12.234977 sshd[6419]: Accepted publickey for core from 10.200.16.10 port 49154 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:12.237267 sshd-session[6419]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:12.243834 systemd-logind[1862]: New session 29 of user core. May 27 02:52:12.249859 systemd[1]: Started session-29.scope - Session 29 of User core. May 27 02:52:12.623739 sshd[6421]: Connection closed by 10.200.16.10 port 49154 May 27 02:52:12.624035 sshd-session[6419]: pam_unix(sshd:session): session closed for user core May 27 02:52:12.628880 systemd[1]: sshd@26-10.200.20.22:22-10.200.16.10:49154.service: Deactivated successfully. May 27 02:52:12.632246 systemd[1]: session-29.scope: Deactivated successfully. May 27 02:52:12.633574 systemd-logind[1862]: Session 29 logged out. Waiting for processes to exit. May 27 02:52:12.635848 systemd-logind[1862]: Removed session 29. May 27 02:52:15.049063 containerd[1879]: time="2025-05-27T02:52:15.048599502Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\" id:\"0171d95c57c2ac9cfda929fe24b8711464c0a64d36fdaf96d5d2418712bbd537\" pid:6443 exited_at:{seconds:1748314335 nanos:45772353}" May 27 02:52:17.706553 systemd[1]: Started sshd@27-10.200.20.22:22-10.200.16.10:49162.service - OpenSSH per-connection server daemon (10.200.16.10:49162). May 27 02:52:17.839150 kubelet[3293]: E0527 02:52:17.839104 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:52:18.165661 sshd[6453]: Accepted publickey for core from 10.200.16.10 port 49162 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:18.166994 sshd-session[6453]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:18.171643 systemd-logind[1862]: New session 30 of user core. May 27 02:52:18.185978 systemd[1]: Started session-30.scope - Session 30 of User core. 
May 27 02:52:18.548255 sshd[6455]: Connection closed by 10.200.16.10 port 49162 May 27 02:52:18.548988 sshd-session[6453]: pam_unix(sshd:session): session closed for user core May 27 02:52:18.553307 systemd[1]: sshd@27-10.200.20.22:22-10.200.16.10:49162.service: Deactivated successfully. May 27 02:52:18.555700 systemd[1]: session-30.scope: Deactivated successfully. May 27 02:52:18.557801 systemd-logind[1862]: Session 30 logged out. Waiting for processes to exit. May 27 02:52:18.559130 systemd-logind[1862]: Removed session 30. May 27 02:52:22.839096 containerd[1879]: time="2025-05-27T02:52:22.838981998Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 02:52:23.012784 containerd[1879]: time="2025-05-27T02:52:23.012732751Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 02:52:23.016506 containerd[1879]: time="2025-05-27T02:52:23.016326488Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 02:52:23.016506 containerd[1879]: time="2025-05-27T02:52:23.016393330Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 02:52:23.017034 kubelet[3293]: E0527 02:52:23.016971 3293 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:52:23.018686 kubelet[3293]: E0527 02:52:23.017269 3293 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 02:52:23.018686 kubelet[3293]: E0527 02:52:23.017409 3293 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nnvzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-ppbfc_calico-system(1a22698a-34a8-450c-a9f9-8a53669e60b4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 02:52:23.019004 kubelet[3293]: E0527 02:52:23.018760 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:52:23.638755 systemd[1]: Started sshd@28-10.200.20.22:22-10.200.16.10:33394.service - OpenSSH per-connection server daemon (10.200.16.10:33394). May 27 02:52:24.128699 sshd[6480]: Accepted publickey for core from 10.200.16.10 port 33394 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:24.131522 sshd-session[6480]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:24.138561 systemd-logind[1862]: New session 31 of user core. May 27 02:52:24.144977 systemd[1]: Started session-31.scope - Session 31 of User core. May 27 02:52:24.533973 sshd[6482]: Connection closed by 10.200.16.10 port 33394 May 27 02:52:24.536030 sshd-session[6480]: pam_unix(sshd:session): session closed for user core May 27 02:52:24.541043 systemd[1]: sshd@28-10.200.20.22:22-10.200.16.10:33394.service: Deactivated successfully. May 27 02:52:24.545185 systemd[1]: session-31.scope: Deactivated successfully. May 27 02:52:24.549051 systemd-logind[1862]: Session 31 logged out. Waiting for processes to exit. May 27 02:52:24.551278 systemd-logind[1862]: Removed session 31. May 27 02:52:28.845892 kubelet[3293]: E0527 02:52:28.845289 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:52:29.626308 systemd[1]: Started sshd@29-10.200.20.22:22-10.200.16.10:34410.service - OpenSSH per-connection server daemon (10.200.16.10:34410). May 27 02:52:30.110540 sshd[6502]: Accepted publickey for core from 10.200.16.10 port 34410 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:30.111912 sshd-session[6502]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:30.116793 systemd-logind[1862]: New session 32 of user core. May 27 02:52:30.121831 systemd[1]: Started session-32.scope - Session 32 of User core. May 27 02:52:30.500401 sshd[6504]: Connection closed by 10.200.16.10 port 34410 May 27 02:52:30.501022 sshd-session[6502]: pam_unix(sshd:session): session closed for user core May 27 02:52:30.505014 systemd[1]: sshd@29-10.200.20.22:22-10.200.16.10:34410.service: Deactivated successfully. 
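The goldmane and whisker back-off messages repeat throughout this journal at roughly the kubelet's back-off interval. To get a quick count of how often each image is failing, a small sketch that scans an exported copy of the journal (the file name node.log is a placeholder for wherever the log has been saved) and tallies the "Back-off pulling image" fragments visible in the entries above:

```python
# Sketch: tally ImagePullBackOff occurrences per image in an exported journal.
# "node.log" is a placeholder path. The image names in these entries are wrapped
# in escaped quotes, so the pattern tolerates runs of backslashes/quotes before
# the name and stops at the next quote or backslash.
import re
from collections import Counter

BACKOFF_RE = re.compile(r'Back-off pulling image [\\"]+([^"\\]+)')

def summarize_backoffs(path: str) -> Counter:
    counts = Counter()
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            counts.update(BACKOFF_RE.findall(line))
    return counts

if __name__ == "__main__":
    for image, count in summarize_backoffs("node.log").most_common():
        print(f"{count:5d}  {image}")
```

In this journal the counts should be dominated by ghcr.io/flatcar/calico/goldmane:v3.30.0, ghcr.io/flatcar/calico/whisker:v3.30.0 and ghcr.io/flatcar/calico/whisker-backend:v3.30.0, all failing on the same anonymous-token 403.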
May 27 02:52:30.508702 systemd[1]: session-32.scope: Deactivated successfully. May 27 02:52:30.510564 systemd-logind[1862]: Session 32 logged out. Waiting for processes to exit. May 27 02:52:30.514576 systemd-logind[1862]: Removed session 32. May 27 02:52:35.592446 systemd[1]: Started sshd@30-10.200.20.22:22-10.200.16.10:34420.service - OpenSSH per-connection server daemon (10.200.16.10:34420). May 27 02:52:36.038921 containerd[1879]: time="2025-05-27T02:52:36.038878748Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee10df8cc0c69ed6dd960776c05f80c82a916559e7803dc4ee9638120d6dc636\" id:\"9781360578da46c67256b43d0c4b1f6a7d1bb3d6c0f8b6c2fbbabc020837e5c2\" pid:6533 exited_at:{seconds:1748314356 nanos:37918701}" May 27 02:52:36.078733 sshd[6518]: Accepted publickey for core from 10.200.16.10 port 34420 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:36.080412 sshd-session[6518]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:36.084853 systemd-logind[1862]: New session 33 of user core. May 27 02:52:36.092680 systemd[1]: Started session-33.scope - Session 33 of User core. May 27 02:52:36.475392 sshd[6543]: Connection closed by 10.200.16.10 port 34420 May 27 02:52:36.474507 sshd-session[6518]: pam_unix(sshd:session): session closed for user core May 27 02:52:36.478184 systemd[1]: sshd@30-10.200.20.22:22-10.200.16.10:34420.service: Deactivated successfully. May 27 02:52:36.480813 systemd[1]: session-33.scope: Deactivated successfully. May 27 02:52:36.483381 systemd-logind[1862]: Session 33 logged out. Waiting for processes to exit. May 27 02:52:36.484994 systemd-logind[1862]: Removed session 33. May 27 02:52:36.839188 kubelet[3293]: E0527 02:52:36.839048 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:52:41.558463 systemd[1]: Started sshd@31-10.200.20.22:22-10.200.16.10:49490.service - OpenSSH per-connection server daemon (10.200.16.10:49490). May 27 02:52:42.014643 sshd[6557]: Accepted publickey for core from 10.200.16.10 port 49490 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:42.015951 sshd-session[6557]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:42.020116 systemd-logind[1862]: New session 34 of user core. May 27 02:52:42.026925 systemd[1]: Started session-34.scope - Session 34 of User core. May 27 02:52:42.404527 sshd[6560]: Connection closed by 10.200.16.10 port 49490 May 27 02:52:42.405159 sshd-session[6557]: pam_unix(sshd:session): session closed for user core May 27 02:52:42.408993 systemd[1]: sshd@31-10.200.20.22:22-10.200.16.10:49490.service: Deactivated successfully. May 27 02:52:42.411075 systemd[1]: session-34.scope: Deactivated successfully. May 27 02:52:42.412069 systemd-logind[1862]: Session 34 logged out. Waiting for processes to exit. 
May 27 02:52:42.415239 systemd-logind[1862]: Removed session 34. May 27 02:52:42.843349 kubelet[3293]: E0527 02:52:42.842185 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:52:45.047413 containerd[1879]: time="2025-05-27T02:52:45.047299264Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\" id:\"90bfa24fbc5fd8d5550120c678ec77631e3531dafce8bbfec3f5dd8f03a20046\" pid:6583 exited_at:{seconds:1748314365 nanos:47013847}" May 27 02:52:47.488153 systemd[1]: Started sshd@32-10.200.20.22:22-10.200.16.10:49504.service - OpenSSH per-connection server daemon (10.200.16.10:49504). May 27 02:52:47.947984 sshd[6593]: Accepted publickey for core from 10.200.16.10 port 49504 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:47.949470 sshd-session[6593]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:47.953864 systemd-logind[1862]: New session 35 of user core. May 27 02:52:47.958628 systemd[1]: Started session-35.scope - Session 35 of User core. May 27 02:52:48.345945 sshd[6595]: Connection closed by 10.200.16.10 port 49504 May 27 02:52:48.346824 sshd-session[6593]: pam_unix(sshd:session): session closed for user core May 27 02:52:48.350630 systemd-logind[1862]: Session 35 logged out. Waiting for processes to exit. May 27 02:52:48.352631 systemd[1]: sshd@32-10.200.20.22:22-10.200.16.10:49504.service: Deactivated successfully. May 27 02:52:48.355034 systemd[1]: session-35.scope: Deactivated successfully. May 27 02:52:48.356855 systemd-logind[1862]: Removed session 35. 
May 27 02:52:51.837014 kubelet[3293]: E0527 02:52:51.836953 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-ppbfc" podUID="1a22698a-34a8-450c-a9f9-8a53669e60b4" May 27 02:52:53.428753 systemd[1]: Started sshd@33-10.200.20.22:22-10.200.16.10:56212.service - OpenSSH per-connection server daemon (10.200.16.10:56212). May 27 02:52:53.572884 containerd[1879]: time="2025-05-27T02:52:53.572816209Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86afa1ab440d2f706f68848698688f45fb9ad83803ea53c3d01fa1d3df05a82c\" id:\"6a89c696590df899731719c64347512521917e84e1c9f4d7551e771cc7a55150\" pid:6625 exited_at:{seconds:1748314373 nanos:572568498}" May 27 02:52:53.882039 sshd[6610]: Accepted publickey for core from 10.200.16.10 port 56212 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:53.883454 sshd-session[6610]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:53.888056 systemd-logind[1862]: New session 36 of user core. May 27 02:52:53.895689 systemd[1]: Started session-36.scope - Session 36 of User core. May 27 02:52:54.272523 sshd[6634]: Connection closed by 10.200.16.10 port 56212 May 27 02:52:54.271665 sshd-session[6610]: pam_unix(sshd:session): session closed for user core May 27 02:52:54.276974 systemd[1]: sshd@33-10.200.20.22:22-10.200.16.10:56212.service: Deactivated successfully. May 27 02:52:54.282413 systemd[1]: session-36.scope: Deactivated successfully. May 27 02:52:54.284037 systemd-logind[1862]: Session 36 logged out. Waiting for processes to exit. May 27 02:52:54.285530 systemd-logind[1862]: Removed session 36. 
May 27 02:52:55.838575 kubelet[3293]: E0527 02:52:55.838502 3293 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-fc8cbdb96-5hpj2" podUID="ed0f6931-9afa-4d7f-96af-71936ce08ef4" May 27 02:52:59.367788 systemd[1]: Started sshd@34-10.200.20.22:22-10.200.16.10:56520.service - OpenSSH per-connection server daemon (10.200.16.10:56520). May 27 02:52:59.858236 sshd[6646]: Accepted publickey for core from 10.200.16.10 port 56520 ssh2: RSA SHA256:dQ5n/TEF0B0Zd9LSQs6hU1XnlCWpvGObsk/6/QN9ItE May 27 02:52:59.859653 sshd-session[6646]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 02:52:59.864405 systemd-logind[1862]: New session 37 of user core. May 27 02:52:59.869658 systemd[1]: Started session-37.scope - Session 37 of User core. May 27 02:53:00.251413 sshd[6648]: Connection closed by 10.200.16.10 port 56520 May 27 02:53:00.252072 sshd-session[6646]: pam_unix(sshd:session): session closed for user core May 27 02:53:00.255793 systemd[1]: sshd@34-10.200.20.22:22-10.200.16.10:56520.service: Deactivated successfully. May 27 02:53:00.257568 systemd[1]: session-37.scope: Deactivated successfully. May 27 02:53:00.260117 systemd-logind[1862]: Session 37 logged out. Waiting for processes to exit. May 27 02:53:00.261602 systemd-logind[1862]: Removed session 37.