Sep 12 17:21:20.211537 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd490]
Sep 12 17:21:20.211555 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Sep 12 15:37:01 -00 2025
Sep 12 17:21:20.211562 kernel: KASLR enabled
Sep 12 17:21:20.211566 kernel: earlycon: pl11 at MMIO 0x00000000effec000 (options '')
Sep 12 17:21:20.211570 kernel: printk: legacy bootconsole [pl11] enabled
Sep 12 17:21:20.211574 kernel: efi: EFI v2.7 by EDK II
Sep 12 17:21:20.211594 kernel: efi: ACPI 2.0=0x3fd5f018 SMBIOS=0x3e580000 SMBIOS 3.0=0x3e560000 MEMATTR=0x3f20d018 RNG=0x3fd5f998 MEMRESERVE=0x3e471598
Sep 12 17:21:20.211598 kernel: random: crng init done
Sep 12 17:21:20.211602 kernel: secureboot: Secure boot disabled
Sep 12 17:21:20.211605 kernel: ACPI: Early table checksum verification disabled
Sep 12 17:21:20.211609 kernel: ACPI: RSDP 0x000000003FD5F018 000024 (v02 VRTUAL)
Sep 12 17:21:20.211613 kernel: ACPI: XSDT 0x000000003FD5FF18 00006C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:21:20.211617 kernel: ACPI: FACP 0x000000003FD5FC18 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:21:20.211623 kernel: ACPI: DSDT 0x000000003FD41018 01DFCD (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Sep 12 17:21:20.211628 kernel: ACPI: DBG2 0x000000003FD5FB18 000072 (v00 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:21:20.211632 kernel: ACPI: GTDT 0x000000003FD5FD98 000060 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:21:20.211636 kernel: ACPI: OEM0 0x000000003FD5F098 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:21:20.211640 kernel: ACPI: SPCR 0x000000003FD5FA98 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:21:20.211645 kernel: ACPI: APIC 0x000000003FD5F818 0000FC (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:21:20.211649 kernel: ACPI: SRAT 0x000000003FD5F198 000234 (v03 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:21:20.211653 kernel: ACPI: PPTT 0x000000003FD5F418 000120 (v01 VRTUAL MICROSFT 00000000 MSFT 00000000)
Sep 12 17:21:20.211657 kernel: ACPI: BGRT 0x000000003FD5FE98 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Sep 12 17:21:20.211662 kernel: ACPI: SPCR: console: pl011,mmio32,0xeffec000,115200
Sep 12 17:21:20.211666 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 12 17:21:20.211670 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Sep 12 17:21:20.211674 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x1bfffffff] hotplug
Sep 12 17:21:20.211678 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1c0000000-0xfbfffffff] hotplug
Sep 12 17:21:20.211682 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Sep 12 17:21:20.211687 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Sep 12 17:21:20.211692 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Sep 12 17:21:20.211696 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Sep 12 17:21:20.211700 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Sep 12 17:21:20.211704 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Sep 12 17:21:20.211708 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Sep 12 17:21:20.211712 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Sep 12 17:21:20.211716 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Sep 12 17:21:20.211721 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x1bfffffff] -> [mem 0x00000000-0x1bfffffff]
Sep 12 17:21:20.211725 kernel: NODE_DATA(0) allocated [mem 0x1bf7fda00-0x1bf804fff]
Sep 12 17:21:20.211729 kernel: Zone ranges:
Sep 12 17:21:20.211733 kernel: DMA [mem 0x0000000000000000-0x00000000ffffffff]
Sep 12 17:21:20.211740 kernel: DMA32 empty
Sep 12 17:21:20.211744 kernel: Normal [mem 0x0000000100000000-0x00000001bfffffff]
Sep 12 17:21:20.211749 kernel: Device empty
Sep 12 17:21:20.211753 kernel: Movable zone start for each node
Sep 12 17:21:20.211757 kernel: Early memory node ranges
Sep 12 17:21:20.211762 kernel: node 0: [mem 0x0000000000000000-0x00000000007fffff]
Sep 12 17:21:20.211767 kernel: node 0: [mem 0x0000000000824000-0x000000003e45ffff]
Sep 12 17:21:20.211771 kernel: node 0: [mem 0x000000003e460000-0x000000003e46ffff]
Sep 12 17:21:20.211776 kernel: node 0: [mem 0x000000003e470000-0x000000003e54ffff]
Sep 12 17:21:20.211780 kernel: node 0: [mem 0x000000003e550000-0x000000003e87ffff]
Sep 12 17:21:20.211784 kernel: node 0: [mem 0x000000003e880000-0x000000003fc7ffff]
Sep 12 17:21:20.211789 kernel: node 0: [mem 0x000000003fc80000-0x000000003fcfffff]
Sep 12 17:21:20.211793 kernel: node 0: [mem 0x000000003fd00000-0x000000003fffffff]
Sep 12 17:21:20.211797 kernel: node 0: [mem 0x0000000100000000-0x00000001bfffffff]
Sep 12 17:21:20.211802 kernel: Initmem setup node 0 [mem 0x0000000000000000-0x00000001bfffffff]
Sep 12 17:21:20.211806 kernel: On node 0, zone DMA: 36 pages in unavailable ranges
Sep 12 17:21:20.211811 kernel: cma: Reserved 16 MiB at 0x000000003d400000 on node -1
Sep 12 17:21:20.211816 kernel: psci: probing for conduit method from ACPI.
Sep 12 17:21:20.211820 kernel: psci: PSCIv1.1 detected in firmware.
Sep 12 17:21:20.211824 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 12 17:21:20.211829 kernel: psci: MIGRATE_INFO_TYPE not supported.
Sep 12 17:21:20.211833 kernel: psci: SMC Calling Convention v1.4
Sep 12 17:21:20.211838 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Sep 12 17:21:20.211842 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Sep 12 17:21:20.211846 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 12 17:21:20.211851 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 12 17:21:20.211855 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 12 17:21:20.211860 kernel: Detected PIPT I-cache on CPU0
Sep 12 17:21:20.211865 kernel: CPU features: detected: Address authentication (architected QARMA5 algorithm)
Sep 12 17:21:20.211870 kernel: CPU features: detected: GIC system register CPU interface
Sep 12 17:21:20.211874 kernel: CPU features: detected: Spectre-v4
Sep 12 17:21:20.211878 kernel: CPU features: detected: Spectre-BHB
Sep 12 17:21:20.211883 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 12 17:21:20.211887 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 12 17:21:20.211892 kernel: CPU features: detected: ARM erratum 2067961 or 2054223
Sep 12 17:21:20.211896 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 12 17:21:20.211900 kernel: alternatives: applying boot alternatives
Sep 12 17:21:20.211905 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9b01894f6bb04aff3ec9b8554b3ae56a087d51961f1a01981bc4d4f54ccefc09
Sep 12 17:21:20.211910 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 17:21:20.211915 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 12 17:21:20.211920 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 17:21:20.211924 kernel: Fallback order for Node 0: 0
Sep 12 17:21:20.211928 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048540
Sep 12 17:21:20.211933 kernel: Policy zone: Normal
Sep 12 17:21:20.211937 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 17:21:20.211941 kernel: software IO TLB: area num 2.
Sep 12 17:21:20.211946 kernel: software IO TLB: mapped [mem 0x0000000036290000-0x000000003a290000] (64MB)
Sep 12 17:21:20.211950 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 12 17:21:20.211955 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 17:21:20.211960 kernel: rcu: RCU event tracing is enabled.
Sep 12 17:21:20.211965 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 12 17:21:20.211970 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 17:21:20.211974 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 17:21:20.211978 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 17:21:20.211983 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 12 17:21:20.211987 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:21:20.211992 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:21:20.211996 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 12 17:21:20.212001 kernel: GICv3: 960 SPIs implemented
Sep 12 17:21:20.212005 kernel: GICv3: 0 Extended SPIs implemented
Sep 12 17:21:20.212009 kernel: Root IRQ handler: gic_handle_irq
Sep 12 17:21:20.212014 kernel: GICv3: GICv3 features: 16 PPIs, RSS
Sep 12 17:21:20.212019 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=0
Sep 12 17:21:20.212023 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000effee000
Sep 12 17:21:20.212028 kernel: ITS: No ITS available, not enabling LPIs
Sep 12 17:21:20.212032 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 17:21:20.212037 kernel: arch_timer: cp15 timer(s) running at 1000.00MHz (virt).
Sep 12 17:21:20.212041 kernel: clocksource: arch_sys_counter: mask: 0x1fffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 12 17:21:20.212046 kernel: sched_clock: 61 bits at 1000MHz, resolution 1ns, wraps every 4398046511103ns
Sep 12 17:21:20.212050 kernel: Console: colour dummy device 80x25
Sep 12 17:21:20.212055 kernel: printk: legacy console [tty1] enabled
Sep 12 17:21:20.212060 kernel: ACPI: Core revision 20240827
Sep 12 17:21:20.212064 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 2000.00 BogoMIPS (lpj=1000000)
Sep 12 17:21:20.212070 kernel: pid_max: default: 32768 minimum: 301
Sep 12 17:21:20.212074 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 12 17:21:20.212079 kernel: landlock: Up and running.
Sep 12 17:21:20.212083 kernel: SELinux: Initializing.
Sep 12 17:21:20.212088 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 17:21:20.212096 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 17:21:20.212101 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x1a0000e, misc 0x31e1
Sep 12 17:21:20.212106 kernel: Hyper-V: Host Build 10.0.26100.1261-1-0
Sep 12 17:21:20.212111 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Sep 12 17:21:20.212115 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 17:21:20.212120 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 17:21:20.212126 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 12 17:21:20.212131 kernel: Remapping and enabling EFI services.
Sep 12 17:21:20.212135 kernel: smp: Bringing up secondary CPUs ...
Sep 12 17:21:20.212140 kernel: Detected PIPT I-cache on CPU1
Sep 12 17:21:20.212145 kernel: GICv3: CPU1: found redistributor 1 region 1:0x00000000f000e000
Sep 12 17:21:20.212150 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd490]
Sep 12 17:21:20.212155 kernel: smp: Brought up 1 node, 2 CPUs
Sep 12 17:21:20.212160 kernel: SMP: Total of 2 processors activated.
Sep 12 17:21:20.212165 kernel: CPU: All CPU(s) started at EL1
Sep 12 17:21:20.212169 kernel: CPU features: detected: 32-bit EL0 Support
Sep 12 17:21:20.212174 kernel: CPU features: detected: Instruction cache invalidation not required for I/D coherence
Sep 12 17:21:20.212179 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 12 17:21:20.212184 kernel: CPU features: detected: Common not Private translations
Sep 12 17:21:20.212188 kernel: CPU features: detected: CRC32 instructions
Sep 12 17:21:20.212194 kernel: CPU features: detected: Generic authentication (architected QARMA5 algorithm)
Sep 12 17:21:20.212199 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 12 17:21:20.212204 kernel: CPU features: detected: LSE atomic instructions
Sep 12 17:21:20.212208 kernel: CPU features: detected: Privileged Access Never
Sep 12 17:21:20.212213 kernel: CPU features: detected: Speculation barrier (SB)
Sep 12 17:21:20.212218 kernel: CPU features: detected: TLB range maintenance instructions
Sep 12 17:21:20.212223 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 12 17:21:20.212227 kernel: CPU features: detected: Scalable Vector Extension
Sep 12 17:21:20.212232 kernel: alternatives: applying system-wide alternatives
Sep 12 17:21:20.212238 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Sep 12 17:21:20.212242 kernel: SVE: maximum available vector length 16 bytes per vector
Sep 12 17:21:20.212247 kernel: SVE: default vector length 16 bytes per vector
Sep 12 17:21:20.212252 kernel: Memory: 3959668K/4194160K available (11136K kernel code, 2440K rwdata, 9068K rodata, 38912K init, 1038K bss, 213304K reserved, 16384K cma-reserved)
Sep 12 17:21:20.212257 kernel: devtmpfs: initialized
Sep 12 17:21:20.212262 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 17:21:20.212267 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 12 17:21:20.212271 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 12 17:21:20.212276 kernel: 0 pages in range for non-PLT usage
Sep 12 17:21:20.212282 kernel: 508576 pages in range for PLT usage
Sep 12 17:21:20.212286 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 17:21:20.212291 kernel: SMBIOS 3.1.0 present.
Sep 12 17:21:20.212296 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 09/28/2024
Sep 12 17:21:20.212301 kernel: DMI: Memory slots populated: 2/2
Sep 12 17:21:20.212306 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 17:21:20.212310 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 12 17:21:20.212315 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 12 17:21:20.212320 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 12 17:21:20.212326 kernel: audit: initializing netlink subsys (disabled)
Sep 12 17:21:20.212331 kernel: audit: type=2000 audit(0.059:1): state=initialized audit_enabled=0 res=1
Sep 12 17:21:20.212335 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 17:21:20.212340 kernel: cpuidle: using governor menu
Sep 12 17:21:20.212345 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 12 17:21:20.212350 kernel: ASID allocator initialised with 32768 entries
Sep 12 17:21:20.212354 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 17:21:20.212359 kernel: Serial: AMBA PL011 UART driver
Sep 12 17:21:20.212364 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 17:21:20.212369 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 17:21:20.212374 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 12 17:21:20.212379 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 12 17:21:20.212384 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 17:21:20.212389 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 17:21:20.212393 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 12 17:21:20.212398 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 12 17:21:20.212403 kernel: ACPI: Added _OSI(Module Device)
Sep 12 17:21:20.212407 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 17:21:20.212413 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 17:21:20.212418 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 17:21:20.212422 kernel: ACPI: Interpreter enabled
Sep 12 17:21:20.212427 kernel: ACPI: Using GIC for interrupt routing
Sep 12 17:21:20.212432 kernel: ARMH0011:00: ttyAMA0 at MMIO 0xeffec000 (irq = 12, base_baud = 0) is a SBSA
Sep 12 17:21:20.212437 kernel: printk: legacy console [ttyAMA0] enabled
Sep 12 17:21:20.212441 kernel: printk: legacy bootconsole [pl11] disabled
Sep 12 17:21:20.212446 kernel: ARMH0011:01: ttyAMA1 at MMIO 0xeffeb000 (irq = 13, base_baud = 0) is a SBSA
Sep 12 17:21:20.212451 kernel: ACPI: CPU0 has been hot-added
Sep 12 17:21:20.212456 kernel: ACPI: CPU1 has been hot-added
Sep 12 17:21:20.212461 kernel: iommu: Default domain type: Translated
Sep 12 17:21:20.212466 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 12 17:21:20.212471 kernel: efivars: Registered efivars operations
Sep 12 17:21:20.212475 kernel: vgaarb: loaded
Sep 12 17:21:20.212480 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 12 17:21:20.212485 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 17:21:20.212490 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 17:21:20.212494 kernel: pnp: PnP ACPI init
Sep 12 17:21:20.212500 kernel: pnp: PnP ACPI: found 0 devices
Sep 12 17:21:20.212505 kernel: NET: Registered PF_INET protocol family
Sep 12 17:21:20.212509 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 12 17:21:20.212514 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 12 17:21:20.212519 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 17:21:20.212524 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 12 17:21:20.212529 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 12 17:21:20.212533 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 12 17:21:20.212538 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 17:21:20.212544 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 17:21:20.212548 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 12 17:21:20.212553 kernel: PCI: CLS 0 bytes, default 64
Sep 12 17:21:20.212558 kernel: kvm [1]: HYP mode not available
Sep 12 17:21:20.212563 kernel: Initialise system trusted keyrings
Sep 12 17:21:20.212567 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 12 17:21:20.212572 kernel: Key type asymmetric registered
Sep 12 17:21:20.212577 kernel: Asymmetric key parser 'x509' registered
Sep 12 17:21:20.212586 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 12 17:21:20.212592 kernel: io scheduler mq-deadline registered
Sep 12 17:21:20.212597 kernel: io scheduler kyber registered
Sep 12 17:21:20.212602 kernel: io scheduler bfq registered
Sep 12 17:21:20.212606 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 12 17:21:20.212611 kernel: thunder_xcv, ver 1.0
Sep 12 17:21:20.212616 kernel: thunder_bgx, ver 1.0
Sep 12 17:21:20.212620 kernel: nicpf, ver 1.0
Sep 12 17:21:20.212625 kernel: nicvf, ver 1.0
Sep 12 17:21:20.212737 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 12 17:21:20.212788 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-12T17:21:19 UTC (1757697679)
Sep 12 17:21:20.212795 kernel: efifb: probing for efifb
Sep 12 17:21:20.212800 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Sep 12 17:21:20.212804 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Sep 12 17:21:20.212809 kernel: efifb: scrolling: redraw
Sep 12 17:21:20.212814 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Sep 12 17:21:20.212819 kernel: Console: switching to colour frame buffer device 128x48
Sep 12 17:21:20.212823 kernel: fb0: EFI VGA frame buffer device
Sep 12 17:21:20.212829 kernel: SMCCC: SOC_ID: ARCH_SOC_ID not implemented, skipping ....
Sep 12 17:21:20.212834 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 12 17:21:20.212839 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 12 17:21:20.212844 kernel: watchdog: NMI not fully supported
Sep 12 17:21:20.212848 kernel: NET: Registered PF_INET6 protocol family
Sep 12 17:21:20.212853 kernel: watchdog: Hard watchdog permanently disabled
Sep 12 17:21:20.212858 kernel: Segment Routing with IPv6
Sep 12 17:21:20.212863 kernel: In-situ OAM (IOAM) with IPv6
Sep 12 17:21:20.212867 kernel: NET: Registered PF_PACKET protocol family
Sep 12 17:21:20.212873 kernel: Key type dns_resolver registered
Sep 12 17:21:20.212878 kernel: registered taskstats version 1
Sep 12 17:21:20.212882 kernel: Loading compiled-in X.509 certificates
Sep 12 17:21:20.212887 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: 7675c1947f324bc6524fdc1ee0f8f5f343acfea7'
Sep 12 17:21:20.212892 kernel: Demotion targets for Node 0: null
Sep 12 17:21:20.212897 kernel: Key type .fscrypt registered
Sep 12 17:21:20.212902 kernel: Key type fscrypt-provisioning registered
Sep 12 17:21:20.212906 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 12 17:21:20.212911 kernel: ima: Allocated hash algorithm: sha1
Sep 12 17:21:20.212917 kernel: ima: No architecture policies found
Sep 12 17:21:20.212922 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 12 17:21:20.212926 kernel: clk: Disabling unused clocks
Sep 12 17:21:20.212931 kernel: PM: genpd: Disabling unused power domains
Sep 12 17:21:20.212936 kernel: Warning: unable to open an initial console.
Sep 12 17:21:20.212941 kernel: Freeing unused kernel memory: 38912K
Sep 12 17:21:20.212945 kernel: Run /init as init process
Sep 12 17:21:20.212950 kernel: with arguments:
Sep 12 17:21:20.212955 kernel: /init
Sep 12 17:21:20.212960 kernel: with environment:
Sep 12 17:21:20.212965 kernel: HOME=/
Sep 12 17:21:20.212970 kernel: TERM=linux
Sep 12 17:21:20.212974 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 12 17:21:20.212980 systemd[1]: Successfully made /usr/ read-only.
Sep 12 17:21:20.212987 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 12 17:21:20.212993 systemd[1]: Detected virtualization microsoft.
Sep 12 17:21:20.212998 systemd[1]: Detected architecture arm64.
Sep 12 17:21:20.213003 systemd[1]: Running in initrd.
Sep 12 17:21:20.213008 systemd[1]: No hostname configured, using default hostname.
Sep 12 17:21:20.213014 systemd[1]: Hostname set to .
Sep 12 17:21:20.213019 systemd[1]: Initializing machine ID from random generator.
Sep 12 17:21:20.213024 systemd[1]: Queued start job for default target initrd.target.
Sep 12 17:21:20.213029 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:21:20.213035 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:21:20.213040 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 12 17:21:20.213046 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 17:21:20.213052 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 12 17:21:20.213058 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 12 17:21:20.213063 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 12 17:21:20.213069 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 12 17:21:20.213074 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:21:20.213080 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 17:21:20.213085 systemd[1]: Reached target paths.target - Path Units.
Sep 12 17:21:20.213091 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 17:21:20.213096 systemd[1]: Reached target swap.target - Swaps.
Sep 12 17:21:20.213101 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 17:21:20.213106 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 17:21:20.213111 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 17:21:20.213116 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 12 17:21:20.213121 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 12 17:21:20.213127 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:21:20.213133 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:21:20.213138 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:21:20.213143 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 17:21:20.213148 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 12 17:21:20.213153 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 17:21:20.213159 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 12 17:21:20.213164 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 12 17:21:20.213170 systemd[1]: Starting systemd-fsck-usr.service...
Sep 12 17:21:20.213175 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 17:21:20.213181 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 17:21:20.213196 systemd-journald[225]: Collecting audit messages is disabled.
Sep 12 17:21:20.213210 systemd-journald[225]: Journal started
Sep 12 17:21:20.213224 systemd-journald[225]: Runtime Journal (/run/log/journal/2238b2c6bb7243aea665bac3761b2e26) is 8M, max 78.5M, 70.5M free.
Sep 12 17:21:20.216617 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:21:20.222251 systemd-modules-load[227]: Inserted module 'overlay'
Sep 12 17:21:20.249173 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 17:21:20.249206 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 12 17:21:20.259821 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 12 17:21:20.273448 kernel: Bridge firewalling registered
Sep 12 17:21:20.274792 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:21:20.275867 systemd-modules-load[227]: Inserted module 'br_netfilter'
Sep 12 17:21:20.297603 systemd[1]: Finished systemd-fsck-usr.service.
Sep 12 17:21:20.302679 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:21:20.312559 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:21:20.334690 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 17:21:20.343047 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 17:21:20.366062 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 12 17:21:20.382699 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 17:21:20.398917 systemd-tmpfiles[249]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 12 17:21:20.401457 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:21:20.415032 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:21:20.430276 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 17:21:20.442425 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:21:20.457830 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 12 17:21:20.487353 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 17:21:20.498965 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 17:21:20.523299 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:21:20.537802 dracut-cmdline[261]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyAMA0,115200n8 earlycon=pl011,0xeffec000 flatcar.first_boot=detected acpi=force flatcar.oem.id=azure flatcar.autologin verity.usrhash=9b01894f6bb04aff3ec9b8554b3ae56a087d51961f1a01981bc4d4f54ccefc09
Sep 12 17:21:20.579376 systemd-resolved[263]: Positive Trust Anchors:
Sep 12 17:21:20.579395 systemd-resolved[263]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 17:21:20.579415 systemd-resolved[263]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 17:21:20.581088 systemd-resolved[263]: Defaulting to hostname 'linux'.
Sep 12 17:21:20.583995 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 17:21:20.591383 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 17:21:20.702591 kernel: SCSI subsystem initialized
Sep 12 17:21:20.707596 kernel: Loading iSCSI transport class v2.0-870.
Sep 12 17:21:20.715598 kernel: iscsi: registered transport (tcp)
Sep 12 17:21:20.730070 kernel: iscsi: registered transport (qla4xxx)
Sep 12 17:21:20.730108 kernel: QLogic iSCSI HBA Driver
Sep 12 17:21:20.743001 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 17:21:20.771389 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 17:21:20.779242 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 17:21:20.832630 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 12 17:21:20.839548 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 12 17:21:20.905597 kernel: raid6: neonx8 gen() 18546 MB/s
Sep 12 17:21:20.925599 kernel: raid6: neonx4 gen() 18574 MB/s
Sep 12 17:21:20.945586 kernel: raid6: neonx2 gen() 17087 MB/s
Sep 12 17:21:20.966586 kernel: raid6: neonx1 gen() 15093 MB/s
Sep 12 17:21:20.986585 kernel: raid6: int64x8 gen() 10545 MB/s
Sep 12 17:21:21.006584 kernel: raid6: int64x4 gen() 10598 MB/s
Sep 12 17:21:21.027585 kernel: raid6: int64x2 gen() 8980 MB/s
Sep 12 17:21:21.050515 kernel: raid6: int64x1 gen() 7037 MB/s
Sep 12 17:21:21.050521 kernel: raid6: using algorithm neonx4 gen() 18574 MB/s
Sep 12 17:21:21.072592 kernel: raid6: .... xor() 15163 MB/s, rmw enabled
Sep 12 17:21:21.072657 kernel: raid6: using neon recovery algorithm
Sep 12 17:21:21.086242 kernel: xor: measuring software checksum speed
Sep 12 17:21:21.086249 kernel: 8regs : 28591 MB/sec
Sep 12 17:21:21.089537 kernel: 32regs : 28791 MB/sec
Sep 12 17:21:21.092700 kernel: arm64_neon : 37549 MB/sec
Sep 12 17:21:21.096676 kernel: xor: using function: arm64_neon (37549 MB/sec)
Sep 12 17:21:21.135596 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 12 17:21:21.140971 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 17:21:21.152941 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:21:21.186214 systemd-udevd[474]: Using default interface naming scheme 'v255'.
Sep 12 17:21:21.192196 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:21:21.200908 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 12 17:21:21.232572 dracut-pre-trigger[487]: rd.md=0: removing MD RAID activation
Sep 12 17:21:21.254113 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 17:21:21.261758 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 17:21:21.310132 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:21:21.317886 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 12 17:21:21.389629 kernel: hv_vmbus: Vmbus version:5.3
Sep 12 17:21:21.390377 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:21:21.390475 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:21:21.423545 kernel: hv_vmbus: registering driver hyperv_keyboard
Sep 12 17:21:21.423562 kernel: hv_vmbus: registering driver hv_netvsc
Sep 12 17:21:21.423569 kernel: hv_vmbus: registering driver hid_hyperv
Sep 12 17:21:21.424717 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:21:21.488216 kernel: hv_vmbus: registering driver hv_storvsc
Sep 12 17:21:21.488232 kernel: pps_core: LinuxPPS API ver. 1 registered
Sep 12 17:21:21.488248 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0
Sep 12 17:21:21.488255 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Sep 12 17:21:21.488261 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1
Sep 12 17:21:21.488269 kernel: scsi host1: storvsc_host_t
Sep 12 17:21:21.488393 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Sep 12 17:21:21.488455 kernel: scsi host0: storvsc_host_t
Sep 12 17:21:21.441045 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:21:21.545738 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Sep 12 17:21:21.545775 kernel: hv_netvsc 000d3afc-ed22-000d-3afc-ed22000d3afc eth0: VF slot 1 added
Sep 12 17:21:21.545883 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5
Sep 12 17:21:21.545382 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 12 17:21:21.579173 kernel: PTP clock support registered
Sep 12 17:21:21.579192 kernel: hv_vmbus: registering driver hv_pci
Sep 12 17:21:21.579199 kernel: hv_pci 27f4a0ed-07a0-4bbd-b821-4aafdcd71d1b: PCI VMBus probing: Using version 0x10004
Sep 12 17:21:21.579326 kernel: hv_pci 27f4a0ed-07a0-4bbd-b821-4aafdcd71d1b: PCI host bridge to bus 07a0:00
Sep 12 17:21:21.562151 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:21:21.606459 kernel: pci_bus 07a0:00: root bus resource [mem 0xfc0000000-0xfc00fffff window]
Sep 12 17:21:21.624342 kernel: pci_bus 07a0:00: No busn resource found for root bus, will use [bus 00-ff]
Sep 12 17:21:21.624416 kernel: hv_utils: Registering HyperV Utility Driver
Sep 12 17:21:21.624423 kernel: pci 07a0:00:02.0: [15b3:101a] type 00 class 0x020000 PCIe Endpoint
Sep 12 17:21:21.624440 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Sep 12 17:21:21.624518 kernel: hv_vmbus: registering driver hv_utils
Sep 12 17:21:21.562220 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:21:21.448498 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Sep 12 17:21:21.454587 kernel: hv_utils: Shutdown IC version 3.2
Sep 12 17:21:21.454599 kernel: pci 07a0:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]
Sep 12 17:21:21.454703 kernel: hv_utils: Heartbeat IC version 3.0
Sep 12 17:21:21.454709 kernel: pci 07a0:00:02.0: enabling Extended Tags
Sep 12 17:21:21.454771 kernel: hv_utils: TimeSync IC version 4.0
Sep 12 17:21:21.454777 kernel: sd 0:0:0:0: [sda] Write Protect is off
Sep 12 17:21:21.454849 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Sep 12 17:21:21.454909 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Sep 12 17:21:21.454966 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#251 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 12 17:21:21.455025 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#194 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 12 17:21:21.455077 kernel: pci 07a0:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 07a0:00:02.0 (capable of 252.048 Gb/s with 16.0 GT/s PCIe x16 link)
Sep 12 17:21:21.456120 kernel: pci_bus 07a0:00: busn_res: [bus 00-ff] end is updated to 00
Sep 12 17:21:21.456203 kernel: pci 07a0:00:02.0: BAR 0 [mem 0xfc0000000-0xfc00fffff 64bit pref]: assigned
Sep 12 17:21:21.456265 systemd-journald[225]: Time jumped backwards, rotating.
Sep 12 17:21:21.456295 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:21:21.392880 systemd-resolved[263]: Clock change detected. Flushing caches.
Sep 12 17:21:21.466232 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Sep 12 17:21:21.404956 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 12 17:21:21.481345 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Sep 12 17:21:21.481479 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 12 17:21:21.429012 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:21:21.495877 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Sep 12 17:21:21.515195 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#254 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 12 17:21:21.516267 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:21:21.542140 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#228 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 12 17:21:21.566054 kernel: mlx5_core 07a0:00:02.0: enabling device (0000 -> 0002)
Sep 12 17:21:21.575482 kernel: mlx5_core 07a0:00:02.0: PTM is not supported by PCIe
Sep 12 17:21:21.575651 kernel: mlx5_core 07a0:00:02.0: firmware version: 16.30.5006
Sep 12 17:21:21.747552 kernel: hv_netvsc 000d3afc-ed22-000d-3afc-ed22000d3afc eth0: VF registering: eth1
Sep 12 17:21:21.747754 kernel: mlx5_core 07a0:00:02.0 eth1: joined to eth0
Sep 12 17:21:21.755180 kernel: mlx5_core 07a0:00:02.0: MLX5E: StrdRq(1) RqSz(8) StrdSz(2048) RxCqeCmprss(0 basic)
Sep 12 17:21:21.766152 kernel: mlx5_core 07a0:00:02.0 enP1952s1: renamed from eth1
Sep 12 17:21:22.041351 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Sep 12 17:21:22.095657 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Sep 12 17:21:22.108448 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Sep 12 17:21:22.140820 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Sep 12 17:21:22.146511 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Sep 12 17:21:22.162748 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 12 17:21:22.175592 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 17:21:22.187352 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:21:22.200376 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 17:21:22.216294 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 12 17:21:22.228259 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 12 17:21:22.252051 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 17:21:22.273199 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#208 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 12 17:21:22.284148 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:21:23.302151 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#246 cmd 0x5a status: scsi 0x2 srb 0x86 hv 0xc0000001
Sep 12 17:21:23.319585 disk-uuid[665]: The operation has completed successfully.
Sep 12 17:21:23.325148 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:21:23.387870 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 12 17:21:23.390017 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 12 17:21:23.423855 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 12 17:21:23.442192 sh[823]: Success
Sep 12 17:21:23.480698 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 12 17:21:23.480744 kernel: device-mapper: uevent: version 1.0.3
Sep 12 17:21:23.487197 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 12 17:21:23.499158 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Sep 12 17:21:23.878946 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 12 17:21:23.887554 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 12 17:21:23.904105 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 12 17:21:23.939603 kernel: BTRFS: device fsid 752cb955-bdfa-486a-ad02-b54d5e61d194 devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (841)
Sep 12 17:21:23.939634 kernel: BTRFS info (device dm-0): first mount of filesystem 752cb955-bdfa-486a-ad02-b54d5e61d194
Sep 12 17:21:23.945561 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 12 17:21:24.280269 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 12 17:21:24.280360 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 12 17:21:24.324049 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 12 17:21:24.328861 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 12 17:21:24.339299 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 12 17:21:24.339901 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 12 17:21:24.367551 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 12 17:21:24.403303 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (864)
Sep 12 17:21:24.403343 kernel: BTRFS info (device sda6): first mount of filesystem 5f4a7913-42f7-487c-8331-8ab180fe9df7
Sep 12 17:21:24.415463 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 12 17:21:24.473052 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 17:21:24.494550 kernel: BTRFS info (device sda6): turning on async discard
Sep 12 17:21:24.494566 kernel: BTRFS info (device sda6): enabling free space tree
Sep 12 17:21:24.494948 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 17:21:24.513700 kernel: BTRFS info (device sda6): last unmount of filesystem 5f4a7913-42f7-487c-8331-8ab180fe9df7
Sep 12 17:21:24.514226 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 12 17:21:24.521082 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 12 17:21:24.549462 systemd-networkd[1008]: lo: Link UP
Sep 12 17:21:24.549472 systemd-networkd[1008]: lo: Gained carrier
Sep 12 17:21:24.550397 systemd-networkd[1008]: Enumeration completed
Sep 12 17:21:24.553330 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 17:21:24.557253 systemd-networkd[1008]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:21:24.557256 systemd-networkd[1008]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 17:21:24.563099 systemd[1]: Reached target network.target - Network.
Sep 12 17:21:24.620140 kernel: mlx5_core 07a0:00:02.0 enP1952s1: Link up
Sep 12 17:21:24.620324 kernel: buffer_size[0]=0 is not enough for lossless buffer
Sep 12 17:21:24.657359 kernel: hv_netvsc 000d3afc-ed22-000d-3afc-ed22000d3afc eth0: Data path switched to VF: enP1952s1
Sep 12 17:21:24.657089 systemd-networkd[1008]: enP1952s1: Link UP
Sep 12 17:21:24.657177 systemd-networkd[1008]: eth0: Link UP
Sep 12 17:21:24.657247 systemd-networkd[1008]: eth0: Gained carrier
Sep 12 17:21:24.657260 systemd-networkd[1008]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:21:24.678464 systemd-networkd[1008]: enP1952s1: Gained carrier
Sep 12 17:21:24.688158 systemd-networkd[1008]: eth0: DHCPv4 address 10.200.20.44/24, gateway 10.200.20.1 acquired from 168.63.129.16
Sep 12 17:21:25.731998 ignition[1011]: Ignition 2.21.0
Sep 12 17:21:25.732013 ignition[1011]: Stage: fetch-offline
Sep 12 17:21:25.736063 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 17:21:25.732085 ignition[1011]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:21:25.746922 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 12 17:21:25.732092 ignition[1011]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 12 17:21:25.732198 ignition[1011]: parsed url from cmdline: ""
Sep 12 17:21:25.732201 ignition[1011]: no config URL provided
Sep 12 17:21:25.732204 ignition[1011]: reading system config file "/usr/lib/ignition/user.ign"
Sep 12 17:21:25.732209 ignition[1011]: no config at "/usr/lib/ignition/user.ign"
Sep 12 17:21:25.732212 ignition[1011]: failed to fetch config: resource requires networking
Sep 12 17:21:25.732331 ignition[1011]: Ignition finished successfully
Sep 12 17:21:25.785622 ignition[1023]: Ignition 2.21.0
Sep 12 17:21:25.785627 ignition[1023]: Stage: fetch
Sep 12 17:21:25.785779 ignition[1023]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:21:25.785785 ignition[1023]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 12 17:21:25.785848 ignition[1023]: parsed url from cmdline: ""
Sep 12 17:21:25.785850 ignition[1023]: no config URL provided
Sep 12 17:21:25.785853 ignition[1023]: reading system config file "/usr/lib/ignition/user.ign"
Sep 12 17:21:25.785859 ignition[1023]: no config at "/usr/lib/ignition/user.ign"
Sep 12 17:21:25.785905 ignition[1023]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Sep 12 17:21:25.856360 ignition[1023]: GET result: OK
Sep 12 17:21:25.856409 ignition[1023]: config has been read from IMDS userdata
Sep 12 17:21:25.856431 ignition[1023]: parsing config with SHA512: 84518aa6fb300824d622aa92ef70f2dd95e5618a857f08c22c7ebc0e85482b4fbc8df6f7843b11cbfa9e3fe8107ea4bdfb7d9c4d4ce0512a56c7ff52be2c7a24
Sep 12 17:21:25.861477 unknown[1023]: fetched base config from "system"
Sep 12 17:21:25.861825 ignition[1023]: fetch: fetch complete
Sep 12 17:21:25.861482 unknown[1023]: fetched base config from "system"
Sep 12 17:21:25.861830 ignition[1023]: fetch: fetch passed
Sep 12 17:21:25.861485 unknown[1023]: fetched user config from "azure"
Sep 12 17:21:25.861884 ignition[1023]: Ignition finished successfully
Sep 12 17:21:25.863820 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 12 17:21:25.869813 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 12 17:21:25.911725 ignition[1029]: Ignition 2.21.0
Sep 12 17:21:25.911736 ignition[1029]: Stage: kargs
Sep 12 17:21:25.911912 ignition[1029]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:21:25.917814 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 12 17:21:25.911919 ignition[1029]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 12 17:21:25.925683 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 12 17:21:25.912530 ignition[1029]: kargs: kargs passed
Sep 12 17:21:25.912570 ignition[1029]: Ignition finished successfully
Sep 12 17:21:25.958534 ignition[1035]: Ignition 2.21.0
Sep 12 17:21:25.958539 ignition[1035]: Stage: disks
Sep 12 17:21:25.964645 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 12 17:21:25.960819 ignition[1035]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:21:25.972451 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 12 17:21:25.960828 ignition[1035]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 12 17:21:25.983412 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 12 17:21:25.961652 ignition[1035]: disks: disks passed
Sep 12 17:21:25.994319 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 17:21:25.961710 ignition[1035]: Ignition finished successfully
Sep 12 17:21:26.004648 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 12 17:21:26.016074 systemd[1]: Reached target basic.target - Basic System.
Sep 12 17:21:26.022491 systemd-networkd[1008]: eth0: Gained IPv6LL
Sep 12 17:21:26.028788 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 12 17:21:26.132625 systemd-fsck[1043]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks
Sep 12 17:21:26.142000 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 12 17:21:26.149530 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 12 17:21:28.185143 kernel: EXT4-fs (sda9): mounted filesystem c902100c-52b7-422c-84ac-d834d4db2717 r/w with ordered data mode. Quota mode: none.
Sep 12 17:21:28.190762 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 12 17:21:28.199475 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 12 17:21:28.234866 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 17:21:28.253662 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 12 17:21:28.258394 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Sep 12 17:21:28.270891 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 12 17:21:28.270930 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 17:21:28.329314 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1057)
Sep 12 17:21:28.329330 kernel: BTRFS info (device sda6): first mount of filesystem 5f4a7913-42f7-487c-8331-8ab180fe9df7
Sep 12 17:21:28.329337 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 12 17:21:28.294095 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 12 17:21:28.323508 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 12 17:21:28.357424 kernel: BTRFS info (device sda6): turning on async discard
Sep 12 17:21:28.357447 kernel: BTRFS info (device sda6): enabling free space tree
Sep 12 17:21:28.359272 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 12 17:21:28.784143 coreos-metadata[1059]: Sep 12 17:21:28.784 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Sep 12 17:21:28.791525 coreos-metadata[1059]: Sep 12 17:21:28.791 INFO Fetch successful
Sep 12 17:21:28.791525 coreos-metadata[1059]: Sep 12 17:21:28.791 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Sep 12 17:21:28.807530 coreos-metadata[1059]: Sep 12 17:21:28.807 INFO Fetch successful
Sep 12 17:21:28.819713 coreos-metadata[1059]: Sep 12 17:21:28.819 INFO wrote hostname ci-4426.1.0-a-1fe763f55e to /sysroot/etc/hostname
Sep 12 17:21:28.830159 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 12 17:21:29.103726 initrd-setup-root[1087]: cut: /sysroot/etc/passwd: No such file or directory
Sep 12 17:21:29.159908 initrd-setup-root[1094]: cut: /sysroot/etc/group: No such file or directory
Sep 12 17:21:29.178092 initrd-setup-root[1101]: cut: /sysroot/etc/shadow: No such file or directory
Sep 12 17:21:29.184120 initrd-setup-root[1108]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 12 17:21:30.309775 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 12 17:21:30.315952 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 12 17:21:30.335743 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 12 17:21:30.349316 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 12 17:21:30.362285 kernel: BTRFS info (device sda6): last unmount of filesystem 5f4a7913-42f7-487c-8331-8ab180fe9df7
Sep 12 17:21:30.383071 ignition[1181]: INFO : Ignition 2.21.0
Sep 12 17:21:30.383071 ignition[1181]: INFO : Stage: mount
Sep 12 17:21:30.383071 ignition[1181]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 17:21:30.383071 ignition[1181]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 12 17:21:30.419826 ignition[1181]: INFO : mount: mount passed
Sep 12 17:21:30.419826 ignition[1181]: INFO : Ignition finished successfully
Sep 12 17:21:30.387476 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 12 17:21:30.393636 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 12 17:21:30.404070 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 12 17:21:30.440232 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 17:21:30.484074 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1192)
Sep 12 17:21:30.484106 kernel: BTRFS info (device sda6): first mount of filesystem 5f4a7913-42f7-487c-8331-8ab180fe9df7
Sep 12 17:21:30.490471 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 12 17:21:30.502026 kernel: BTRFS info (device sda6): turning on async discard
Sep 12 17:21:30.502050 kernel: BTRFS info (device sda6): enabling free space tree
Sep 12 17:21:30.503739 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 12 17:21:30.533112 ignition[1210]: INFO : Ignition 2.21.0
Sep 12 17:21:30.533112 ignition[1210]: INFO : Stage: files
Sep 12 17:21:30.533112 ignition[1210]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 17:21:30.533112 ignition[1210]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 12 17:21:30.533112 ignition[1210]: DEBUG : files: compiled without relabeling support, skipping
Sep 12 17:21:30.575329 ignition[1210]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 12 17:21:30.575329 ignition[1210]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 12 17:21:30.641567 ignition[1210]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 12 17:21:30.649012 ignition[1210]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 12 17:21:30.649012 ignition[1210]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 12 17:21:30.642498 unknown[1210]: wrote ssh authorized keys file for user: core
Sep 12 17:21:30.681968 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Sep 12 17:21:30.692685 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Sep 12 17:21:30.833313 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 12 17:21:31.126189 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Sep 12 17:21:31.126189 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 12 17:21:31.126189 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 12 17:21:31.126189 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 17:21:31.163609 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 17:21:31.163609 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 17:21:31.163609 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 17:21:31.163609 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 17:21:31.163609 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 17:21:31.163609 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 17:21:31.163609 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 17:21:31.163609 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 12 17:21:31.163609 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 12 17:21:31.163609 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 12 17:21:31.163609 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1
Sep 12 17:21:31.670018 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 12 17:21:31.893546 ignition[1210]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Sep 12 17:21:31.893546 ignition[1210]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 12 17:21:31.931962 ignition[1210]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 17:21:31.942894 ignition[1210]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 17:21:31.942894 ignition[1210]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 12 17:21:31.942894 ignition[1210]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Sep 12 17:21:31.942894 ignition[1210]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Sep 12 17:21:31.942894 ignition[1210]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 17:21:31.942894 ignition[1210]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 17:21:31.942894 ignition[1210]: INFO : files: files passed
Sep 12 17:21:31.942894 ignition[1210]: INFO : Ignition finished successfully
Sep 12 17:21:31.943251 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 12 17:21:31.959244 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 12 17:21:32.007407 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 12 17:21:32.066023 initrd-setup-root-after-ignition[1239]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 17:21:32.066023 initrd-setup-root-after-ignition[1239]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 17:21:32.018058 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 12 17:21:32.093429 initrd-setup-root-after-ignition[1243]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 17:21:32.018160 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 12 17:21:32.043016 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 17:21:32.050191 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 12 17:21:32.056610 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 12 17:21:32.117373 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 12 17:21:32.117458 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 12 17:21:32.128827 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 12 17:21:32.141009 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 12 17:21:32.151698 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 12 17:21:32.152349 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 12 17:21:32.206176 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 17:21:32.214559 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 12 17:21:32.246193 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 12 17:21:32.258361 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:21:32.271190 systemd[1]: Stopped target timers.target - Timer Units.
Sep 12 17:21:32.282596 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 12 17:21:32.282702 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 17:21:32.298692 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 12 17:21:32.305071 systemd[1]: Stopped target basic.target - Basic System.
Sep 12 17:21:32.316138 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 12 17:21:32.327414 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 17:21:32.338955 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 12 17:21:32.350059 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 12 17:21:32.362834 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 12 17:21:32.374837 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 17:21:32.386631 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 12 17:21:32.396749 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 12 17:21:32.408450 systemd[1]: Stopped target swap.target - Swaps.
Sep 12 17:21:32.417547 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 12 17:21:32.417660 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 17:21:32.431378 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 12 17:21:32.437219 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:21:32.448017 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 12 17:21:32.452826 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:21:32.459597 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 12 17:21:32.459681 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 12 17:21:32.475069 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 12 17:21:32.475154 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 17:21:32.481580 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 12 17:21:32.481650 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 12 17:21:32.491576 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Sep 12 17:21:32.491640 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 12 17:21:32.575205 ignition[1263]: INFO : Ignition 2.21.0
Sep 12 17:21:32.575205 ignition[1263]: INFO : Stage: umount
Sep 12 17:21:32.575205 ignition[1263]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 17:21:32.575205 ignition[1263]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Sep 12 17:21:32.575205 ignition[1263]: INFO : umount: umount passed
Sep 12 17:21:32.575205 ignition[1263]: INFO : Ignition finished successfully
Sep 12 17:21:32.504780 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 12 17:21:32.535292 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 12 17:21:32.543302 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 12 17:21:32.543427 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:21:32.556383 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 12 17:21:32.556458 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 17:21:32.575883 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 12 17:21:32.575967 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 12 17:21:32.586011 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 12 17:21:32.586212 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 12 17:21:32.594114 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 12 17:21:32.594175 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 12 17:21:32.599535 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 12 17:21:32.599568 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 12 17:21:32.605095 systemd[1]: Stopped target network.target - Network.
Sep 12 17:21:32.614597 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 12 17:21:32.614647 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 17:21:32.626616 systemd[1]: Stopped target paths.target - Path Units.
Sep 12 17:21:32.634984 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 12 17:21:32.640192 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:21:32.647997 systemd[1]: Stopped target slices.target - Slice Units.
Sep 12 17:21:32.656956 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 12 17:21:32.666178 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 12 17:21:32.666249 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 17:21:32.675812 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 12 17:21:32.675845 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 17:21:32.686822 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 12 17:21:32.686910 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 12 17:21:32.697218 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 12 17:21:32.697261 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 12 17:21:32.709277 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 12 17:21:32.720189 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 12 17:21:32.738185 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 12 17:21:32.738761 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 12 17:21:32.738855 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 12 17:21:32.753303 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 12 17:21:32.753510 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 12 17:21:32.753588 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 12 17:21:32.769583 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 12 17:21:33.006065 kernel: hv_netvsc 000d3afc-ed22-000d-3afc-ed22000d3afc eth0: Data path switched from VF: enP1952s1
Sep 12 17:21:32.769816 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 12 17:21:32.769895 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 12 17:21:32.780998 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 12 17:21:32.781063 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 12 17:21:32.797798 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 12 17:21:32.808320 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 12 17:21:32.808360 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:21:32.819080 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 12 17:21:32.819139 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 12 17:21:32.831488 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 12 17:21:32.851003 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 12 17:21:32.851066 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 17:21:32.863550 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 12 17:21:32.863597 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:21:32.874093 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 12 17:21:32.874145 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:21:32.884390 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 12 17:21:32.884427 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:21:32.899074 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:21:32.911392 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 12 17:21:32.911450 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 12 17:21:32.936395 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 12 17:21:32.936585 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:21:32.946707 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 12 17:21:32.946751 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:21:32.956293 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 12 17:21:32.956329 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:21:32.966230 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 12 17:21:32.966269 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 17:21:32.981680 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 12 17:21:32.981761 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 12 17:21:33.006142 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 12 17:21:33.006195 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:21:33.017418 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 12 17:21:33.035976 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 12 17:21:33.036031 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 17:21:33.046543 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 12 17:21:33.046581 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:21:33.058872 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 12 17:21:33.058916 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 17:21:33.291746 systemd-journald[225]: Received SIGTERM from PID 1 (systemd).
Sep 12 17:21:33.069634 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 12 17:21:33.069672 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:21:33.076344 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:21:33.076379 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:21:33.092627 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 12 17:21:33.092695 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Sep 12 17:21:33.092718 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 12 17:21:33.092741 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 12 17:21:33.093012 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 12 17:21:33.093102 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 12 17:21:33.128896 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 12 17:21:33.130158 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 12 17:21:33.138562 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 12 17:21:33.149353 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 12 17:21:33.174321 systemd[1]: Switching root.
Sep 12 17:21:33.381325 systemd-journald[225]: Journal stopped
Sep 12 17:21:40.951711 kernel: SELinux: policy capability network_peer_controls=1
Sep 12 17:21:40.951734 kernel: SELinux: policy capability open_perms=1
Sep 12 17:21:40.951743 kernel: SELinux: policy capability extended_socket_class=1
Sep 12 17:21:40.951749 kernel: SELinux: policy capability always_check_network=0
Sep 12 17:21:40.951755 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 12 17:21:40.951761 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 12 17:21:40.951767 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 12 17:21:40.951772 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 12 17:21:40.951777 kernel: SELinux: policy capability userspace_initial_context=0
Sep 12 17:21:40.951783 kernel: audit: type=1403 audit(1757697694.730:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 12 17:21:40.951789 systemd[1]: Successfully loaded SELinux policy in 247.914ms.
Sep 12 17:21:40.951797 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.330ms.
Sep 12 17:21:40.951803 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 12 17:21:40.951809 systemd[1]: Detected virtualization microsoft.
Sep 12 17:21:40.951815 systemd[1]: Detected architecture arm64.
Sep 12 17:21:40.951822 systemd[1]: Detected first boot.
Sep 12 17:21:40.951828 systemd[1]: Hostname set to .
Sep 12 17:21:40.951834 systemd[1]: Initializing machine ID from random generator.
Sep 12 17:21:40.951840 zram_generator::config[1305]: No configuration found.
Sep 12 17:21:40.951847 kernel: NET: Registered PF_VSOCK protocol family
Sep 12 17:21:40.951852 systemd[1]: Populated /etc with preset unit settings.
Sep 12 17:21:40.951859 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 12 17:21:40.951866 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 12 17:21:40.951872 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 12 17:21:40.951878 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 12 17:21:40.951884 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 12 17:21:40.951890 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 12 17:21:40.951897 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 12 17:21:40.951902 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 12 17:21:40.951909 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 12 17:21:40.951916 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 12 17:21:40.951922 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 12 17:21:40.951928 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 12 17:21:40.951934 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:21:40.951939 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:21:40.951946 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 12 17:21:40.951952 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 12 17:21:40.951958 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 12 17:21:40.951965 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 17:21:40.951971 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 12 17:21:40.951979 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:21:40.951985 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 17:21:40.951991 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 12 17:21:40.951997 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 12 17:21:40.952004 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 12 17:21:40.952012 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 12 17:21:40.952018 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:21:40.952024 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 17:21:40.952030 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 17:21:40.952036 systemd[1]: Reached target swap.target - Swaps.
Sep 12 17:21:40.952042 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 12 17:21:40.952048 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 12 17:21:40.952056 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 12 17:21:40.952062 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:21:40.952068 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:21:40.952075 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:21:40.952081 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 12 17:21:40.952087 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 12 17:21:40.952094 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 12 17:21:40.952100 systemd[1]: Mounting media.mount - External Media Directory...
Sep 12 17:21:40.952106 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 12 17:21:40.952112 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 12 17:21:40.952118 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 12 17:21:40.952124 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 12 17:21:40.952140 systemd[1]: Reached target machines.target - Containers.
Sep 12 17:21:40.952146 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 12 17:21:40.952154 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 17:21:40.952161 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 17:21:40.952167 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 12 17:21:40.952173 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 17:21:40.952179 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 17:21:40.952186 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 17:21:40.952192 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 12 17:21:40.952198 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 17:21:40.952204 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 12 17:21:40.952211 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 12 17:21:40.952217 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 12 17:21:40.952224 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 12 17:21:40.952230 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 12 17:21:40.952236 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 17:21:40.952242 kernel: fuse: init (API version 7.41)
Sep 12 17:21:40.952248 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 17:21:40.952254 kernel: loop: module loaded
Sep 12 17:21:40.952261 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 17:21:40.952267 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 17:21:40.952273 kernel: ACPI: bus type drm_connector registered
Sep 12 17:21:40.952279 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 12 17:21:40.952286 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 12 17:21:40.952310 systemd-journald[1409]: Collecting audit messages is disabled.
Sep 12 17:21:40.952327 systemd-journald[1409]: Journal started
Sep 12 17:21:40.952341 systemd-journald[1409]: Runtime Journal (/run/log/journal/3567c60b2d94452bb3f2a49f9df57005) is 8M, max 78.5M, 70.5M free.
Sep 12 17:21:39.989896 systemd[1]: Queued start job for default target multi-user.target.
Sep 12 17:21:40.001563 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 12 17:21:40.001931 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 12 17:21:40.002189 systemd[1]: systemd-journald.service: Consumed 3.282s CPU time.
Sep 12 17:21:40.967458 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 17:21:40.979041 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 12 17:21:40.979069 systemd[1]: Stopped verity-setup.service.
Sep 12 17:21:40.996096 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 17:21:40.996716 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 12 17:21:41.002637 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 12 17:21:41.009251 systemd[1]: Mounted media.mount - External Media Directory.
Sep 12 17:21:41.014490 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 12 17:21:41.020945 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 12 17:21:41.027154 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 12 17:21:41.032392 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 12 17:21:41.038849 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:21:41.045115 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 12 17:21:41.045354 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 12 17:21:41.051745 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 17:21:41.051863 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 17:21:41.058582 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 17:21:41.058696 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 17:21:41.065544 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 17:21:41.067180 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 17:21:41.075410 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 12 17:21:41.075527 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 12 17:21:41.081463 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 17:21:41.082189 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 17:21:41.087803 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:21:41.094220 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 17:21:41.100845 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 12 17:21:41.107942 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 12 17:21:41.115695 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:21:41.130685 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 17:21:41.137463 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 12 17:21:41.151697 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 12 17:21:41.157990 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 12 17:21:41.158018 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 17:21:41.164401 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 12 17:21:41.172449 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 12 17:21:41.178298 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 17:21:41.184708 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 12 17:21:41.191468 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 12 17:21:41.197583 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 17:21:41.198260 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 12 17:21:41.204409 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 17:21:41.206269 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 17:21:41.238344 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 12 17:21:41.251461 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 12 17:21:41.259478 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 12 17:21:41.260554 systemd-journald[1409]: Time spent on flushing to /var/log/journal/3567c60b2d94452bb3f2a49f9df57005 is 45.499ms for 943 entries.
Sep 12 17:21:41.260554 systemd-journald[1409]: System Journal (/var/log/journal/3567c60b2d94452bb3f2a49f9df57005) is 11.8M, max 2.6G, 2.6G free.
Sep 12 17:21:41.359431 systemd-journald[1409]: Received client request to flush runtime journal.
Sep 12 17:21:41.359464 systemd-journald[1409]: /var/log/journal/3567c60b2d94452bb3f2a49f9df57005/system.journal: Realtime clock jumped backwards relative to last journal entry, rotating.
Sep 12 17:21:41.359478 kernel: loop0: detected capacity change from 0 to 211168
Sep 12 17:21:41.359485 systemd-journald[1409]: Rotating system journal.
Sep 12 17:21:41.271611 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 12 17:21:41.281367 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 12 17:21:41.289690 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 12 17:21:41.316760 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 12 17:21:41.346948 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:21:41.361003 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 12 17:21:41.404167 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 12 17:21:41.404681 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 12 17:21:41.405897 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 12 17:21:41.457154 kernel: loop1: detected capacity change from 0 to 119320
Sep 12 17:21:41.458399 systemd-tmpfiles[1446]: ACLs are not supported, ignoring.
Sep 12 17:21:41.458411 systemd-tmpfiles[1446]: ACLs are not supported, ignoring.
Sep 12 17:21:41.460927 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 17:21:41.468544 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 12 17:21:41.925155 kernel: loop2: detected capacity change from 0 to 100608
Sep 12 17:21:42.108361 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 12 17:21:42.115198 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 17:21:42.137809 systemd-tmpfiles[1465]: ACLs are not supported, ignoring.
Sep 12 17:21:42.138042 systemd-tmpfiles[1465]: ACLs are not supported, ignoring.
Sep 12 17:21:42.140336 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:21:42.473159 kernel: loop3: detected capacity change from 0 to 29264
Sep 12 17:21:42.829296 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 12 17:21:42.836640 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:21:42.865115 systemd-udevd[1470]: Using default interface naming scheme 'v255'.
Sep 12 17:21:42.974153 kernel: loop4: detected capacity change from 0 to 211168
Sep 12 17:21:42.990151 kernel: loop5: detected capacity change from 0 to 119320
Sep 12 17:21:43.002145 kernel: loop6: detected capacity change from 0 to 100608
Sep 12 17:21:43.016159 kernel: loop7: detected capacity change from 0 to 29264
Sep 12 17:21:43.023255 (sd-merge)[1472]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Sep 12 17:21:43.023610 (sd-merge)[1472]: Merged extensions into '/usr'.
Sep 12 17:21:43.026615 systemd[1]: Reload requested from client PID 1445 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 12 17:21:43.026631 systemd[1]: Reloading...
Sep 12 17:21:43.076171 zram_generator::config[1494]: No configuration found.
Sep 12 17:21:43.225258 systemd[1]: Reloading finished in 198 ms.
Sep 12 17:21:43.243239 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 12 17:21:43.252987 systemd[1]: Starting ensure-sysext.service...
Sep 12 17:21:43.258244 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 17:21:43.296543 systemd[1]: Reload requested from client PID 1553 ('systemctl') (unit ensure-sysext.service)...
Sep 12 17:21:43.296558 systemd[1]: Reloading...
Sep 12 17:21:43.300428 systemd-tmpfiles[1554]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 12 17:21:43.300719 systemd-tmpfiles[1554]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 12 17:21:43.301011 systemd-tmpfiles[1554]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 12 17:21:43.301272 systemd-tmpfiles[1554]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 12 17:21:43.301795 systemd-tmpfiles[1554]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 12 17:21:43.302025 systemd-tmpfiles[1554]: ACLs are not supported, ignoring.
Sep 12 17:21:43.302155 systemd-tmpfiles[1554]: ACLs are not supported, ignoring.
Sep 12 17:21:43.351162 zram_generator::config[1582]: No configuration found.
Sep 12 17:21:43.358369 systemd-tmpfiles[1554]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 17:21:43.358379 systemd-tmpfiles[1554]: Skipping /boot
Sep 12 17:21:43.364016 systemd-tmpfiles[1554]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 17:21:43.364025 systemd-tmpfiles[1554]: Skipping /boot
Sep 12 17:21:43.483918 systemd[1]: Reloading finished in 187 ms.
Sep 12 17:21:43.493034 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:21:43.508886 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 17:21:43.558947 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 12 17:21:43.573797 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 12 17:21:43.585213 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 17:21:43.592277 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 12 17:21:43.607248 systemd[1]: Finished ensure-sysext.service.
Sep 12 17:21:43.612716 systemd[1]: Expecting device dev-ptp_hyperv.device - /dev/ptp_hyperv...
Sep 12 17:21:43.618541 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 17:21:43.619366 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 17:21:43.626280 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 17:21:43.634249 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 17:21:43.643242 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 17:21:43.652345 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 17:21:43.652383 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 17:21:43.652423 systemd[1]: Reached target time-set.target - System Time Set.
Sep 12 17:21:43.663524 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:21:43.675471 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 17:21:43.675614 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 17:21:43.687041 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 17:21:43.687633 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 17:21:43.695305 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 17:21:43.695538 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 17:21:43.707680 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 17:21:43.707805 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 17:21:43.725335 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 12 17:21:43.742284 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 17:21:43.750806 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 17:21:43.750871 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 17:21:43.752865 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 12 17:21:43.779669 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 12 17:21:43.834025 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 12 17:21:43.876414 augenrules[1734]: No rules
Sep 12 17:21:43.877941 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 17:21:43.878429 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 17:21:43.903144 kernel: mousedev: PS/2 mouse device common for all mice
Sep 12 17:21:43.903203 kernel: hv_vmbus: registering driver hv_balloon
Sep 12 17:21:43.911393 systemd[1]: Condition check resulted in dev-ptp_hyperv.device - /dev/ptp_hyperv being skipped.
Sep 12 17:21:43.914159 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#53 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Sep 12 17:21:43.924491 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 12 17:21:43.952151 kernel: hv_vmbus: registering driver hyperv_fb
Sep 12 17:21:43.952207 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Sep 12 17:21:43.945302 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:21:43.961192 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Sep 12 17:21:43.961237 kernel: hv_balloon: Memory hot add disabled on ARM64
Sep 12 17:21:43.961251 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Sep 12 17:21:43.977801 kernel: Console: switching to colour dummy device 80x25
Sep 12 17:21:43.980649 kernel: Console: switching to colour frame buffer device 128x48
Sep 12 17:21:43.981483 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:21:43.981641 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:21:43.998288 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:21:44.012260 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:21:44.012452 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:21:44.020647 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:21:44.105690 systemd-resolved[1648]: Positive Trust Anchors:
Sep 12 17:21:44.105955 systemd-resolved[1648]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 17:21:44.106029 systemd-resolved[1648]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 17:21:44.160469 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Sep 12 17:21:44.176256 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 12 17:21:44.178308 systemd-resolved[1648]: Using system hostname 'ci-4426.1.0-a-1fe763f55e'.
Sep 12 17:21:44.183327 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 17:21:44.185312 systemd-networkd[1697]: lo: Link UP
Sep 12 17:21:44.185538 systemd-networkd[1697]: lo: Gained carrier
Sep 12 17:21:44.186535 systemd-networkd[1697]: Enumeration completed
Sep 12 17:21:44.189029 systemd-networkd[1697]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:21:44.189523 systemd-networkd[1697]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 17:21:44.189527 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 17:21:44.195730 systemd[1]: Reached target network.target - Network.
Sep 12 17:21:44.200377 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 17:21:44.209346 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 12 17:21:44.217317 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 12 17:21:44.259166 kernel: mlx5_core 07a0:00:02.0 enP1952s1: Link up
Sep 12 17:21:44.262922 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 12 17:21:44.270196 kernel: buffer_size[0]=0 is not enough for lossless buffer
Sep 12 17:21:44.287157 kernel: hv_netvsc 000d3afc-ed22-000d-3afc-ed22000d3afc eth0: Data path switched to VF: enP1952s1
Sep 12 17:21:44.287480 systemd-networkd[1697]: enP1952s1: Link UP
Sep 12 17:21:44.288418 systemd-networkd[1697]: eth0: Link UP
Sep 12 17:21:44.288423 systemd-networkd[1697]: eth0: Gained carrier
Sep 12 17:21:44.288442 systemd-networkd[1697]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:21:44.291750 systemd-networkd[1697]: enP1952s1: Gained carrier
Sep 12 17:21:44.294214 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 12 17:21:44.301186 systemd-networkd[1697]: eth0: DHCPv4 address 10.200.20.44/24, gateway 10.200.20.1 acquired from 168.63.129.16
Sep 12 17:21:44.338147 kernel: MACsec IEEE 802.1AE
Sep 12 17:21:45.300064 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:21:45.537278 systemd-networkd[1697]: eth0: Gained IPv6LL
Sep 12 17:21:45.542454 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 12 17:21:45.548951 systemd[1]: Reached target network-online.target - Network is Online.
Sep 12 17:21:46.169410 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 12 17:21:46.177372 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 12 17:21:49.945558 ldconfig[1439]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 12 17:21:49.966168 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 12 17:21:49.974239 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 12 17:21:50.027863 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 12 17:21:50.034643 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 12 17:21:50.041436 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 12 17:21:50.048193 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 12 17:21:50.056437 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 12 17:21:50.063310 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 12 17:21:50.071206 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 12 17:21:50.077146 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 12 17:21:50.077173 systemd[1]: Reached target paths.target - Path Units.
Sep 12 17:21:50.081828 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 17:21:50.102354 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 12 17:21:50.110334 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 12 17:21:50.117493 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 12 17:21:50.125438 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 12 17:21:50.133330 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 12 17:21:50.141528 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 12 17:21:50.159312 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 12 17:21:50.167283 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 12 17:21:50.174219 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 17:21:50.180607 systemd[1]: Reached target basic.target - Basic System.
Sep 12 17:21:50.186822 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 12 17:21:50.186841 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 12 17:21:50.217736 systemd[1]: Starting chronyd.service - NTP client/server...
Sep 12 17:21:50.230227 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 12 17:21:50.237281 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 12 17:21:50.250315 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 12 17:21:50.259328 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 12 17:21:50.275803 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 12 17:21:50.282415 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 12 17:21:50.287961 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 12 17:21:50.288945 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Sep 12 17:21:50.295699 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Sep 12 17:21:50.298148 jq[1840]: false
Sep 12 17:21:50.297866 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:21:50.306316 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 12 17:21:50.314839 KVP[1842]: KVP starting; pid is:1842
Sep 12 17:21:50.317268 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 12 17:21:50.317856 chronyd[1832]: chronyd version 4.7 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Sep 12 17:21:50.324255 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 12 17:21:50.332247 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 12 17:21:50.343066 KVP[1842]: KVP LIC Version: 3.1
Sep 12 17:21:50.344234 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 12 17:21:50.351711 kernel: hv_utils: KVP IC version 4.0
Sep 12 17:21:50.364270 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 12 17:21:50.372813 extend-filesystems[1841]: Found /dev/sda6
Sep 12 17:21:50.372981 chronyd[1832]: Timezone right/UTC failed leap second check, ignoring
Sep 12 17:21:50.373939 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 12 17:21:50.373465 chronyd[1832]: Loaded seccomp filter (level 2)
Sep 12 17:21:50.381624 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 12 17:21:50.382223 systemd[1]: Starting update-engine.service - Update Engine...
Sep 12 17:21:50.390705 extend-filesystems[1841]: Found /dev/sda9
Sep 12 17:21:50.398400 extend-filesystems[1841]: Checking size of /dev/sda9
Sep 12 17:21:50.395348 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 12 17:21:50.409386 jq[1866]: true
Sep 12 17:21:50.404531 systemd[1]: Started chronyd.service - NTP client/server.
Sep 12 17:21:50.417499 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 12 17:21:50.426510 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 12 17:21:50.429593 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 12 17:21:50.430678 systemd[1]: motdgen.service: Deactivated successfully.
Sep 12 17:21:50.432182 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 12 17:21:50.445339 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 12 17:21:50.455719 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 12 17:21:50.456041 extend-filesystems[1841]: Old size kept for /dev/sda9
Sep 12 17:21:50.456483 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 12 17:21:50.476406 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 12 17:21:50.478926 update_engine[1865]: I20250912 17:21:50.477165 1865 main.cc:92] Flatcar Update Engine starting
Sep 12 17:21:50.479544 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 12 17:21:50.498394 (ntainerd)[1886]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 12 17:21:50.517144 jq[1884]: true
Sep 12 17:21:50.526777 systemd-logind[1855]: New seat seat0.
Sep 12 17:21:50.528301 systemd-logind[1855]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Sep 12 17:21:50.528496 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 12 17:21:50.596080 tar[1878]: linux-arm64/LICENSE
Sep 12 17:21:50.596080 tar[1878]: linux-arm64/helm
Sep 12 17:21:50.746679 bash[1964]: Updated "/home/core/.ssh/authorized_keys"
Sep 12 17:21:50.749423 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 12 17:21:50.762165 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Sep 12 17:21:50.797617 dbus-daemon[1835]: [system] SELinux support is enabled
Sep 12 17:21:50.797771 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 12 17:21:50.806773 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 12 17:21:50.807355 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 12 17:21:50.814832 update_engine[1865]: I20250912 17:21:50.814626 1865 update_check_scheduler.cc:74] Next update check in 5m29s
Sep 12 17:21:50.818267 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 12 17:21:50.818364 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 12 17:21:50.830981 systemd[1]: Started update-engine.service - Update Engine.
Sep 12 17:21:50.831217 dbus-daemon[1835]: [system] Successfully activated service 'org.freedesktop.systemd1'
Sep 12 17:21:50.851012 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 12 17:21:50.894181 coreos-metadata[1834]: Sep 12 17:21:50.894 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Sep 12 17:21:50.899241 coreos-metadata[1834]: Sep 12 17:21:50.899 INFO Fetch successful
Sep 12 17:21:50.899326 coreos-metadata[1834]: Sep 12 17:21:50.899 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Sep 12 17:21:50.904111 coreos-metadata[1834]: Sep 12 17:21:50.904 INFO Fetch successful
Sep 12 17:21:50.906160 coreos-metadata[1834]: Sep 12 17:21:50.906 INFO Fetching http://168.63.129.16/machine/a7685dec-0980-46b1-8bd1-35cc48ad6aa6/ea386357%2D4c2d%2D4874%2Dac68%2D93b8226ebd1a.%5Fci%2D4426.1.0%2Da%2D1fe763f55e?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Sep 12 17:21:50.907503 coreos-metadata[1834]: Sep 12 17:21:50.907 INFO Fetch successful
Sep 12 17:21:50.907503 coreos-metadata[1834]: Sep 12 17:21:50.907 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Sep 12 17:21:50.915965 coreos-metadata[1834]: Sep 12 17:21:50.915 INFO Fetch successful
Sep 12 17:21:50.961085 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Sep 12 17:21:50.971124 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 12 17:21:51.076574 sshd_keygen[1869]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 12 17:21:51.092155 tar[1878]: linux-arm64/README.md
Sep 12 17:21:51.096106 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 12 17:21:51.110140 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 12 17:21:51.117397 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Sep 12 17:21:51.128294 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 12 17:21:51.138741 locksmithd[1978]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 12 17:21:51.139116 systemd[1]: issuegen.service: Deactivated successfully.
Sep 12 17:21:51.139924 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 12 17:21:51.152427 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 12 17:21:51.169829 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Sep 12 17:21:51.186468 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 12 17:21:51.195140 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 12 17:21:51.203461 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Sep 12 17:21:51.235202 systemd[1]: Reached target getty.target - Login Prompts.
Sep 12 17:21:51.302869 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:21:51.309297 (kubelet)[2024]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 17:21:51.337510 containerd[1886]: time="2025-09-12T17:21:51Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 12 17:21:51.339283 containerd[1886]: time="2025-09-12T17:21:51.339031512Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 12 17:21:51.347535 containerd[1886]: time="2025-09-12T17:21:51.347498528Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.808µs"
Sep 12 17:21:51.348143 containerd[1886]: time="2025-09-12T17:21:51.347988984Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 12 17:21:51.348143 containerd[1886]: time="2025-09-12T17:21:51.348024496Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 12 17:21:51.348280 containerd[1886]: time="2025-09-12T17:21:51.348261256Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 12 17:21:51.348333 containerd[1886]: time="2025-09-12T17:21:51.348320888Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 12 17:21:51.348398 containerd[1886]: time="2025-09-12T17:21:51.348386904Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 12 17:21:51.348528 containerd[1886]: time="2025-09-12T17:21:51.348510616Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 12 17:21:51.348576 containerd[1886]: time="2025-09-12T17:21:51.348563840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 12 17:21:51.348814 containerd[1886]: time="2025-09-12T17:21:51.348792520Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 12 17:21:51.349150 containerd[1886]: time="2025-09-12T17:21:51.348856928Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 12 17:21:51.349150 containerd[1886]: time="2025-09-12T17:21:51.348872400Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 12 17:21:51.349150 containerd[1886]: time="2025-09-12T17:21:51.348879160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 12 17:21:51.349150 containerd[1886]: time="2025-09-12T17:21:51.348949312Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 12 17:21:51.349287 containerd[1886]: time="2025-09-12T17:21:51.349268480Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 12 17:21:51.349358 containerd[1886]: time="2025-09-12T17:21:51.349344440Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 12 17:21:51.349413 containerd[1886]: time="2025-09-12T17:21:51.349401720Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 12 17:21:51.349493 containerd[1886]: time="2025-09-12T17:21:51.349484320Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 12 17:21:51.349729 containerd[1886]: time="2025-09-12T17:21:51.349710888Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 12 17:21:51.349858 containerd[1886]: time="2025-09-12T17:21:51.349843160Z" level=info msg="metadata content store policy set" policy=shared
Sep 12 17:21:51.369316 containerd[1886]: time="2025-09-12T17:21:51.369289864Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 12 17:21:51.369462 containerd[1886]: time="2025-09-12T17:21:51.369422864Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 12 17:21:51.369462 containerd[1886]: time="2025-09-12T17:21:51.369444680Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 12 17:21:51.369665 containerd[1886]: time="2025-09-12T17:21:51.369575248Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 12 17:21:51.369665 containerd[1886]: time="2025-09-12T17:21:51.369608368Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 12 17:21:51.369665 containerd[1886]: time="2025-09-12T17:21:51.369618816Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 12 17:21:51.369665 containerd[1886]: time="2025-09-12T17:21:51.369628464Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 12 17:21:51.369665 containerd[1886]: time="2025-09-12T17:21:51.369636080Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 12 17:21:51.369665 containerd[1886]: time="2025-09-12T17:21:51.369644088Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 12 17:21:51.369665 containerd[1886]: time="2025-09-12T17:21:51.369650560Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 12 17:21:51.369665 containerd[1886]: time="2025-09-12T17:21:51.369656200Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 12 17:21:51.369865 containerd[1886]: time="2025-09-12T17:21:51.369797824Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 12 17:21:51.370059 containerd[1886]: time="2025-09-12T17:21:51.369995688Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 12 17:21:51.370059 containerd[1886]: time="2025-09-12T17:21:51.370017448Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 12 17:21:51.370249 containerd[1886]: time="2025-09-12T17:21:51.370049464Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 12 17:21:51.370249 containerd[1886]: time="2025-09-12T17:21:51.370190704Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 12 17:21:51.370249 containerd[1886]: time="2025-09-12T17:21:51.370205864Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 12 17:21:51.370249 containerd[1886]: time="2025-09-12T17:21:51.370216152Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 12 17:21:51.370249 containerd[1886]: time="2025-09-12T17:21:51.370224280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 12 17:21:51.370249 containerd[1886]: time="2025-09-12T17:21:51.370231184Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 12 17:21:51.370249 containerd[1886]: time="2025-09-12T17:21:51.370239080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 12 17:21:51.370461 containerd[1886]: time="2025-09-12T17:21:51.370393568Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 12 17:21:51.370461 containerd[1886]: time="2025-09-12T17:21:51.370410400Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 12 17:21:51.370573 containerd[1886]: time="2025-09-12T17:21:51.370528088Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 12 17:21:51.370573 containerd[1886]: time="2025-09-12T17:21:51.370547752Z" level=info msg="Start snapshots syncer"
Sep 12 17:21:51.370653 containerd[1886]: time="2025-09-12T17:21:51.370640456Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 12 17:21:51.370951 containerd[1886]: time="2025-09-12T17:21:51.370912128Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 12 17:21:51.371143 containerd[1886]: time="2025-09-12T17:21:51.370999080Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 12 17:21:51.371234 containerd[1886]: time="2025-09-12T17:21:51.371207552Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 12 17:21:51.371505 containerd[1886]: time="2025-09-12T17:21:51.371419656Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 12 17:21:51.371505 containerd[1886]: time="2025-09-12T17:21:51.371451392Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 12 17:21:51.371505 containerd[1886]: time="2025-09-12T17:21:51.371459040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 12 17:21:51.371505 containerd[1886]: time="2025-09-12T17:21:51.371467552Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 12 17:21:51.371505 containerd[1886]: time="2025-09-12T17:21:51.371475392Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 12 17:21:51.371505 containerd[1886]: time="2025-09-12T17:21:51.371483368Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 12 17:21:51.371711 containerd[1886]: time="2025-09-12T17:21:51.371498064Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 12 17:21:51.371711 containerd[1886]: time="2025-09-12T17:21:51.371684080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 12 17:21:51.371711 containerd[1886]: time="2025-09-12T17:21:51.371693984Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 12 17:21:51.371806 containerd[1886]: time="2025-09-12T17:21:51.371701352Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 12 17:21:51.371926 containerd[1886]: time="2025-09-12T17:21:51.371885656Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 12 17:21:51.371926 containerd[1886]: time="2025-09-12T17:21:51.371903080Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 12 17:21:51.371926 containerd[1886]: time="2025-09-12T17:21:51.371909080Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 12 17:21:51.371926 containerd[1886]: time="2025-09-12T17:21:51.371915176Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 12 17:21:51.372079 containerd[1886]: time="2025-09-12T17:21:51.371919968Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 12 17:21:51.372079 containerd[1886]: time="2025-09-12T17:21:51.372016840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 12 17:21:51.372079 containerd[1886]: time="2025-09-12T17:21:51.372028520Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 12 17:21:51.372079 containerd[1886]: time="2025-09-12T17:21:51.372044144Z" level=info msg="runtime interface created"
Sep 12 17:21:51.372079 containerd[1886]: time="2025-09-12T17:21:51.372047968Z" level=info msg="created NRI interface"
Sep 12 17:21:51.372079 containerd[1886]: time="2025-09-12T17:21:51.372053216Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 12 17:21:51.372079 containerd[1886]: time="2025-09-12T17:21:51.372061856Z" level=info msg="Connect containerd service"
Sep 12 17:21:51.372295 containerd[1886]: time="2025-09-12T17:21:51.372237920Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 12 17:21:51.373049 containerd[1886]: time="2025-09-12T17:21:51.372999552Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 12 17:21:51.667021 kubelet[2024]: E0912 17:21:51.666904 2024 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 17:21:51.669192 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 17:21:51.669307 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 17:21:51.669613 systemd[1]: kubelet.service: Consumed 556ms CPU time, 258.9M memory peak.
Sep 12 17:21:51.751149 containerd[1886]: time="2025-09-12T17:21:51.751003768Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 12 17:21:51.751149 containerd[1886]: time="2025-09-12T17:21:51.751058048Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 12 17:21:51.751149 containerd[1886]: time="2025-09-12T17:21:51.751080376Z" level=info msg="Start subscribing containerd event"
Sep 12 17:21:51.751149 containerd[1886]: time="2025-09-12T17:21:51.751114376Z" level=info msg="Start recovering state"
Sep 12 17:21:51.752375 containerd[1886]: time="2025-09-12T17:21:51.752217152Z" level=info msg="Start event monitor"
Sep 12 17:21:51.752375 containerd[1886]: time="2025-09-12T17:21:51.752244288Z" level=info msg="Start cni network conf syncer for default"
Sep 12 17:21:51.752375 containerd[1886]: time="2025-09-12T17:21:51.752250784Z" level=info msg="Start streaming server"
Sep 12 17:21:51.752375 containerd[1886]: time="2025-09-12T17:21:51.752257000Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 12 17:21:51.752375 containerd[1886]: time="2025-09-12T17:21:51.752262080Z" level=info msg="runtime interface starting up..."
Sep 12 17:21:51.752375 containerd[1886]: time="2025-09-12T17:21:51.752265800Z" level=info msg="starting plugins..."
Sep 12 17:21:51.752375 containerd[1886]: time="2025-09-12T17:21:51.752277600Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 12 17:21:51.752495 containerd[1886]: time="2025-09-12T17:21:51.752392616Z" level=info msg="containerd successfully booted in 0.415266s"
Sep 12 17:21:51.752626 systemd[1]: Started containerd.service - containerd container runtime.
Sep 12 17:21:51.759664 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 12 17:21:51.765577 systemd[1]: Startup finished in 1.672s (kernel) + 15.081s (initrd) + 17.281s (userspace) = 34.035s.
Sep 12 17:21:52.378037 login[2017]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying
Sep 12 17:21:52.407358 login[2018]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:21:52.415029 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 12 17:21:52.416303 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 17:21:52.418540 systemd-logind[1855]: New session 1 of user core. Sep 12 17:21:52.446114 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 17:21:52.448050 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 17:21:52.470751 (systemd)[2051]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 17:21:52.472559 systemd-logind[1855]: New session c1 of user core. Sep 12 17:21:52.725707 systemd[2051]: Queued start job for default target default.target. Sep 12 17:21:52.732795 systemd[2051]: Created slice app.slice - User Application Slice. Sep 12 17:21:52.732817 systemd[2051]: Reached target paths.target - Paths. Sep 12 17:21:52.732927 systemd[2051]: Reached target timers.target - Timers. Sep 12 17:21:52.733883 systemd[2051]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 17:21:52.743069 systemd[2051]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 17:21:52.743216 systemd[2051]: Reached target sockets.target - Sockets. Sep 12 17:21:52.743317 systemd[2051]: Reached target basic.target - Basic System. Sep 12 17:21:52.743401 systemd[2051]: Reached target default.target - Main User Target. Sep 12 17:21:52.743435 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 17:21:52.743507 systemd[2051]: Startup finished in 266ms. Sep 12 17:21:52.746103 systemd[1]: Started session-1.scope - Session 1 of User core. 
Sep 12 17:21:53.263129 waagent[2015]: 2025-09-12T17:21:53.263064Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 Sep 12 17:21:53.269298 waagent[2015]: 2025-09-12T17:21:53.269260Z INFO Daemon Daemon OS: flatcar 4426.1.0 Sep 12 17:21:53.273723 waagent[2015]: 2025-09-12T17:21:53.273697Z INFO Daemon Daemon Python: 3.11.13 Sep 12 17:21:53.278041 waagent[2015]: 2025-09-12T17:21:53.277980Z INFO Daemon Daemon Run daemon Sep 12 17:21:53.282111 waagent[2015]: 2025-09-12T17:21:53.282082Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4426.1.0' Sep 12 17:21:53.290660 waagent[2015]: 2025-09-12T17:21:53.290619Z INFO Daemon Daemon Using waagent for provisioning Sep 12 17:21:53.295803 waagent[2015]: 2025-09-12T17:21:53.295772Z INFO Daemon Daemon Activate resource disk Sep 12 17:21:53.299742 waagent[2015]: 2025-09-12T17:21:53.299714Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Sep 12 17:21:53.308227 waagent[2015]: 2025-09-12T17:21:53.308192Z INFO Daemon Daemon Found device: None Sep 12 17:21:53.312521 waagent[2015]: 2025-09-12T17:21:53.312496Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Sep 12 17:21:53.319524 waagent[2015]: 2025-09-12T17:21:53.319501Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Sep 12 17:21:53.329661 waagent[2015]: 2025-09-12T17:21:53.329625Z INFO Daemon Daemon Clean protocol and wireserver endpoint Sep 12 17:21:53.334668 waagent[2015]: 2025-09-12T17:21:53.334640Z INFO Daemon Daemon Running default provisioning handler Sep 12 17:21:53.344621 waagent[2015]: 2025-09-12T17:21:53.344582Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. 
Sep 12 17:21:53.356808 waagent[2015]: 2025-09-12T17:21:53.356773Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Sep 12 17:21:53.366655 waagent[2015]: 2025-09-12T17:21:53.366625Z INFO Daemon Daemon cloud-init is enabled: False Sep 12 17:21:53.371652 waagent[2015]: 2025-09-12T17:21:53.371622Z INFO Daemon Daemon Copying ovf-env.xml Sep 12 17:21:53.378347 login[2017]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:21:53.383360 systemd-logind[1855]: New session 2 of user core. Sep 12 17:21:53.392226 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 17:21:53.472404 waagent[2015]: 2025-09-12T17:21:53.472308Z INFO Daemon Daemon Successfully mounted dvd Sep 12 17:21:53.498414 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Sep 12 17:21:53.500533 waagent[2015]: 2025-09-12T17:21:53.500485Z INFO Daemon Daemon Detect protocol endpoint Sep 12 17:21:53.505396 waagent[2015]: 2025-09-12T17:21:53.505363Z INFO Daemon Daemon Clean protocol and wireserver endpoint Sep 12 17:21:53.511337 waagent[2015]: 2025-09-12T17:21:53.511307Z INFO Daemon Daemon WireServer endpoint is not found. 
Rerun dhcp handler Sep 12 17:21:53.518070 waagent[2015]: 2025-09-12T17:21:53.518012Z INFO Daemon Daemon Test for route to 168.63.129.16 Sep 12 17:21:53.523850 waagent[2015]: 2025-09-12T17:21:53.523819Z INFO Daemon Daemon Route to 168.63.129.16 exists Sep 12 17:21:53.529075 waagent[2015]: 2025-09-12T17:21:53.529047Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Sep 12 17:21:53.604449 waagent[2015]: 2025-09-12T17:21:53.604340Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Sep 12 17:21:53.609980 waagent[2015]: 2025-09-12T17:21:53.609961Z INFO Daemon Daemon Wire protocol version:2012-11-30 Sep 12 17:21:53.615106 waagent[2015]: 2025-09-12T17:21:53.615085Z INFO Daemon Daemon Server preferred version:2015-04-05 Sep 12 17:21:53.798227 waagent[2015]: 2025-09-12T17:21:53.797881Z INFO Daemon Daemon Initializing goal state during protocol detection Sep 12 17:21:53.804036 waagent[2015]: 2025-09-12T17:21:53.803992Z INFO Daemon Daemon Forcing an update of the goal state. Sep 12 17:21:53.811756 waagent[2015]: 2025-09-12T17:21:53.811721Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Sep 12 17:21:53.828681 waagent[2015]: 2025-09-12T17:21:53.828650Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.175 Sep 12 17:21:53.833961 waagent[2015]: 2025-09-12T17:21:53.833929Z INFO Daemon Sep 12 17:21:53.836561 waagent[2015]: 2025-09-12T17:21:53.836534Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: da88d66b-43d7-44c4-9cc2-101e051c5027 eTag: 8031359581084903470 source: Fabric] Sep 12 17:21:53.847437 waagent[2015]: 2025-09-12T17:21:53.847407Z INFO Daemon The vmSettings originated via Fabric; will ignore them. 
Sep 12 17:21:53.853999 waagent[2015]: 2025-09-12T17:21:53.853971Z INFO Daemon Sep 12 17:21:53.856723 waagent[2015]: 2025-09-12T17:21:53.856699Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Sep 12 17:21:53.865807 waagent[2015]: 2025-09-12T17:21:53.865782Z INFO Daemon Daemon Downloading artifacts profile blob Sep 12 17:21:53.919854 waagent[2015]: 2025-09-12T17:21:53.919804Z INFO Daemon Downloaded certificate {'thumbprint': '0DFE3690C1FA56C0F145598A3376978206F64E9E', 'hasPrivateKey': True} Sep 12 17:21:53.929159 waagent[2015]: 2025-09-12T17:21:53.929112Z INFO Daemon Fetch goal state completed Sep 12 17:21:53.939205 waagent[2015]: 2025-09-12T17:21:53.939179Z INFO Daemon Daemon Starting provisioning Sep 12 17:21:53.944248 waagent[2015]: 2025-09-12T17:21:53.944216Z INFO Daemon Daemon Handle ovf-env.xml. Sep 12 17:21:53.948881 waagent[2015]: 2025-09-12T17:21:53.948855Z INFO Daemon Daemon Set hostname [ci-4426.1.0-a-1fe763f55e] Sep 12 17:21:53.983243 waagent[2015]: 2025-09-12T17:21:53.983205Z INFO Daemon Daemon Publish hostname [ci-4426.1.0-a-1fe763f55e] Sep 12 17:21:53.989678 waagent[2015]: 2025-09-12T17:21:53.989644Z INFO Daemon Daemon Examine /proc/net/route for primary interface Sep 12 17:21:53.996013 waagent[2015]: 2025-09-12T17:21:53.995983Z INFO Daemon Daemon Primary interface is [eth0] Sep 12 17:21:54.020448 systemd-networkd[1697]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:21:54.020457 systemd-networkd[1697]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Sep 12 17:21:54.020502 systemd-networkd[1697]: eth0: DHCP lease lost Sep 12 17:21:54.022151 waagent[2015]: 2025-09-12T17:21:54.021284Z INFO Daemon Daemon Create user account if not exists Sep 12 17:21:54.026921 waagent[2015]: 2025-09-12T17:21:54.026879Z INFO Daemon Daemon User core already exists, skip useradd Sep 12 17:21:54.032425 waagent[2015]: 2025-09-12T17:21:54.032395Z INFO Daemon Daemon Configure sudoer Sep 12 17:21:54.046973 waagent[2015]: 2025-09-12T17:21:54.046930Z INFO Daemon Daemon Configure sshd Sep 12 17:21:54.051180 systemd-networkd[1697]: eth0: DHCPv4 address 10.200.20.44/24, gateway 10.200.20.1 acquired from 168.63.129.16 Sep 12 17:21:54.055281 waagent[2015]: 2025-09-12T17:21:54.055238Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Sep 12 17:21:54.068792 waagent[2015]: 2025-09-12T17:21:54.068761Z INFO Daemon Daemon Deploy ssh public key. Sep 12 17:21:55.201230 waagent[2015]: 2025-09-12T17:21:55.201184Z INFO Daemon Daemon Provisioning complete Sep 12 17:21:55.214528 waagent[2015]: 2025-09-12T17:21:55.214495Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Sep 12 17:21:55.221153 waagent[2015]: 2025-09-12T17:21:55.221115Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. 
Sep 12 17:21:55.230731 waagent[2015]: 2025-09-12T17:21:55.230707Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent Sep 12 17:21:55.326855 waagent[2103]: 2025-09-12T17:21:55.326796Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) Sep 12 17:21:55.328277 waagent[2103]: 2025-09-12T17:21:55.327297Z INFO ExtHandler ExtHandler OS: flatcar 4426.1.0 Sep 12 17:21:55.328277 waagent[2103]: 2025-09-12T17:21:55.327356Z INFO ExtHandler ExtHandler Python: 3.11.13 Sep 12 17:21:55.328277 waagent[2103]: 2025-09-12T17:21:55.327393Z INFO ExtHandler ExtHandler CPU Arch: aarch64 Sep 12 17:21:55.389690 waagent[2103]: 2025-09-12T17:21:55.389646Z INFO ExtHandler ExtHandler Distro: flatcar-4426.1.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.13; Arch: aarch64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; Sep 12 17:21:55.389902 waagent[2103]: 2025-09-12T17:21:55.389874Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Sep 12 17:21:55.390031 waagent[2103]: 2025-09-12T17:21:55.390004Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Sep 12 17:21:55.394951 waagent[2103]: 2025-09-12T17:21:55.394904Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Sep 12 17:21:55.399277 waagent[2103]: 2025-09-12T17:21:55.399247Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.175 Sep 12 17:21:55.399718 waagent[2103]: 2025-09-12T17:21:55.399684Z INFO ExtHandler Sep 12 17:21:55.399828 waagent[2103]: 2025-09-12T17:21:55.399808Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 0c0913d6-3953-477a-88ad-1a06a739076f eTag: 8031359581084903470 source: Fabric] Sep 12 17:21:55.400172 waagent[2103]: 2025-09-12T17:21:55.400116Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. 
Sep 12 17:21:55.400684 waagent[2103]: 2025-09-12T17:21:55.400650Z INFO ExtHandler Sep 12 17:21:55.400793 waagent[2103]: 2025-09-12T17:21:55.400770Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Sep 12 17:21:55.403520 waagent[2103]: 2025-09-12T17:21:55.403493Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Sep 12 17:21:55.452271 waagent[2103]: 2025-09-12T17:21:55.452197Z INFO ExtHandler Downloaded certificate {'thumbprint': '0DFE3690C1FA56C0F145598A3376978206F64E9E', 'hasPrivateKey': True} Sep 12 17:21:55.452700 waagent[2103]: 2025-09-12T17:21:55.452669Z INFO ExtHandler Fetch goal state completed Sep 12 17:21:55.463350 waagent[2103]: 2025-09-12T17:21:55.463318Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.4.1 11 Feb 2025 (Library: OpenSSL 3.4.1 11 Feb 2025) Sep 12 17:21:55.466539 waagent[2103]: 2025-09-12T17:21:55.466502Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 2103 Sep 12 17:21:55.466706 waagent[2103]: 2025-09-12T17:21:55.466681Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Sep 12 17:21:55.467027 waagent[2103]: 2025-09-12T17:21:55.466999Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** Sep 12 17:21:55.468158 waagent[2103]: 2025-09-12T17:21:55.468109Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4426.1.0', '', 'Flatcar Container Linux by Kinvolk'] Sep 12 17:21:55.468545 waagent[2103]: 2025-09-12T17:21:55.468514Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4426.1.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Sep 12 17:21:55.468730 waagent[2103]: 2025-09-12T17:21:55.468702Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Sep 12 17:21:55.469262 waagent[2103]: 2025-09-12T17:21:55.469231Z INFO ExtHandler ExtHandler 
Starting setup for Persistent firewall rules Sep 12 17:21:55.528376 waagent[2103]: 2025-09-12T17:21:55.528011Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Sep 12 17:21:55.528376 waagent[2103]: 2025-09-12T17:21:55.528194Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Sep 12 17:21:55.532519 waagent[2103]: 2025-09-12T17:21:55.532499Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Sep 12 17:21:55.536912 systemd[1]: Reload requested from client PID 2118 ('systemctl') (unit waagent.service)... Sep 12 17:21:55.536925 systemd[1]: Reloading... Sep 12 17:21:55.610258 zram_generator::config[2159]: No configuration found. Sep 12 17:21:55.753256 systemd[1]: Reloading finished in 216 ms. Sep 12 17:21:55.775152 waagent[2103]: 2025-09-12T17:21:55.774905Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Sep 12 17:21:55.775152 waagent[2103]: 2025-09-12T17:21:55.775035Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Sep 12 17:22:01.919942 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 17:22:01.921234 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:22:07.502092 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 12 17:22:07.504847 (kubelet)[2226]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:22:07.534186 kubelet[2226]: E0912 17:22:07.534115 2226 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:22:07.536881 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:22:07.537080 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:22:07.539192 systemd[1]: kubelet.service: Consumed 111ms CPU time, 108M memory peak. Sep 12 17:22:08.100003 waagent[2103]: 2025-09-12T17:22:08.099923Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. Sep 12 17:22:08.100330 waagent[2103]: 2025-09-12T17:22:08.100266Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Sep 12 17:22:08.100932 waagent[2103]: 2025-09-12T17:22:08.100892Z INFO ExtHandler ExtHandler Starting env monitor service. Sep 12 17:22:08.101256 waagent[2103]: 2025-09-12T17:22:08.101180Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. 
Sep 12 17:22:08.102016 waagent[2103]: 2025-09-12T17:22:08.101428Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Sep 12 17:22:08.102016 waagent[2103]: 2025-09-12T17:22:08.101501Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Sep 12 17:22:08.102016 waagent[2103]: 2025-09-12T17:22:08.101664Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Sep 12 17:22:08.102016 waagent[2103]: 2025-09-12T17:22:08.101794Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Sep 12 17:22:08.102016 waagent[2103]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Sep 12 17:22:08.102016 waagent[2103]: eth0 00000000 0114C80A 0003 0 0 1024 00000000 0 0 0 Sep 12 17:22:08.102016 waagent[2103]: eth0 0014C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Sep 12 17:22:08.102016 waagent[2103]: eth0 0114C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Sep 12 17:22:08.102016 waagent[2103]: eth0 10813FA8 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Sep 12 17:22:08.102016 waagent[2103]: eth0 FEA9FEA9 0114C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Sep 12 17:22:08.102324 waagent[2103]: 2025-09-12T17:22:08.102286Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Sep 12 17:22:08.102365 waagent[2103]: 2025-09-12T17:22:08.102332Z INFO ExtHandler ExtHandler Start Extension Telemetry service. Sep 12 17:22:08.102640 waagent[2103]: 2025-09-12T17:22:08.102607Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Sep 12 17:22:08.102752 waagent[2103]: 2025-09-12T17:22:08.102714Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. 
Sep 12 17:22:08.103228 waagent[2103]: 2025-09-12T17:22:08.103197Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Sep 12 17:22:08.103326 waagent[2103]: 2025-09-12T17:22:08.103302Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Sep 12 17:22:08.104007 waagent[2103]: 2025-09-12T17:22:08.103986Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Sep 12 17:22:08.104288 waagent[2103]: 2025-09-12T17:22:08.104258Z INFO EnvHandler ExtHandler Configure routes Sep 12 17:22:08.104419 waagent[2103]: 2025-09-12T17:22:08.104395Z INFO EnvHandler ExtHandler Gateway:None Sep 12 17:22:08.104524 waagent[2103]: 2025-09-12T17:22:08.104503Z INFO EnvHandler ExtHandler Routes:None Sep 12 17:22:08.110371 waagent[2103]: 2025-09-12T17:22:08.110335Z INFO ExtHandler ExtHandler Sep 12 17:22:08.110430 waagent[2103]: 2025-09-12T17:22:08.110399Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: eaf313c6-cd04-46fe-a59b-953ba2976931 correlation 1176eafc-f8fe-45b8-94c2-ff6bb1969907 created: 2025-09-12T17:20:37.615913Z] Sep 12 17:22:08.110673 waagent[2103]: 2025-09-12T17:22:08.110641Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. Sep 12 17:22:08.111054 waagent[2103]: 2025-09-12T17:22:08.111029Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms] Sep 12 17:22:08.134865 waagent[2103]: 2025-09-12T17:22:08.134516Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command Sep 12 17:22:08.134865 waagent[2103]: Try `iptables -h' or 'iptables --help' for more information.) 
Sep 12 17:22:08.134865 waagent[2103]: 2025-09-12T17:22:08.134799Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 0AA087A0-292A-4D57-8C8B-37A708184DE9;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;] Sep 12 17:22:08.179924 waagent[2103]: 2025-09-12T17:22:08.179886Z INFO MonitorHandler ExtHandler Network interfaces: Sep 12 17:22:08.179924 waagent[2103]: Executing ['ip', '-a', '-o', 'link']: Sep 12 17:22:08.179924 waagent[2103]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Sep 12 17:22:08.179924 waagent[2103]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:fc:ed:22 brd ff:ff:ff:ff:ff:ff Sep 12 17:22:08.179924 waagent[2103]: 3: enP1952s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:fc:ed:22 brd ff:ff:ff:ff:ff:ff\ altname enP1952p0s2 Sep 12 17:22:08.179924 waagent[2103]: Executing ['ip', '-4', '-a', '-o', 'address']: Sep 12 17:22:08.179924 waagent[2103]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Sep 12 17:22:08.179924 waagent[2103]: 2: eth0 inet 10.200.20.44/24 metric 1024 brd 10.200.20.255 scope global eth0\ valid_lft forever preferred_lft forever Sep 12 17:22:08.179924 waagent[2103]: Executing ['ip', '-6', '-a', '-o', 'address']: Sep 12 17:22:08.179924 waagent[2103]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Sep 12 17:22:08.179924 waagent[2103]: 2: eth0 inet6 fe80::20d:3aff:fefc:ed22/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Sep 12 17:22:08.242345 waagent[2103]: 2025-09-12T17:22:08.242298Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Sep 12 17:22:08.242345 waagent[2103]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Sep 12 
17:22:08.242345 waagent[2103]: pkts bytes target prot opt in out source destination Sep 12 17:22:08.242345 waagent[2103]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Sep 12 17:22:08.242345 waagent[2103]: pkts bytes target prot opt in out source destination Sep 12 17:22:08.242345 waagent[2103]: Chain OUTPUT (policy ACCEPT 4 packets, 416 bytes) Sep 12 17:22:08.242345 waagent[2103]: pkts bytes target prot opt in out source destination Sep 12 17:22:08.242345 waagent[2103]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Sep 12 17:22:08.242345 waagent[2103]: 8 564 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Sep 12 17:22:08.242345 waagent[2103]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Sep 12 17:22:08.244772 waagent[2103]: 2025-09-12T17:22:08.244728Z INFO EnvHandler ExtHandler Current Firewall rules: Sep 12 17:22:08.244772 waagent[2103]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Sep 12 17:22:08.244772 waagent[2103]: pkts bytes target prot opt in out source destination Sep 12 17:22:08.244772 waagent[2103]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Sep 12 17:22:08.244772 waagent[2103]: pkts bytes target prot opt in out source destination Sep 12 17:22:08.244772 waagent[2103]: Chain OUTPUT (policy ACCEPT 4 packets, 416 bytes) Sep 12 17:22:08.244772 waagent[2103]: pkts bytes target prot opt in out source destination Sep 12 17:22:08.244772 waagent[2103]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Sep 12 17:22:08.244772 waagent[2103]: 10 1047 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Sep 12 17:22:08.244772 waagent[2103]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Sep 12 17:22:08.244952 waagent[2103]: 2025-09-12T17:22:08.244926Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Sep 12 17:22:14.170208 chronyd[1832]: Selected source PHC0 Sep 12 17:22:17.626172 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Sep 12 17:22:17.628301 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:22:17.718257 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:22:17.718931 (kubelet)[2272]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:22:17.741464 kubelet[2272]: E0912 17:22:17.741412 2272 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:22:17.744104 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:22:17.744329 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:22:17.746314 systemd[1]: kubelet.service: Consumed 100ms CPU time, 104.6M memory peak. Sep 12 17:22:24.498037 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 17:22:24.498946 systemd[1]: Started sshd@0-10.200.20.44:22-10.200.16.10:57066.service - OpenSSH per-connection server daemon (10.200.16.10:57066). Sep 12 17:22:25.143877 sshd[2279]: Accepted publickey for core from 10.200.16.10 port 57066 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E Sep 12 17:22:25.144950 sshd-session[2279]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:22:25.149258 systemd-logind[1855]: New session 3 of user core. Sep 12 17:22:25.155384 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 17:22:25.539307 systemd[1]: Started sshd@1-10.200.20.44:22-10.200.16.10:57080.service - OpenSSH per-connection server daemon (10.200.16.10:57080). 
Sep 12 17:22:25.989386 sshd[2285]: Accepted publickey for core from 10.200.16.10 port 57080 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E Sep 12 17:22:25.990437 sshd-session[2285]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:22:25.994170 systemd-logind[1855]: New session 4 of user core. Sep 12 17:22:26.003238 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 17:22:26.327233 sshd[2288]: Connection closed by 10.200.16.10 port 57080 Sep 12 17:22:26.327834 sshd-session[2285]: pam_unix(sshd:session): session closed for user core Sep 12 17:22:26.331171 systemd-logind[1855]: Session 4 logged out. Waiting for processes to exit. Sep 12 17:22:26.331406 systemd[1]: sshd@1-10.200.20.44:22-10.200.16.10:57080.service: Deactivated successfully. Sep 12 17:22:26.332630 systemd[1]: session-4.scope: Deactivated successfully. Sep 12 17:22:26.334415 systemd-logind[1855]: Removed session 4. Sep 12 17:22:26.442444 systemd[1]: Started sshd@2-10.200.20.44:22-10.200.16.10:57082.service - OpenSSH per-connection server daemon (10.200.16.10:57082). Sep 12 17:22:26.899572 sshd[2294]: Accepted publickey for core from 10.200.16.10 port 57082 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E Sep 12 17:22:26.900655 sshd-session[2294]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:22:26.904078 systemd-logind[1855]: New session 5 of user core. Sep 12 17:22:26.912391 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 17:22:27.241493 sshd[2297]: Connection closed by 10.200.16.10 port 57082 Sep 12 17:22:27.241984 sshd-session[2294]: pam_unix(sshd:session): session closed for user core Sep 12 17:22:27.245113 systemd[1]: sshd@2-10.200.20.44:22-10.200.16.10:57082.service: Deactivated successfully. Sep 12 17:22:27.246397 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 17:22:27.246929 systemd-logind[1855]: Session 5 logged out. 
Waiting for processes to exit. Sep 12 17:22:27.247945 systemd-logind[1855]: Removed session 5. Sep 12 17:22:27.312380 systemd[1]: Started sshd@3-10.200.20.44:22-10.200.16.10:57096.service - OpenSSH per-connection server daemon (10.200.16.10:57096). Sep 12 17:22:27.722382 sshd[2303]: Accepted publickey for core from 10.200.16.10 port 57096 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E Sep 12 17:22:27.723433 sshd-session[2303]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:22:27.726835 systemd-logind[1855]: New session 6 of user core. Sep 12 17:22:27.741236 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 12 17:22:27.744948 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 12 17:22:27.746056 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:22:27.838584 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:22:27.843370 (kubelet)[2315]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:22:27.953206 kubelet[2315]: E0912 17:22:27.952602 2315 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:22:27.955397 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:22:27.955896 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:22:27.956404 systemd[1]: kubelet.service: Consumed 103ms CPU time, 107.1M memory peak. 
Sep 12 17:22:28.028536 sshd[2306]: Connection closed by 10.200.16.10 port 57096
Sep 12 17:22:28.028399 sshd-session[2303]: pam_unix(sshd:session): session closed for user core
Sep 12 17:22:28.031146 systemd[1]: sshd@3-10.200.20.44:22-10.200.16.10:57096.service: Deactivated successfully.
Sep 12 17:22:28.032390 systemd[1]: session-6.scope: Deactivated successfully.
Sep 12 17:22:28.034668 systemd-logind[1855]: Session 6 logged out. Waiting for processes to exit.
Sep 12 17:22:28.035867 systemd-logind[1855]: Removed session 6.
Sep 12 17:22:28.108909 systemd[1]: Started sshd@4-10.200.20.44:22-10.200.16.10:57112.service - OpenSSH per-connection server daemon (10.200.16.10:57112).
Sep 12 17:22:28.559307 sshd[2327]: Accepted publickey for core from 10.200.16.10 port 57112 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:22:28.560355 sshd-session[2327]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:22:28.563873 systemd-logind[1855]: New session 7 of user core.
Sep 12 17:22:28.571246 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 12 17:22:31.863114 sudo[2331]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 12 17:22:31.863379 sudo[2331]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 17:22:31.884481 sudo[2331]: pam_unix(sudo:session): session closed for user root
Sep 12 17:22:31.968982 sshd[2330]: Connection closed by 10.200.16.10 port 57112
Sep 12 17:22:31.968239 sshd-session[2327]: pam_unix(sshd:session): session closed for user core
Sep 12 17:22:31.971630 systemd[1]: sshd@4-10.200.20.44:22-10.200.16.10:57112.service: Deactivated successfully.
Sep 12 17:22:31.973030 systemd[1]: session-7.scope: Deactivated successfully.
Sep 12 17:22:31.973990 systemd-logind[1855]: Session 7 logged out. Waiting for processes to exit.
Sep 12 17:22:31.975143 systemd-logind[1855]: Removed session 7.
Sep 12 17:22:32.043842 kernel: hv_balloon: Max. dynamic memory size: 4096 MB
Sep 12 17:22:32.047423 systemd[1]: Started sshd@5-10.200.20.44:22-10.200.16.10:37104.service - OpenSSH per-connection server daemon (10.200.16.10:37104).
Sep 12 17:22:32.464628 sshd[2337]: Accepted publickey for core from 10.200.16.10 port 37104 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:22:32.465721 sshd-session[2337]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:22:32.469159 systemd-logind[1855]: New session 8 of user core.
Sep 12 17:22:32.474231 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 12 17:22:32.700256 sudo[2342]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 12 17:22:32.700467 sudo[2342]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 17:22:32.707002 sudo[2342]: pam_unix(sudo:session): session closed for user root
Sep 12 17:22:32.710495 sudo[2341]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 12 17:22:32.710684 sudo[2341]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 17:22:32.719272 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 17:22:32.746694 augenrules[2364]: No rules
Sep 12 17:22:32.747725 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 17:22:32.749161 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 17:22:32.750029 sudo[2341]: pam_unix(sudo:session): session closed for user root
Sep 12 17:22:32.820732 sshd[2340]: Connection closed by 10.200.16.10 port 37104
Sep 12 17:22:32.821176 sshd-session[2337]: pam_unix(sshd:session): session closed for user core
Sep 12 17:22:32.823666 systemd-logind[1855]: Session 8 logged out. Waiting for processes to exit.
Sep 12 17:22:32.823745 systemd[1]: sshd@5-10.200.20.44:22-10.200.16.10:37104.service: Deactivated successfully.
Sep 12 17:22:32.824914 systemd[1]: session-8.scope: Deactivated successfully.
Sep 12 17:22:32.827935 systemd-logind[1855]: Removed session 8.
Sep 12 17:22:32.901271 systemd[1]: Started sshd@6-10.200.20.44:22-10.200.16.10:37114.service - OpenSSH per-connection server daemon (10.200.16.10:37114).
Sep 12 17:22:33.352703 sshd[2373]: Accepted publickey for core from 10.200.16.10 port 37114 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:22:33.353756 sshd-session[2373]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:22:33.357069 systemd-logind[1855]: New session 9 of user core.
Sep 12 17:22:33.365379 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 12 17:22:33.608155 sudo[2377]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 12 17:22:33.608362 sudo[2377]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 17:22:35.030534 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 12 17:22:35.039355 (dockerd)[2394]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 12 17:22:35.732753 update_engine[1865]: I20250912 17:22:35.732277 1865 update_attempter.cc:509] Updating boot flags...
Sep 12 17:22:36.235151 dockerd[2394]: time="2025-09-12T17:22:36.233913106Z" level=info msg="Starting up"
Sep 12 17:22:36.235974 dockerd[2394]: time="2025-09-12T17:22:36.235873848Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 12 17:22:36.245163 dockerd[2394]: time="2025-09-12T17:22:36.245122478Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 12 17:22:37.541805 dockerd[2394]: time="2025-09-12T17:22:37.541763669Z" level=info msg="Loading containers: start."
Sep 12 17:22:37.634146 kernel: Initializing XFRM netlink socket
Sep 12 17:22:38.126244 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 12 17:22:38.128314 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:22:38.860869 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:22:38.867361 (kubelet)[2606]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 17:22:38.891476 kubelet[2606]: E0912 17:22:38.891423 2606 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 17:22:38.893419 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 17:22:38.893609 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 17:22:38.895196 systemd[1]: kubelet.service: Consumed 100ms CPU time, 106.9M memory peak.
Sep 12 17:22:42.275029 systemd-networkd[1697]: docker0: Link UP
Sep 12 17:22:42.299581 dockerd[2394]: time="2025-09-12T17:22:42.299505898Z" level=info msg="Loading containers: done."
Sep 12 17:22:42.933709 dockerd[2394]: time="2025-09-12T17:22:42.933312047Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 12 17:22:42.933709 dockerd[2394]: time="2025-09-12T17:22:42.933444946Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 12 17:22:42.933709 dockerd[2394]: time="2025-09-12T17:22:42.933548421Z" level=info msg="Initializing buildkit"
Sep 12 17:22:43.101003 dockerd[2394]: time="2025-09-12T17:22:43.100961163Z" level=info msg="Completed buildkit initialization"
Sep 12 17:22:43.106375 dockerd[2394]: time="2025-09-12T17:22:43.106333424Z" level=info msg="Daemon has completed initialization"
Sep 12 17:22:43.106541 dockerd[2394]: time="2025-09-12T17:22:43.106441203Z" level=info msg="API listen on /run/docker.sock"
Sep 12 17:22:43.107156 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 12 17:22:44.142086 containerd[1886]: time="2025-09-12T17:22:44.141754486Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\""
Sep 12 17:22:46.492463 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount689571960.mount: Deactivated successfully.
Sep 12 17:22:49.126100 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Sep 12 17:22:49.127434 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:22:49.224183 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:22:49.233324 (kubelet)[2698]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 17:22:49.359229 kubelet[2698]: E0912 17:22:49.359176 2698 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 17:22:49.361335 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 17:22:49.361548 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 17:22:49.363210 systemd[1]: kubelet.service: Consumed 103ms CPU time, 105.2M memory peak.
Sep 12 17:22:53.405172 containerd[1886]: time="2025-09-12T17:22:53.404850443Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:22:53.409944 containerd[1886]: time="2025-09-12T17:22:53.409778315Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.5: active requests=0, bytes read=27390228"
Sep 12 17:22:53.413773 containerd[1886]: time="2025-09-12T17:22:53.413745353Z" level=info msg="ImageCreate event name:\"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:22:53.419143 containerd[1886]: time="2025-09-12T17:22:53.419106196Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:22:53.419755 containerd[1886]: time="2025-09-12T17:22:53.419617923Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.5\" with image id \"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\", size \"27386827\" in 9.277828467s"
Sep 12 17:22:53.419755 containerd[1886]: time="2025-09-12T17:22:53.419647811Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\" returns image reference \"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\""
Sep 12 17:22:53.421173 containerd[1886]: time="2025-09-12T17:22:53.421149973Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\""
Sep 12 17:22:54.735040 containerd[1886]: time="2025-09-12T17:22:54.734969620Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:22:54.739393 containerd[1886]: time="2025-09-12T17:22:54.739362239Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.5: active requests=0, bytes read=23547917"
Sep 12 17:22:54.748814 containerd[1886]: time="2025-09-12T17:22:54.748775293Z" level=info msg="ImageCreate event name:\"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:22:54.757341 containerd[1886]: time="2025-09-12T17:22:54.757305404Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:22:54.758604 containerd[1886]: time="2025-09-12T17:22:54.758492693Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.5\" with image id \"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\", size \"25135832\" in 1.337313911s"
Sep 12 17:22:54.758604 containerd[1886]: time="2025-09-12T17:22:54.758520686Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\" returns image reference \"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\""
Sep 12 17:22:54.759149 containerd[1886]: time="2025-09-12T17:22:54.759087133Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\""
Sep 12 17:22:56.764684 containerd[1886]: time="2025-09-12T17:22:56.764630042Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:22:56.768284 containerd[1886]: time="2025-09-12T17:22:56.768259855Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.5: active requests=0, bytes read=18295977"
Sep 12 17:22:56.772570 containerd[1886]: time="2025-09-12T17:22:56.772534607Z" level=info msg="ImageCreate event name:\"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:22:56.777683 containerd[1886]: time="2025-09-12T17:22:56.777500938Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:22:56.778491 containerd[1886]: time="2025-09-12T17:22:56.778464356Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.5\" with image id \"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\", size \"19883910\" in 2.019354214s"
Sep 12 17:22:56.778522 containerd[1886]: time="2025-09-12T17:22:56.778492925Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\" returns image reference \"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\""
Sep 12 17:22:56.779135 containerd[1886]: time="2025-09-12T17:22:56.779111303Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\""
Sep 12 17:22:58.876822 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2713575446.mount: Deactivated successfully.
Sep 12 17:22:59.287452 containerd[1886]: time="2025-09-12T17:22:59.287318626Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:22:59.291017 containerd[1886]: time="2025-09-12T17:22:59.290993600Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.5: active requests=0, bytes read=28240106"
Sep 12 17:22:59.334318 containerd[1886]: time="2025-09-12T17:22:59.334263384Z" level=info msg="ImageCreate event name:\"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:22:59.339216 containerd[1886]: time="2025-09-12T17:22:59.339169545Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:22:59.339697 containerd[1886]: time="2025-09-12T17:22:59.339556571Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.5\" with image id \"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\", repo tag \"registry.k8s.io/kube-proxy:v1.33.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\", size \"28239125\" in 2.560413172s"
Sep 12 17:22:59.339697 containerd[1886]: time="2025-09-12T17:22:59.339580428Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\" returns image reference \"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\""
Sep 12 17:22:59.339946 containerd[1886]: time="2025-09-12T17:22:59.339926726Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Sep 12 17:22:59.376100 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Sep 12 17:22:59.377385 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:22:59.475271 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:22:59.480341 (kubelet)[2778]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 17:22:59.507847 kubelet[2778]: E0912 17:22:59.507785 2778 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 17:22:59.510010 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 17:22:59.510233 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 17:22:59.510700 systemd[1]: kubelet.service: Consumed 104ms CPU time, 104.6M memory peak.
Sep 12 17:23:09.626242 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.
Sep 12 17:23:09.627828 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:23:11.678099 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:23:11.683314 (kubelet)[2793]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 17:23:11.710296 kubelet[2793]: E0912 17:23:11.710251 2793 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 17:23:11.712269 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 17:23:11.712473 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 17:23:11.712907 systemd[1]: kubelet.service: Consumed 104ms CPU time, 106.7M memory peak.
Sep 12 17:23:12.816637 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount362596457.mount: Deactivated successfully.
Sep 12 17:23:13.845926 containerd[1886]: time="2025-09-12T17:23:13.845310987Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:23:13.850475 containerd[1886]: time="2025-09-12T17:23:13.850452172Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152117"
Sep 12 17:23:13.854813 containerd[1886]: time="2025-09-12T17:23:13.854793040Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:23:13.859417 containerd[1886]: time="2025-09-12T17:23:13.859394275Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:23:13.859958 containerd[1886]: time="2025-09-12T17:23:13.859930617Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 14.519980995s"
Sep 12 17:23:13.859958 containerd[1886]: time="2025-09-12T17:23:13.859959410Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\""
Sep 12 17:23:13.860371 containerd[1886]: time="2025-09-12T17:23:13.860322172Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 12 17:23:14.467551 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1540291520.mount: Deactivated successfully.
Sep 12 17:23:14.503165 containerd[1886]: time="2025-09-12T17:23:14.502758664Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 17:23:14.506152 containerd[1886]: time="2025-09-12T17:23:14.506122465Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
Sep 12 17:23:14.510173 containerd[1886]: time="2025-09-12T17:23:14.510132764Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 17:23:14.522106 containerd[1886]: time="2025-09-12T17:23:14.522073083Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 17:23:14.522548 containerd[1886]: time="2025-09-12T17:23:14.522432141Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 662.087137ms"
Sep 12 17:23:14.522548 containerd[1886]: time="2025-09-12T17:23:14.522459454Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 12 17:23:14.523023 containerd[1886]: time="2025-09-12T17:23:14.522974507Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\""
Sep 12 17:23:15.171257 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1722010935.mount: Deactivated successfully.
Sep 12 17:23:17.674882 containerd[1886]: time="2025-09-12T17:23:17.674756440Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:23:17.679097 containerd[1886]: time="2025-09-12T17:23:17.678919120Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69465857"
Sep 12 17:23:17.682667 containerd[1886]: time="2025-09-12T17:23:17.682644067Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:23:17.687520 containerd[1886]: time="2025-09-12T17:23:17.687495317Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:23:17.688117 containerd[1886]: time="2025-09-12T17:23:17.688003194Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 3.165001838s"
Sep 12 17:23:17.688117 containerd[1886]: time="2025-09-12T17:23:17.688029691Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\""
Sep 12 17:23:21.185622 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:23:21.186218 systemd[1]: kubelet.service: Consumed 104ms CPU time, 106.7M memory peak.
Sep 12 17:23:21.188656 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:23:21.204773 systemd[1]: Reload requested from client PID 2938 ('systemctl') (unit session-9.scope)...
Sep 12 17:23:21.204788 systemd[1]: Reloading...
Sep 12 17:23:21.306159 zram_generator::config[2993]: No configuration found.
Sep 12 17:23:21.447478 systemd[1]: Reloading finished in 242 ms.
Sep 12 17:23:21.492490 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 12 17:23:21.492543 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 12 17:23:21.492736 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:23:21.492772 systemd[1]: kubelet.service: Consumed 71ms CPU time, 95M memory peak.
Sep 12 17:23:21.493771 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:23:21.706781 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:23:21.712452 (kubelet)[3051]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 12 17:23:21.738404 kubelet[3051]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 17:23:21.740159 kubelet[3051]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 12 17:23:21.740159 kubelet[3051]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 17:23:21.740159 kubelet[3051]: I0912 17:23:21.738712 3051 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 17:23:22.563865 kubelet[3051]: I0912 17:23:22.563831 3051 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Sep 12 17:23:22.564016 kubelet[3051]: I0912 17:23:22.564006 3051 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 17:23:22.564255 kubelet[3051]: I0912 17:23:22.564240 3051 server.go:956] "Client rotation is on, will bootstrap in background"
Sep 12 17:23:22.577300 kubelet[3051]: E0912 17:23:22.577271 3051 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.200.20.44:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.20.44:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Sep 12 17:23:22.578550 kubelet[3051]: I0912 17:23:22.578527 3051 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 17:23:22.586834 kubelet[3051]: I0912 17:23:22.586819 3051 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 12 17:23:22.589329 kubelet[3051]: I0912 17:23:22.589314 3051 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 17:23:22.590528 kubelet[3051]: I0912 17:23:22.590502 3051 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 17:23:22.590725 kubelet[3051]: I0912 17:23:22.590603 3051 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.1.0-a-1fe763f55e","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 17:23:22.590842 kubelet[3051]: I0912 17:23:22.590832 3051 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 17:23:22.590890 kubelet[3051]: I0912 17:23:22.590883 3051 container_manager_linux.go:303] "Creating device plugin manager"
Sep 12 17:23:22.591558 kubelet[3051]: I0912 17:23:22.591544 3051 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:23:22.594036 kubelet[3051]: I0912 17:23:22.594021 3051 kubelet.go:480] "Attempting to sync node with API server"
Sep 12 17:23:22.594116 kubelet[3051]: I0912 17:23:22.594107 3051 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 17:23:22.594195 kubelet[3051]: I0912 17:23:22.594187 3051 kubelet.go:386] "Adding apiserver pod source"
Sep 12 17:23:22.595209 kubelet[3051]: I0912 17:23:22.595196 3051 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 17:23:22.597245 kubelet[3051]: E0912 17:23:22.597219 3051 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.44:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.1.0-a-1fe763f55e&limit=500&resourceVersion=0\": dial tcp 10.200.20.44:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Sep 12 17:23:22.598017 kubelet[3051]: E0912 17:23:22.597997 3051 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.44:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.44:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Sep 12 17:23:22.598090 kubelet[3051]: I0912 17:23:22.598076 3051 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 12 17:23:22.598443 kubelet[3051]: I0912 17:23:22.598428 3051 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Sep 12 17:23:22.598483 kubelet[3051]: W0912 17:23:22.598474 3051 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 12 17:23:22.601119 kubelet[3051]: I0912 17:23:22.601096 3051 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 12 17:23:22.601119 kubelet[3051]: I0912 17:23:22.601146 3051 server.go:1289] "Started kubelet"
Sep 12 17:23:22.601997 kubelet[3051]: I0912 17:23:22.601977 3051 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 17:23:22.605173 kubelet[3051]: E0912 17:23:22.603576 3051 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.20.44:6443/api/v1/namespaces/default/events\": dial tcp 10.200.20.44:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4426.1.0-a-1fe763f55e.186498d6ba862474 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4426.1.0-a-1fe763f55e,UID:ci-4426.1.0-a-1fe763f55e,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4426.1.0-a-1fe763f55e,},FirstTimestamp:2025-09-12 17:23:22.601112692 +0000 UTC m=+0.885894140,LastTimestamp:2025-09-12 17:23:22.601112692 +0000 UTC m=+0.885894140,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426.1.0-a-1fe763f55e,}"
Sep 12 17:23:22.605173 kubelet[3051]: I0912 17:23:22.604530 3051 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 17:23:22.605173 kubelet[3051]: I0912 17:23:22.605038 3051 server.go:317] "Adding debug handlers to kubelet server"
Sep 12 17:23:22.608612 kubelet[3051]: I0912 17:23:22.608565 3051 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 17:23:22.608833 kubelet[3051]: I0912 17:23:22.608820 3051 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 17:23:22.609312 kubelet[3051]: I0912 17:23:22.609294 3051 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 17:23:22.610237 kubelet[3051]: I0912 17:23:22.610219 3051 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 12 17:23:22.610381 kubelet[3051]: E0912 17:23:22.610358 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found"
Sep 12 17:23:22.612363 kubelet[3051]: E0912 17:23:22.612345 3051 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.44:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.1.0-a-1fe763f55e?timeout=10s\": dial tcp 10.200.20.44:6443: connect: connection refused" interval="200ms"
Sep 12 17:23:22.612830 kubelet[3051]: I0912 17:23:22.612817 3051 factory.go:223] Registration of the containerd container factory successfully
Sep 12 17:23:22.612900 kubelet[3051]: I0912 17:23:22.612893 3051 factory.go:223] Registration of the systemd container factory successfully
Sep 12 17:23:22.612998 kubelet[3051]: I0912 17:23:22.612985 3051 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 17:23:22.613544 kubelet[3051]: I0912 17:23:22.613521 3051 
desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 17:23:22.613598 kubelet[3051]: I0912 17:23:22.613565 3051 reconciler.go:26] "Reconciler: start to sync state" Sep 12 17:23:22.615623 kubelet[3051]: E0912 17:23:22.615602 3051 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.44:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.44:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 12 17:23:22.618420 kubelet[3051]: E0912 17:23:22.618398 3051 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 17:23:22.636495 kubelet[3051]: I0912 17:23:22.636317 3051 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 17:23:22.636495 kubelet[3051]: I0912 17:23:22.636330 3051 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 17:23:22.636495 kubelet[3051]: I0912 17:23:22.636345 3051 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:23:22.710529 kubelet[3051]: E0912 17:23:22.710489 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" Sep 12 17:23:23.095163 kubelet[3051]: E0912 17:23:22.810921 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" Sep 12 17:23:23.095163 kubelet[3051]: E0912 17:23:22.813495 3051 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.44:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.1.0-a-1fe763f55e?timeout=10s\": dial tcp 10.200.20.44:6443: connect: connection refused" interval="400ms" Sep 12 17:23:23.095163 kubelet[3051]: E0912 17:23:22.911111 3051 kubelet_node_status.go:466] "Error getting the 
current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" Sep 12 17:23:23.095163 kubelet[3051]: E0912 17:23:23.011622 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" Sep 12 17:23:23.111848 kubelet[3051]: E0912 17:23:23.111822 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" Sep 12 17:23:23.215876 kubelet[3051]: I0912 17:23:23.114238 3051 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 12 17:23:23.215876 kubelet[3051]: I0912 17:23:23.115688 3051 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 12 17:23:23.215876 kubelet[3051]: I0912 17:23:23.115704 3051 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 12 17:23:23.215876 kubelet[3051]: I0912 17:23:23.115724 3051 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 12 17:23:23.215876 kubelet[3051]: I0912 17:23:23.115731 3051 kubelet.go:2436] "Starting kubelet main sync loop" Sep 12 17:23:23.215876 kubelet[3051]: E0912 17:23:23.115766 3051 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 17:23:23.215876 kubelet[3051]: E0912 17:23:23.117181 3051 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.44:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.44:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 12 17:23:23.215876 kubelet[3051]: E0912 17:23:23.212677 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" Sep 12 17:23:23.215876 kubelet[3051]: E0912 17:23:23.214029 3051 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.44:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.1.0-a-1fe763f55e?timeout=10s\": dial tcp 10.200.20.44:6443: connect: connection refused" interval="800ms" Sep 12 17:23:23.216124 kubelet[3051]: E0912 17:23:23.216109 3051 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 12 17:23:23.237595 kubelet[3051]: I0912 17:23:23.237351 3051 policy_none.go:49] "None policy: Start" Sep 12 17:23:23.237595 kubelet[3051]: I0912 17:23:23.237379 3051 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 17:23:23.237595 kubelet[3051]: I0912 17:23:23.237391 3051 state_mem.go:35] "Initializing new in-memory state store" Sep 12 17:23:23.258407 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
Sep 12 17:23:23.266613 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 17:23:23.269472 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 17:23:23.286950 kubelet[3051]: E0912 17:23:23.286652 3051 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 12 17:23:23.287590 kubelet[3051]: I0912 17:23:23.287200 3051 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:23:23.287590 kubelet[3051]: I0912 17:23:23.287215 3051 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:23:23.287590 kubelet[3051]: I0912 17:23:23.287428 3051 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:23:23.288987 kubelet[3051]: E0912 17:23:23.288966 3051 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 12 17:23:23.289058 kubelet[3051]: E0912 17:23:23.289018 3051 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4426.1.0-a-1fe763f55e\" not found" Sep 12 17:23:23.389181 kubelet[3051]: I0912 17:23:23.389119 3051 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.389672 kubelet[3051]: E0912 17:23:23.389644 3051 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.44:6443/api/v1/nodes\": dial tcp 10.200.20.44:6443: connect: connection refused" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.430732 systemd[1]: Created slice kubepods-burstable-podc5290526ee646ae491bbc701165604e3.slice - libcontainer container kubepods-burstable-podc5290526ee646ae491bbc701165604e3.slice. 
Sep 12 17:23:23.436905 kubelet[3051]: E0912 17:23:23.436690 3051 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.439712 systemd[1]: Created slice kubepods-burstable-podf89b550650cd0edf31e7bb54fcb1c9fe.slice - libcontainer container kubepods-burstable-podf89b550650cd0edf31e7bb54fcb1c9fe.slice. Sep 12 17:23:23.443905 kubelet[3051]: E0912 17:23:23.443768 3051 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.457430 systemd[1]: Created slice kubepods-burstable-pod2ea4678316b26ffdfd480451bde85a74.slice - libcontainer container kubepods-burstable-pod2ea4678316b26ffdfd480451bde85a74.slice. Sep 12 17:23:23.458708 kubelet[3051]: E0912 17:23:23.458690 3051 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.482066 kubelet[3051]: E0912 17:23:23.482031 3051 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.200.20.44:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.20.44:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 12 17:23:23.517767 kubelet[3051]: I0912 17:23:23.517666 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5290526ee646ae491bbc701165604e3-ca-certs\") pod \"kube-apiserver-ci-4426.1.0-a-1fe763f55e\" (UID: \"c5290526ee646ae491bbc701165604e3\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.517767 kubelet[3051]: I0912 
17:23:23.517696 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5290526ee646ae491bbc701165604e3-k8s-certs\") pod \"kube-apiserver-ci-4426.1.0-a-1fe763f55e\" (UID: \"c5290526ee646ae491bbc701165604e3\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.517767 kubelet[3051]: I0912 17:23:23.517706 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f89b550650cd0edf31e7bb54fcb1c9fe-flexvolume-dir\") pod \"kube-controller-manager-ci-4426.1.0-a-1fe763f55e\" (UID: \"f89b550650cd0edf31e7bb54fcb1c9fe\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.517767 kubelet[3051]: I0912 17:23:23.517718 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f89b550650cd0edf31e7bb54fcb1c9fe-k8s-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-1fe763f55e\" (UID: \"f89b550650cd0edf31e7bb54fcb1c9fe\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.517767 kubelet[3051]: I0912 17:23:23.517765 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f89b550650cd0edf31e7bb54fcb1c9fe-kubeconfig\") pod \"kube-controller-manager-ci-4426.1.0-a-1fe763f55e\" (UID: \"f89b550650cd0edf31e7bb54fcb1c9fe\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.517918 kubelet[3051]: I0912 17:23:23.517775 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f89b550650cd0edf31e7bb54fcb1c9fe-usr-share-ca-certificates\") pod 
\"kube-controller-manager-ci-4426.1.0-a-1fe763f55e\" (UID: \"f89b550650cd0edf31e7bb54fcb1c9fe\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.517918 kubelet[3051]: I0912 17:23:23.517786 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2ea4678316b26ffdfd480451bde85a74-kubeconfig\") pod \"kube-scheduler-ci-4426.1.0-a-1fe763f55e\" (UID: \"2ea4678316b26ffdfd480451bde85a74\") " pod="kube-system/kube-scheduler-ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.517918 kubelet[3051]: I0912 17:23:23.517797 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5290526ee646ae491bbc701165604e3-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.1.0-a-1fe763f55e\" (UID: \"c5290526ee646ae491bbc701165604e3\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.517918 kubelet[3051]: I0912 17:23:23.517807 3051 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f89b550650cd0edf31e7bb54fcb1c9fe-ca-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-1fe763f55e\" (UID: \"f89b550650cd0edf31e7bb54fcb1c9fe\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.591299 kubelet[3051]: I0912 17:23:23.591260 3051 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.591713 kubelet[3051]: E0912 17:23:23.591651 3051 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.44:6443/api/v1/nodes\": dial tcp 10.200.20.44:6443: connect: connection refused" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.738090 containerd[1886]: time="2025-09-12T17:23:23.737981662Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4426.1.0-a-1fe763f55e,Uid:c5290526ee646ae491bbc701165604e3,Namespace:kube-system,Attempt:0,}" Sep 12 17:23:23.744467 containerd[1886]: time="2025-09-12T17:23:23.744428851Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.1.0-a-1fe763f55e,Uid:f89b550650cd0edf31e7bb54fcb1c9fe,Namespace:kube-system,Attempt:0,}" Sep 12 17:23:23.760147 containerd[1886]: time="2025-09-12T17:23:23.760113697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426.1.0-a-1fe763f55e,Uid:2ea4678316b26ffdfd480451bde85a74,Namespace:kube-system,Attempt:0,}" Sep 12 17:23:23.877294 containerd[1886]: time="2025-09-12T17:23:23.877256597Z" level=info msg="connecting to shim d4237bfb77cb01897a6e2325a156c3717548cf5782fbcbf353c1970e343748eb" address="unix:///run/containerd/s/aaaf58720db85bc87fd3eeb625091d0510cb9bdc418926fedff259b682c18de5" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:23:23.879564 containerd[1886]: time="2025-09-12T17:23:23.879530898Z" level=info msg="connecting to shim 14d997fdb9fac4eb1d42c45d1fd4a8a7a27062741c37b41cd9d9003c064e5c5b" address="unix:///run/containerd/s/e0ae92586d716d7e3f9ea7813f83b53aee36d2b14d1b679ddbd3328e941a53c1" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:23:23.897217 containerd[1886]: time="2025-09-12T17:23:23.896815426Z" level=info msg="connecting to shim 36e4c56c2318f834068e8fd33465cfcbab0462eafae43bcb53a0c12d5b5e108b" address="unix:///run/containerd/s/b7425a013d08ccf5667fb052bdfa823db89adda39fa1ddc3020f13ca8c0eeafc" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:23:23.902273 systemd[1]: Started cri-containerd-d4237bfb77cb01897a6e2325a156c3717548cf5782fbcbf353c1970e343748eb.scope - libcontainer container d4237bfb77cb01897a6e2325a156c3717548cf5782fbcbf353c1970e343748eb. 
Sep 12 17:23:23.917299 systemd[1]: Started cri-containerd-14d997fdb9fac4eb1d42c45d1fd4a8a7a27062741c37b41cd9d9003c064e5c5b.scope - libcontainer container 14d997fdb9fac4eb1d42c45d1fd4a8a7a27062741c37b41cd9d9003c064e5c5b. Sep 12 17:23:23.936289 systemd[1]: Started cri-containerd-36e4c56c2318f834068e8fd33465cfcbab0462eafae43bcb53a0c12d5b5e108b.scope - libcontainer container 36e4c56c2318f834068e8fd33465cfcbab0462eafae43bcb53a0c12d5b5e108b. Sep 12 17:23:23.958567 containerd[1886]: time="2025-09-12T17:23:23.958533068Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426.1.0-a-1fe763f55e,Uid:f89b550650cd0edf31e7bb54fcb1c9fe,Namespace:kube-system,Attempt:0,} returns sandbox id \"d4237bfb77cb01897a6e2325a156c3717548cf5782fbcbf353c1970e343748eb\"" Sep 12 17:23:23.965731 kubelet[3051]: E0912 17:23:23.965702 3051 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.200.20.44:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.20.44:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 12 17:23:23.967868 containerd[1886]: time="2025-09-12T17:23:23.967838182Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426.1.0-a-1fe763f55e,Uid:c5290526ee646ae491bbc701165604e3,Namespace:kube-system,Attempt:0,} returns sandbox id \"14d997fdb9fac4eb1d42c45d1fd4a8a7a27062741c37b41cd9d9003c064e5c5b\"" Sep 12 17:23:23.969760 containerd[1886]: time="2025-09-12T17:23:23.969736770Z" level=info msg="CreateContainer within sandbox \"d4237bfb77cb01897a6e2325a156c3717548cf5782fbcbf353c1970e343748eb\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 17:23:23.985536 containerd[1886]: time="2025-09-12T17:23:23.985495169Z" level=info msg="CreateContainer within sandbox \"14d997fdb9fac4eb1d42c45d1fd4a8a7a27062741c37b41cd9d9003c064e5c5b\" for container 
&ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 17:23:23.990122 containerd[1886]: time="2025-09-12T17:23:23.989709194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426.1.0-a-1fe763f55e,Uid:2ea4678316b26ffdfd480451bde85a74,Namespace:kube-system,Attempt:0,} returns sandbox id \"36e4c56c2318f834068e8fd33465cfcbab0462eafae43bcb53a0c12d5b5e108b\"" Sep 12 17:23:23.993767 kubelet[3051]: I0912 17:23:23.993744 3051 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:23.994041 kubelet[3051]: E0912 17:23:23.994017 3051 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.20.44:6443/api/v1/nodes\": dial tcp 10.200.20.44:6443: connect: connection refused" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:24.000565 containerd[1886]: time="2025-09-12T17:23:24.000533797Z" level=info msg="CreateContainer within sandbox \"36e4c56c2318f834068e8fd33465cfcbab0462eafae43bcb53a0c12d5b5e108b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 17:23:24.014720 kubelet[3051]: E0912 17:23:24.014687 3051 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.20.44:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426.1.0-a-1fe763f55e?timeout=10s\": dial tcp 10.200.20.44:6443: connect: connection refused" interval="1.6s" Sep 12 17:23:24.019519 containerd[1886]: time="2025-09-12T17:23:24.019196146Z" level=info msg="Container 589287b9cbb8b4d3572667bc782d8b181cbf518cb3735f406fa10a28837ac88d: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:23:24.042188 containerd[1886]: time="2025-09-12T17:23:24.042120962Z" level=info msg="Container f65c5c6784a482bebe7f06fd0d34b58ebc1140812920ab248193fdb9d73361cf: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:23:24.073095 containerd[1886]: time="2025-09-12T17:23:24.073055225Z" level=info msg="CreateContainer within sandbox 
\"d4237bfb77cb01897a6e2325a156c3717548cf5782fbcbf353c1970e343748eb\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"589287b9cbb8b4d3572667bc782d8b181cbf518cb3735f406fa10a28837ac88d\"" Sep 12 17:23:24.073747 containerd[1886]: time="2025-09-12T17:23:24.073725355Z" level=info msg="StartContainer for \"589287b9cbb8b4d3572667bc782d8b181cbf518cb3735f406fa10a28837ac88d\"" Sep 12 17:23:24.074646 containerd[1886]: time="2025-09-12T17:23:24.074623611Z" level=info msg="connecting to shim 589287b9cbb8b4d3572667bc782d8b181cbf518cb3735f406fa10a28837ac88d" address="unix:///run/containerd/s/aaaf58720db85bc87fd3eeb625091d0510cb9bdc418926fedff259b682c18de5" protocol=ttrpc version=3 Sep 12 17:23:24.077936 containerd[1886]: time="2025-09-12T17:23:24.077900603Z" level=info msg="Container 243d75f7be9ec1019a5284914727693e5a077d17b572b12479c304294c7b9acb: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:23:24.091249 systemd[1]: Started cri-containerd-589287b9cbb8b4d3572667bc782d8b181cbf518cb3735f406fa10a28837ac88d.scope - libcontainer container 589287b9cbb8b4d3572667bc782d8b181cbf518cb3735f406fa10a28837ac88d. 
Sep 12 17:23:24.099858 containerd[1886]: time="2025-09-12T17:23:24.099125134Z" level=info msg="CreateContainer within sandbox \"14d997fdb9fac4eb1d42c45d1fd4a8a7a27062741c37b41cd9d9003c064e5c5b\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f65c5c6784a482bebe7f06fd0d34b58ebc1140812920ab248193fdb9d73361cf\"" Sep 12 17:23:24.100242 containerd[1886]: time="2025-09-12T17:23:24.100173274Z" level=info msg="StartContainer for \"f65c5c6784a482bebe7f06fd0d34b58ebc1140812920ab248193fdb9d73361cf\"" Sep 12 17:23:24.101440 containerd[1886]: time="2025-09-12T17:23:24.101415147Z" level=info msg="connecting to shim f65c5c6784a482bebe7f06fd0d34b58ebc1140812920ab248193fdb9d73361cf" address="unix:///run/containerd/s/e0ae92586d716d7e3f9ea7813f83b53aee36d2b14d1b679ddbd3328e941a53c1" protocol=ttrpc version=3 Sep 12 17:23:24.114080 containerd[1886]: time="2025-09-12T17:23:24.113486936Z" level=info msg="CreateContainer within sandbox \"36e4c56c2318f834068e8fd33465cfcbab0462eafae43bcb53a0c12d5b5e108b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"243d75f7be9ec1019a5284914727693e5a077d17b572b12479c304294c7b9acb\"" Sep 12 17:23:24.114080 containerd[1886]: time="2025-09-12T17:23:24.113862738Z" level=info msg="StartContainer for \"243d75f7be9ec1019a5284914727693e5a077d17b572b12479c304294c7b9acb\"" Sep 12 17:23:24.114759 containerd[1886]: time="2025-09-12T17:23:24.114735697Z" level=info msg="connecting to shim 243d75f7be9ec1019a5284914727693e5a077d17b572b12479c304294c7b9acb" address="unix:///run/containerd/s/b7425a013d08ccf5667fb052bdfa823db89adda39fa1ddc3020f13ca8c0eeafc" protocol=ttrpc version=3 Sep 12 17:23:24.116903 kubelet[3051]: E0912 17:23:24.116868 3051 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.200.20.44:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426.1.0-a-1fe763f55e&limit=500&resourceVersion=0\": dial tcp 10.200.20.44:6443: connect: connection refused" 
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 12 17:23:24.121256 systemd[1]: Started cri-containerd-f65c5c6784a482bebe7f06fd0d34b58ebc1140812920ab248193fdb9d73361cf.scope - libcontainer container f65c5c6784a482bebe7f06fd0d34b58ebc1140812920ab248193fdb9d73361cf. Sep 12 17:23:24.139434 systemd[1]: Started cri-containerd-243d75f7be9ec1019a5284914727693e5a077d17b572b12479c304294c7b9acb.scope - libcontainer container 243d75f7be9ec1019a5284914727693e5a077d17b572b12479c304294c7b9acb. Sep 12 17:23:24.149867 containerd[1886]: time="2025-09-12T17:23:24.149701869Z" level=info msg="StartContainer for \"589287b9cbb8b4d3572667bc782d8b181cbf518cb3735f406fa10a28837ac88d\" returns successfully" Sep 12 17:23:24.181309 containerd[1886]: time="2025-09-12T17:23:24.181283845Z" level=info msg="StartContainer for \"f65c5c6784a482bebe7f06fd0d34b58ebc1140812920ab248193fdb9d73361cf\" returns successfully" Sep 12 17:23:24.195628 kubelet[3051]: E0912 17:23:24.195090 3051 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.200.20.44:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.20.44:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 12 17:23:24.216792 containerd[1886]: time="2025-09-12T17:23:24.216767135Z" level=info msg="StartContainer for \"243d75f7be9ec1019a5284914727693e5a077d17b572b12479c304294c7b9acb\" returns successfully" Sep 12 17:23:24.797458 kubelet[3051]: I0912 17:23:24.797318 3051 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:25.141620 kubelet[3051]: E0912 17:23:25.141579 3051 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:25.143900 kubelet[3051]: E0912 17:23:25.143870 
3051 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:25.147018 kubelet[3051]: E0912 17:23:25.146954 3051 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:25.669046 kubelet[3051]: I0912 17:23:25.668922 3051 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426.1.0-a-1fe763f55e" Sep 12 17:23:25.669046 kubelet[3051]: E0912 17:23:25.668972 3051 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4426.1.0-a-1fe763f55e\": node \"ci-4426.1.0-a-1fe763f55e\" not found" Sep 12 17:23:25.759156 kubelet[3051]: E0912 17:23:25.758797 3051 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-node-lease\" not found" interval="3.2s" Sep 12 17:23:25.812533 kubelet[3051]: E0912 17:23:25.812500 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" Sep 12 17:23:25.914092 kubelet[3051]: E0912 17:23:25.912974 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" Sep 12 17:23:26.013586 kubelet[3051]: E0912 17:23:26.013456 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" Sep 12 17:23:26.114404 kubelet[3051]: E0912 17:23:26.114362 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" Sep 12 17:23:26.147790 kubelet[3051]: E0912 17:23:26.147752 3051 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" 
node="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:26.148232 kubelet[3051]: E0912 17:23:26.148031 3051 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" node="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:26.148529 kubelet[3051]: E0912 17:23:26.148512 3051 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426.1.0-a-1fe763f55e\" not found" node="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:26.215115 kubelet[3051]: E0912 17:23:26.215079 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found"
Sep 12 17:23:26.315891 kubelet[3051]: E0912 17:23:26.315791 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found"
Sep 12 17:23:26.416711 kubelet[3051]: E0912 17:23:26.416676 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found"
Sep 12 17:23:26.516967 kubelet[3051]: E0912 17:23:26.516934 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found"
Sep 12 17:23:26.617642 kubelet[3051]: E0912 17:23:26.617536 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found"
Sep 12 17:23:26.718608 kubelet[3051]: E0912 17:23:26.718567 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found"
Sep 12 17:23:26.818688 kubelet[3051]: E0912 17:23:26.818653 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found"
Sep 12 17:23:26.919251 kubelet[3051]: E0912 17:23:26.919215 3051 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found"
Sep 12 17:23:27.012425 kubelet[3051]: I0912 17:23:27.012387 3051 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:27.111617 kubelet[3051]: I0912 17:23:27.111488 3051 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Sep 12 17:23:27.112353 kubelet[3051]: I0912 17:23:27.111794 3051 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:27.148087 kubelet[3051]: I0912 17:23:27.148065 3051 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:27.148459 kubelet[3051]: I0912 17:23:27.148383 3051 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:27.158587 kubelet[3051]: I0912 17:23:27.158330 3051 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Sep 12 17:23:27.158587 kubelet[3051]: I0912 17:23:27.158413 3051 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:27.508160 kubelet[3051]: I0912 17:23:27.507896 3051 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Sep 12 17:23:27.508160 kubelet[3051]: E0912 17:23:27.507960 3051 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426.1.0-a-1fe763f55e\" already exists" pod="kube-system/kube-scheduler-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:27.508160 kubelet[3051]: I0912 17:23:27.507896 3051 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Sep 12 17:23:27.508722 kubelet[3051]: I0912 17:23:27.508701 3051 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Sep 12 17:23:27.508786 kubelet[3051]: E0912 17:23:27.508738 3051 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426.1.0-a-1fe763f55e\" already exists" pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:27.600237 kubelet[3051]: I0912 17:23:27.600205 3051 apiserver.go:52] "Watching apiserver"
Sep 12 17:23:27.614002 kubelet[3051]: I0912 17:23:27.613977 3051 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 12 17:23:28.824623 systemd[1]: Reload requested from client PID 3338 ('systemctl') (unit session-9.scope)...
Sep 12 17:23:28.824638 systemd[1]: Reloading...
Sep 12 17:23:28.914201 zram_generator::config[3394]: No configuration found.
Sep 12 17:23:29.064488 systemd[1]: Reloading finished in 239 ms.
Sep 12 17:23:29.087481 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:23:29.101528 systemd[1]: kubelet.service: Deactivated successfully.
Sep 12 17:23:29.101698 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:23:29.101736 systemd[1]: kubelet.service: Consumed 1.119s CPU time, 125.9M memory peak.
Sep 12 17:23:29.103926 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:23:29.218827 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:23:29.226577 (kubelet)[3450]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 12 17:23:29.584707 kubelet[3450]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 17:23:29.584707 kubelet[3450]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 12 17:23:29.584707 kubelet[3450]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 17:23:29.584707 kubelet[3450]: I0912 17:23:29.254947 3450 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 17:23:29.584707 kubelet[3450]: I0912 17:23:29.259513 3450 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Sep 12 17:23:29.584707 kubelet[3450]: I0912 17:23:29.259528 3450 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 17:23:29.584707 kubelet[3450]: I0912 17:23:29.259729 3450 server.go:956] "Client rotation is on, will bootstrap in background"
Sep 12 17:23:29.584991 kubelet[3450]: I0912 17:23:29.584936 3450 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Sep 12 17:23:29.586738 kubelet[3450]: I0912 17:23:29.586713 3450 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 17:23:29.590692 kubelet[3450]: I0912 17:23:29.590674 3450 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 12 17:23:29.593343 kubelet[3450]: I0912 17:23:29.593328 3450 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 17:23:29.593525 kubelet[3450]: I0912 17:23:29.593500 3450 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 17:23:29.593626 kubelet[3450]: I0912 17:23:29.593522 3450 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426.1.0-a-1fe763f55e","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 17:23:29.593701 kubelet[3450]: I0912 17:23:29.593631 3450 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 17:23:29.593701 kubelet[3450]: I0912 17:23:29.593638 3450 container_manager_linux.go:303] "Creating device plugin manager"
Sep 12 17:23:29.593701 kubelet[3450]: I0912 17:23:29.593670 3450 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:23:29.593788 kubelet[3450]: I0912 17:23:29.593778 3450 kubelet.go:480] "Attempting to sync node with API server"
Sep 12 17:23:29.593810 kubelet[3450]: I0912 17:23:29.593789 3450 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 17:23:29.593810 kubelet[3450]: I0912 17:23:29.593806 3450 kubelet.go:386] "Adding apiserver pod source"
Sep 12 17:23:29.593942 kubelet[3450]: I0912 17:23:29.593817 3450 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 17:23:29.594756 kubelet[3450]: I0912 17:23:29.594738 3450 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 12 17:23:29.595100 kubelet[3450]: I0912 17:23:29.595084 3450 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Sep 12 17:23:29.597725 kubelet[3450]: I0912 17:23:29.597707 3450 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 12 17:23:29.598038 kubelet[3450]: I0912 17:23:29.598007 3450 server.go:1289] "Started kubelet"
Sep 12 17:23:29.598832 kubelet[3450]: I0912 17:23:29.598671 3450 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 17:23:29.602995 kubelet[3450]: I0912 17:23:29.602962 3450 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 17:23:29.603695 kubelet[3450]: I0912 17:23:29.603676 3450 server.go:317] "Adding debug handlers to kubelet server"
Sep 12 17:23:29.604675 kubelet[3450]: I0912 17:23:29.604655 3450 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 17:23:29.607495 kubelet[3450]: I0912 17:23:29.607448 3450 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 17:23:29.607855 kubelet[3450]: I0912 17:23:29.607770 3450 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 17:23:29.611332 kubelet[3450]: I0912 17:23:29.611318 3450 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 12 17:23:29.611587 kubelet[3450]: E0912 17:23:29.611573 3450 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426.1.0-a-1fe763f55e\" not found"
Sep 12 17:23:29.618708 kubelet[3450]: I0912 17:23:29.618686 3450 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 12 17:23:29.618865 kubelet[3450]: I0912 17:23:29.618855 3450 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 17:23:29.623857 kubelet[3450]: I0912 17:23:29.623833 3450 factory.go:223] Registration of the systemd container factory successfully
Sep 12 17:23:29.624019 kubelet[3450]: I0912 17:23:29.624000 3450 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 17:23:29.627514 kubelet[3450]: I0912 17:23:29.627497 3450 factory.go:223] Registration of the containerd container factory successfully
Sep 12 17:23:29.636369 kubelet[3450]: E0912 17:23:29.635784 3450 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 12 17:23:29.637293 kubelet[3450]: I0912 17:23:29.637267 3450 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Sep 12 17:23:29.641123 kubelet[3450]: I0912 17:23:29.641098 3450 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Sep 12 17:23:29.641640 kubelet[3450]: I0912 17:23:29.641621 3450 status_manager.go:230] "Starting to sync pod status with apiserver"
Sep 12 17:23:29.641694 kubelet[3450]: I0912 17:23:29.641645 3450 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 12 17:23:29.641694 kubelet[3450]: I0912 17:23:29.641650 3450 kubelet.go:2436] "Starting kubelet main sync loop"
Sep 12 17:23:29.641694 kubelet[3450]: E0912 17:23:29.641680 3450 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 12 17:23:29.682046 kubelet[3450]: I0912 17:23:29.682019 3450 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 12 17:23:29.682197 kubelet[3450]: I0912 17:23:29.682110 3450 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 12 17:23:29.683020 kubelet[3450]: I0912 17:23:29.682914 3450 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:23:29.683169 kubelet[3450]: I0912 17:23:29.683157 3450 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 12 17:23:29.683301 kubelet[3450]: I0912 17:23:29.683234 3450 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 12 17:23:29.683301 kubelet[3450]: I0912 17:23:29.683256 3450 policy_none.go:49] "None policy: Start"
Sep 12 17:23:29.683301 kubelet[3450]: I0912 17:23:29.683264 3450 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 12 17:23:29.683301 kubelet[3450]: I0912 17:23:29.683274 3450 state_mem.go:35] "Initializing new in-memory state store"
Sep 12 17:23:29.684082 kubelet[3450]: I0912 17:23:29.683449 3450 state_mem.go:75] "Updated machine memory state"
Sep 12 17:23:29.687554 kubelet[3450]: E0912 17:23:29.687540 3450 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Sep 12 17:23:29.687969 kubelet[3450]: I0912 17:23:29.687956 3450 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 12 17:23:29.688065 kubelet[3450]: I0912 17:23:29.688043 3450 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 12 17:23:29.688348 kubelet[3450]: I0912 17:23:29.688303 3450 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 12 17:23:29.690794 kubelet[3450]: E0912 17:23:29.690774 3450 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 12 17:23:29.742638 kubelet[3450]: I0912 17:23:29.742615 3450 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.742838 kubelet[3450]: I0912 17:23:29.742653 3450 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.742953 kubelet[3450]: I0912 17:23:29.742694 3450 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.763600 kubelet[3450]: I0912 17:23:29.763557 3450 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Sep 12 17:23:29.763675 kubelet[3450]: E0912 17:23:29.763611 3450 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426.1.0-a-1fe763f55e\" already exists" pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.764058 kubelet[3450]: I0912 17:23:29.764042 3450 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Sep 12 17:23:29.764114 kubelet[3450]: I0912 17:23:29.764078 3450 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Sep 12 17:23:29.764114 kubelet[3450]: E0912 17:23:29.764104 3450 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4426.1.0-a-1fe763f55e\" already exists" pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.764258 kubelet[3450]: E0912 17:23:29.764239 3450 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426.1.0-a-1fe763f55e\" already exists" pod="kube-system/kube-scheduler-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.790139 kubelet[3450]: I0912 17:23:29.790061 3450 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.802974 kubelet[3450]: I0912 17:23:29.802934 3450 kubelet_node_status.go:124] "Node was previously registered" node="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.803042 kubelet[3450]: I0912 17:23:29.803019 3450 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.919726 kubelet[3450]: I0912 17:23:29.919637 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5290526ee646ae491bbc701165604e3-ca-certs\") pod \"kube-apiserver-ci-4426.1.0-a-1fe763f55e\" (UID: \"c5290526ee646ae491bbc701165604e3\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.919726 kubelet[3450]: I0912 17:23:29.919666 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5290526ee646ae491bbc701165604e3-k8s-certs\") pod \"kube-apiserver-ci-4426.1.0-a-1fe763f55e\" (UID: \"c5290526ee646ae491bbc701165604e3\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.919726 kubelet[3450]: I0912 17:23:29.919680 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f89b550650cd0edf31e7bb54fcb1c9fe-flexvolume-dir\") pod \"kube-controller-manager-ci-4426.1.0-a-1fe763f55e\" (UID: \"f89b550650cd0edf31e7bb54fcb1c9fe\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.919726 kubelet[3450]: I0912 17:23:29.919694 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f89b550650cd0edf31e7bb54fcb1c9fe-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426.1.0-a-1fe763f55e\" (UID: \"f89b550650cd0edf31e7bb54fcb1c9fe\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.919726 kubelet[3450]: I0912 17:23:29.919708 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5290526ee646ae491bbc701165604e3-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426.1.0-a-1fe763f55e\" (UID: \"c5290526ee646ae491bbc701165604e3\") " pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.919902 kubelet[3450]: I0912 17:23:29.919734 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f89b550650cd0edf31e7bb54fcb1c9fe-ca-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-1fe763f55e\" (UID: \"f89b550650cd0edf31e7bb54fcb1c9fe\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.919902 kubelet[3450]: I0912 17:23:29.919764 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f89b550650cd0edf31e7bb54fcb1c9fe-k8s-certs\") pod \"kube-controller-manager-ci-4426.1.0-a-1fe763f55e\" (UID: \"f89b550650cd0edf31e7bb54fcb1c9fe\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.919902 kubelet[3450]: I0912 17:23:29.919778 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f89b550650cd0edf31e7bb54fcb1c9fe-kubeconfig\") pod \"kube-controller-manager-ci-4426.1.0-a-1fe763f55e\" (UID: \"f89b550650cd0edf31e7bb54fcb1c9fe\") " pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:29.919902 kubelet[3450]: I0912 17:23:29.919791 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2ea4678316b26ffdfd480451bde85a74-kubeconfig\") pod \"kube-scheduler-ci-4426.1.0-a-1fe763f55e\" (UID: \"2ea4678316b26ffdfd480451bde85a74\") " pod="kube-system/kube-scheduler-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:30.594453 kubelet[3450]: I0912 17:23:30.594384 3450 apiserver.go:52] "Watching apiserver"
Sep 12 17:23:30.619276 kubelet[3450]: I0912 17:23:30.619237 3450 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 12 17:23:30.665686 kubelet[3450]: I0912 17:23:30.665661 3450 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:30.666146 kubelet[3450]: I0912 17:23:30.666094 3450 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:30.681135 kubelet[3450]: I0912 17:23:30.681110 3450 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Sep 12 17:23:30.681217 kubelet[3450]: E0912 17:23:30.681201 3450 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426.1.0-a-1fe763f55e\" already exists" pod="kube-system/kube-scheduler-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:30.687230 kubelet[3450]: I0912 17:23:30.687044 3450 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Sep 12 17:23:30.687230 kubelet[3450]: E0912 17:23:30.687084 3450 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426.1.0-a-1fe763f55e\" already exists" pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e"
Sep 12 17:23:30.688037 kubelet[3450]: I0912 17:23:30.687968 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4426.1.0-a-1fe763f55e" podStartSLOduration=3.687957849 podStartE2EDuration="3.687957849s" podCreationTimestamp="2025-09-12 17:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:23:30.687122156 +0000 UTC m=+1.457869298" watchObservedRunningTime="2025-09-12 17:23:30.687957849 +0000 UTC m=+1.458704975"
Sep 12 17:23:30.715549 kubelet[3450]: I0912 17:23:30.715123 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4426.1.0-a-1fe763f55e" podStartSLOduration=3.715113661 podStartE2EDuration="3.715113661s" podCreationTimestamp="2025-09-12 17:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:23:30.697433479 +0000 UTC m=+1.468180613" watchObservedRunningTime="2025-09-12 17:23:30.715113661 +0000 UTC m=+1.485860787"
Sep 12 17:23:30.725708 kubelet[3450]: I0912 17:23:30.725608 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4426.1.0-a-1fe763f55e" podStartSLOduration=3.725600453 podStartE2EDuration="3.725600453s" podCreationTimestamp="2025-09-12 17:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:23:30.715442997 +0000 UTC m=+1.486190123" watchObservedRunningTime="2025-09-12 17:23:30.725600453 +0000 UTC m=+1.496347579"
Sep 12 17:23:34.628111 kubelet[3450]: I0912 17:23:34.628077 3450 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 12 17:23:34.628777 containerd[1886]: time="2025-09-12T17:23:34.628668691Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 12 17:23:34.629043 kubelet[3450]: I0912 17:23:34.628906 3450 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 12 17:23:35.380081 systemd[1]: Created slice kubepods-besteffort-pod09fc36d4_29b4_4520_94f6_c1d1a9bc034b.slice - libcontainer container kubepods-besteffort-pod09fc36d4_29b4_4520_94f6_c1d1a9bc034b.slice.
Sep 12 17:23:35.445707 kubelet[3450]: I0912 17:23:35.445578 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmggb\" (UniqueName: \"kubernetes.io/projected/09fc36d4-29b4-4520-94f6-c1d1a9bc034b-kube-api-access-qmggb\") pod \"kube-proxy-pfxv8\" (UID: \"09fc36d4-29b4-4520-94f6-c1d1a9bc034b\") " pod="kube-system/kube-proxy-pfxv8"
Sep 12 17:23:35.445707 kubelet[3450]: I0912 17:23:35.445618 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/09fc36d4-29b4-4520-94f6-c1d1a9bc034b-kube-proxy\") pod \"kube-proxy-pfxv8\" (UID: \"09fc36d4-29b4-4520-94f6-c1d1a9bc034b\") " pod="kube-system/kube-proxy-pfxv8"
Sep 12 17:23:35.445707 kubelet[3450]: I0912 17:23:35.445632 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/09fc36d4-29b4-4520-94f6-c1d1a9bc034b-xtables-lock\") pod \"kube-proxy-pfxv8\" (UID: \"09fc36d4-29b4-4520-94f6-c1d1a9bc034b\") " pod="kube-system/kube-proxy-pfxv8"
Sep 12 17:23:35.445707 kubelet[3450]: I0912 17:23:35.445642 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/09fc36d4-29b4-4520-94f6-c1d1a9bc034b-lib-modules\") pod \"kube-proxy-pfxv8\" (UID: \"09fc36d4-29b4-4520-94f6-c1d1a9bc034b\") " pod="kube-system/kube-proxy-pfxv8"
Sep 12 17:23:35.646871 kubelet[3450]: I0912 17:23:35.646681 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86l82\" (UniqueName: \"kubernetes.io/projected/7ebb8ba0-909e-4693-857b-980bda4743da-kube-api-access-86l82\") pod \"tigera-operator-755d956888-dqbk8\" (UID: \"7ebb8ba0-909e-4693-857b-980bda4743da\") " pod="tigera-operator/tigera-operator-755d956888-dqbk8"
Sep 12 17:23:35.646871 kubelet[3450]: I0912 17:23:35.646712 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7ebb8ba0-909e-4693-857b-980bda4743da-var-lib-calico\") pod \"tigera-operator-755d956888-dqbk8\" (UID: \"7ebb8ba0-909e-4693-857b-980bda4743da\") " pod="tigera-operator/tigera-operator-755d956888-dqbk8"
Sep 12 17:23:35.652531 systemd[1]: Created slice kubepods-besteffort-pod7ebb8ba0_909e_4693_857b_980bda4743da.slice - libcontainer container kubepods-besteffort-pod7ebb8ba0_909e_4693_857b_980bda4743da.slice.
Sep 12 17:23:35.689494 containerd[1886]: time="2025-09-12T17:23:35.689457029Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pfxv8,Uid:09fc36d4-29b4-4520-94f6-c1d1a9bc034b,Namespace:kube-system,Attempt:0,}"
Sep 12 17:23:35.744937 containerd[1886]: time="2025-09-12T17:23:35.744893044Z" level=info msg="connecting to shim 204ee85b2370043c7cb6489708ab6475476d628461fe1bfa2209bbf97824f479" address="unix:///run/containerd/s/b82908a9d81d428a254c9cd52abbb60eed0e895684d60f4a7873456517172934" namespace=k8s.io protocol=ttrpc version=3
Sep 12 17:23:35.772254 systemd[1]: Started cri-containerd-204ee85b2370043c7cb6489708ab6475476d628461fe1bfa2209bbf97824f479.scope - libcontainer container 204ee85b2370043c7cb6489708ab6475476d628461fe1bfa2209bbf97824f479.
Sep 12 17:23:35.792278 containerd[1886]: time="2025-09-12T17:23:35.792242242Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pfxv8,Uid:09fc36d4-29b4-4520-94f6-c1d1a9bc034b,Namespace:kube-system,Attempt:0,} returns sandbox id \"204ee85b2370043c7cb6489708ab6475476d628461fe1bfa2209bbf97824f479\""
Sep 12 17:23:35.803168 containerd[1886]: time="2025-09-12T17:23:35.802301471Z" level=info msg="CreateContainer within sandbox \"204ee85b2370043c7cb6489708ab6475476d628461fe1bfa2209bbf97824f479\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 12 17:23:35.834215 containerd[1886]: time="2025-09-12T17:23:35.834187095Z" level=info msg="Container 1ea671821985b8387fc839073f92d1391381f5ce7c2bfb1433ea004b1117c448: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:23:35.855847 containerd[1886]: time="2025-09-12T17:23:35.855812043Z" level=info msg="CreateContainer within sandbox \"204ee85b2370043c7cb6489708ab6475476d628461fe1bfa2209bbf97824f479\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"1ea671821985b8387fc839073f92d1391381f5ce7c2bfb1433ea004b1117c448\""
Sep 12 17:23:35.856489 containerd[1886]: time="2025-09-12T17:23:35.856470556Z" level=info msg="StartContainer for \"1ea671821985b8387fc839073f92d1391381f5ce7c2bfb1433ea004b1117c448\""
Sep 12 17:23:35.857605 containerd[1886]: time="2025-09-12T17:23:35.857564170Z" level=info msg="connecting to shim 1ea671821985b8387fc839073f92d1391381f5ce7c2bfb1433ea004b1117c448" address="unix:///run/containerd/s/b82908a9d81d428a254c9cd52abbb60eed0e895684d60f4a7873456517172934" protocol=ttrpc version=3
Sep 12 17:23:35.874434 systemd[1]: Started cri-containerd-1ea671821985b8387fc839073f92d1391381f5ce7c2bfb1433ea004b1117c448.scope - libcontainer container 1ea671821985b8387fc839073f92d1391381f5ce7c2bfb1433ea004b1117c448.
Sep 12 17:23:35.903197 containerd[1886]: time="2025-09-12T17:23:35.902970395Z" level=info msg="StartContainer for \"1ea671821985b8387fc839073f92d1391381f5ce7c2bfb1433ea004b1117c448\" returns successfully"
Sep 12 17:23:35.960722 containerd[1886]: time="2025-09-12T17:23:35.960686999Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-dqbk8,Uid:7ebb8ba0-909e-4693-857b-980bda4743da,Namespace:tigera-operator,Attempt:0,}"
Sep 12 17:23:36.021397 containerd[1886]: time="2025-09-12T17:23:36.021352722Z" level=info msg="connecting to shim 7d7522d52bbd4c1ed6c73c2d396dcfe9d5ae79ab87a14b905c4becfd746c9d91" address="unix:///run/containerd/s/e77a4dc3f3351ea25e45fa432fb1f0c9c6b9c4028584518a129891a78bcc5c96" namespace=k8s.io protocol=ttrpc version=3
Sep 12 17:23:36.043262 systemd[1]: Started cri-containerd-7d7522d52bbd4c1ed6c73c2d396dcfe9d5ae79ab87a14b905c4becfd746c9d91.scope - libcontainer container 7d7522d52bbd4c1ed6c73c2d396dcfe9d5ae79ab87a14b905c4becfd746c9d91.
Sep 12 17:23:36.074619 containerd[1886]: time="2025-09-12T17:23:36.074584558Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-dqbk8,Uid:7ebb8ba0-909e-4693-857b-980bda4743da,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"7d7522d52bbd4c1ed6c73c2d396dcfe9d5ae79ab87a14b905c4becfd746c9d91\""
Sep 12 17:23:36.076324 containerd[1886]: time="2025-09-12T17:23:36.076296156Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 12 17:23:37.779201 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2634740341.mount: Deactivated successfully.
Sep 12 17:23:38.130782 containerd[1886]: time="2025-09-12T17:23:38.130741536Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:23:38.133998 containerd[1886]: time="2025-09-12T17:23:38.133965214Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365"
Sep 12 17:23:38.138197 containerd[1886]: time="2025-09-12T17:23:38.138157447Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:23:38.142969 containerd[1886]: time="2025-09-12T17:23:38.142927918Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:23:38.143490 containerd[1886]: time="2025-09-12T17:23:38.143214478Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.06689473s"
Sep 12 17:23:38.143490 containerd[1886]: time="2025-09-12T17:23:38.143240247Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\""
Sep 12 17:23:38.151870 containerd[1886]: time="2025-09-12T17:23:38.151845446Z" level=info msg="CreateContainer within sandbox \"7d7522d52bbd4c1ed6c73c2d396dcfe9d5ae79ab87a14b905c4becfd746c9d91\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 12 17:23:38.178919 containerd[1886]: time="2025-09-12T17:23:38.178889507Z" level=info msg="Container 7f6fd2a8a53206b7f551e55bc9def3945660a5a09b9907c3929fe8fd54460590: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:23:38.195376 containerd[1886]: time="2025-09-12T17:23:38.195331276Z" level=info msg="CreateContainer within sandbox \"7d7522d52bbd4c1ed6c73c2d396dcfe9d5ae79ab87a14b905c4becfd746c9d91\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"7f6fd2a8a53206b7f551e55bc9def3945660a5a09b9907c3929fe8fd54460590\""
Sep 12 17:23:38.195999 containerd[1886]: time="2025-09-12T17:23:38.195878427Z" level=info msg="StartContainer for \"7f6fd2a8a53206b7f551e55bc9def3945660a5a09b9907c3929fe8fd54460590\""
Sep 12 17:23:38.196869 containerd[1886]: time="2025-09-12T17:23:38.196850061Z" level=info msg="connecting to shim 7f6fd2a8a53206b7f551e55bc9def3945660a5a09b9907c3929fe8fd54460590" address="unix:///run/containerd/s/e77a4dc3f3351ea25e45fa432fb1f0c9c6b9c4028584518a129891a78bcc5c96" protocol=ttrpc version=3
Sep 12 17:23:38.212237 systemd[1]: Started cri-containerd-7f6fd2a8a53206b7f551e55bc9def3945660a5a09b9907c3929fe8fd54460590.scope - libcontainer container 7f6fd2a8a53206b7f551e55bc9def3945660a5a09b9907c3929fe8fd54460590.
Sep 12 17:23:38.235951 containerd[1886]: time="2025-09-12T17:23:38.235758368Z" level=info msg="StartContainer for \"7f6fd2a8a53206b7f551e55bc9def3945660a5a09b9907c3929fe8fd54460590\" returns successfully"
Sep 12 17:23:38.691688 kubelet[3450]: I0912 17:23:38.691408 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-pfxv8" podStartSLOduration=3.691392356 podStartE2EDuration="3.691392356s" podCreationTimestamp="2025-09-12 17:23:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:23:36.688543376 +0000 UTC m=+7.459290502" watchObservedRunningTime="2025-09-12 17:23:38.691392356 +0000 UTC m=+9.462139482"
Sep 12 17:23:41.026535 kubelet[3450]: I0912 17:23:41.026472 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-dqbk8" podStartSLOduration=3.95800297 podStartE2EDuration="6.026457182s" podCreationTimestamp="2025-09-12 17:23:35 +0000 UTC" firstStartedPulling="2025-09-12 17:23:36.075339802 +0000 UTC m=+6.846086928" lastFinishedPulling="2025-09-12 17:23:38.143794014 +0000 UTC m=+8.914541140" observedRunningTime="2025-09-12 17:23:38.692481993 +0000 UTC m=+9.463229119" watchObservedRunningTime="2025-09-12 17:23:41.026457182 +0000 UTC m=+11.797204308"
Sep 12 17:23:43.263934 sudo[2377]: pam_unix(sudo:session): session closed for user root
Sep 12 17:23:43.348650 sshd[2376]: Connection closed by 10.200.16.10 port 37114
Sep 12 17:23:43.348564 sshd-session[2373]: pam_unix(sshd:session): session closed for user core
Sep 12 17:23:43.353462 systemd[1]: sshd@6-10.200.20.44:22-10.200.16.10:37114.service: Deactivated successfully.
Sep 12 17:23:43.354959 systemd[1]: session-9.scope: Deactivated successfully.
Sep 12 17:23:43.355112 systemd[1]: session-9.scope: Consumed 4.146s CPU time, 221.1M memory peak.
Sep 12 17:23:43.357724 systemd-logind[1855]: Session 9 logged out. Waiting for processes to exit.
Sep 12 17:23:43.358847 systemd-logind[1855]: Removed session 9.
Sep 12 17:23:47.291522 systemd[1]: Created slice kubepods-besteffort-pode6b5cd69_4815_481e_bd5f_d872a621ffbd.slice - libcontainer container kubepods-besteffort-pode6b5cd69_4815_481e_bd5f_d872a621ffbd.slice.
Sep 12 17:23:47.313148 kubelet[3450]: I0912 17:23:47.313104 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6b5cd69-4815-481e-bd5f-d872a621ffbd-tigera-ca-bundle\") pod \"calico-typha-d78fcb9c9-4xlz7\" (UID: \"e6b5cd69-4815-481e-bd5f-d872a621ffbd\") " pod="calico-system/calico-typha-d78fcb9c9-4xlz7"
Sep 12 17:23:47.313470 kubelet[3450]: I0912 17:23:47.313170 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/e6b5cd69-4815-481e-bd5f-d872a621ffbd-typha-certs\") pod \"calico-typha-d78fcb9c9-4xlz7\" (UID: \"e6b5cd69-4815-481e-bd5f-d872a621ffbd\") " pod="calico-system/calico-typha-d78fcb9c9-4xlz7"
Sep 12 17:23:47.313470 kubelet[3450]: I0912 17:23:47.313190 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6742\" (UniqueName: \"kubernetes.io/projected/e6b5cd69-4815-481e-bd5f-d872a621ffbd-kube-api-access-c6742\") pod \"calico-typha-d78fcb9c9-4xlz7\" (UID: \"e6b5cd69-4815-481e-bd5f-d872a621ffbd\") " pod="calico-system/calico-typha-d78fcb9c9-4xlz7"
Sep 12 17:23:47.385819 systemd[1]: Created slice kubepods-besteffort-pod8e50b0d7_9159_4c8d_8657_baa12974b8dd.slice - libcontainer container kubepods-besteffort-pod8e50b0d7_9159_4c8d_8657_baa12974b8dd.slice.
Sep 12 17:23:47.414461 kubelet[3450]: I0912 17:23:47.414429 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/8e50b0d7-9159-4c8d-8657-baa12974b8dd-cni-log-dir\") pod \"calico-node-6t99w\" (UID: \"8e50b0d7-9159-4c8d-8657-baa12974b8dd\") " pod="calico-system/calico-node-6t99w" Sep 12 17:23:47.414461 kubelet[3450]: I0912 17:23:47.414468 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8e50b0d7-9159-4c8d-8657-baa12974b8dd-lib-modules\") pod \"calico-node-6t99w\" (UID: \"8e50b0d7-9159-4c8d-8657-baa12974b8dd\") " pod="calico-system/calico-node-6t99w" Sep 12 17:23:47.414583 kubelet[3450]: I0912 17:23:47.414479 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e50b0d7-9159-4c8d-8657-baa12974b8dd-tigera-ca-bundle\") pod \"calico-node-6t99w\" (UID: \"8e50b0d7-9159-4c8d-8657-baa12974b8dd\") " pod="calico-system/calico-node-6t99w" Sep 12 17:23:47.414583 kubelet[3450]: I0912 17:23:47.414497 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdjz7\" (UniqueName: \"kubernetes.io/projected/8e50b0d7-9159-4c8d-8657-baa12974b8dd-kube-api-access-gdjz7\") pod \"calico-node-6t99w\" (UID: \"8e50b0d7-9159-4c8d-8657-baa12974b8dd\") " pod="calico-system/calico-node-6t99w" Sep 12 17:23:47.414583 kubelet[3450]: I0912 17:23:47.414517 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8e50b0d7-9159-4c8d-8657-baa12974b8dd-xtables-lock\") pod \"calico-node-6t99w\" (UID: \"8e50b0d7-9159-4c8d-8657-baa12974b8dd\") " pod="calico-system/calico-node-6t99w" Sep 12 17:23:47.414583 kubelet[3450]: I0912 17:23:47.414533 
3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/8e50b0d7-9159-4c8d-8657-baa12974b8dd-node-certs\") pod \"calico-node-6t99w\" (UID: \"8e50b0d7-9159-4c8d-8657-baa12974b8dd\") " pod="calico-system/calico-node-6t99w" Sep 12 17:23:47.414583 kubelet[3450]: I0912 17:23:47.414543 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/8e50b0d7-9159-4c8d-8657-baa12974b8dd-policysync\") pod \"calico-node-6t99w\" (UID: \"8e50b0d7-9159-4c8d-8657-baa12974b8dd\") " pod="calico-system/calico-node-6t99w" Sep 12 17:23:47.414663 kubelet[3450]: I0912 17:23:47.414551 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/8e50b0d7-9159-4c8d-8657-baa12974b8dd-var-run-calico\") pod \"calico-node-6t99w\" (UID: \"8e50b0d7-9159-4c8d-8657-baa12974b8dd\") " pod="calico-system/calico-node-6t99w" Sep 12 17:23:47.414663 kubelet[3450]: I0912 17:23:47.414568 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/8e50b0d7-9159-4c8d-8657-baa12974b8dd-cni-bin-dir\") pod \"calico-node-6t99w\" (UID: \"8e50b0d7-9159-4c8d-8657-baa12974b8dd\") " pod="calico-system/calico-node-6t99w" Sep 12 17:23:47.414663 kubelet[3450]: I0912 17:23:47.414577 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/8e50b0d7-9159-4c8d-8657-baa12974b8dd-flexvol-driver-host\") pod \"calico-node-6t99w\" (UID: \"8e50b0d7-9159-4c8d-8657-baa12974b8dd\") " pod="calico-system/calico-node-6t99w" Sep 12 17:23:47.414663 kubelet[3450]: I0912 17:23:47.414589 3450 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/8e50b0d7-9159-4c8d-8657-baa12974b8dd-cni-net-dir\") pod \"calico-node-6t99w\" (UID: \"8e50b0d7-9159-4c8d-8657-baa12974b8dd\") " pod="calico-system/calico-node-6t99w" Sep 12 17:23:47.414663 kubelet[3450]: I0912 17:23:47.414598 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8e50b0d7-9159-4c8d-8657-baa12974b8dd-var-lib-calico\") pod \"calico-node-6t99w\" (UID: \"8e50b0d7-9159-4c8d-8657-baa12974b8dd\") " pod="calico-system/calico-node-6t99w" Sep 12 17:23:47.516756 kubelet[3450]: E0912 17:23:47.516711 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.516756 kubelet[3450]: W0912 17:23:47.516732 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.516756 kubelet[3450]: E0912 17:23:47.516758 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.521171 kubelet[3450]: E0912 17:23:47.520183 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.521171 kubelet[3450]: W0912 17:23:47.520203 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.521171 kubelet[3450]: E0912 17:23:47.520216 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.537152 kubelet[3450]: E0912 17:23:47.535912 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-76j79" podUID="8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae" Sep 12 17:23:47.542808 kubelet[3450]: E0912 17:23:47.541519 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.542808 kubelet[3450]: W0912 17:23:47.541532 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.542808 kubelet[3450]: E0912 17:23:47.541542 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.595811 containerd[1886]: time="2025-09-12T17:23:47.595776811Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-d78fcb9c9-4xlz7,Uid:e6b5cd69-4815-481e-bd5f-d872a621ffbd,Namespace:calico-system,Attempt:0,}" Sep 12 17:23:47.608166 kubelet[3450]: E0912 17:23:47.608147 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.608166 kubelet[3450]: W0912 17:23:47.608162 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.608253 kubelet[3450]: E0912 17:23:47.608174 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.608302 kubelet[3450]: E0912 17:23:47.608292 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.608325 kubelet[3450]: W0912 17:23:47.608298 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.608343 kubelet[3450]: E0912 17:23:47.608329 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.608482 kubelet[3450]: E0912 17:23:47.608470 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.608482 kubelet[3450]: W0912 17:23:47.608481 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.608536 kubelet[3450]: E0912 17:23:47.608489 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.608782 kubelet[3450]: E0912 17:23:47.608768 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.608782 kubelet[3450]: W0912 17:23:47.608780 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.608822 kubelet[3450]: E0912 17:23:47.608791 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.608947 kubelet[3450]: E0912 17:23:47.608936 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.608975 kubelet[3450]: W0912 17:23:47.608951 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.608975 kubelet[3450]: E0912 17:23:47.608959 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.609071 kubelet[3450]: E0912 17:23:47.609061 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.609071 kubelet[3450]: W0912 17:23:47.609068 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.609113 kubelet[3450]: E0912 17:23:47.609075 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.609191 kubelet[3450]: E0912 17:23:47.609181 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.609191 kubelet[3450]: W0912 17:23:47.609188 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.609241 kubelet[3450]: E0912 17:23:47.609194 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.609301 kubelet[3450]: E0912 17:23:47.609291 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.609301 kubelet[3450]: W0912 17:23:47.609297 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.609344 kubelet[3450]: E0912 17:23:47.609303 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.609464 kubelet[3450]: E0912 17:23:47.609454 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.609464 kubelet[3450]: W0912 17:23:47.609461 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.609507 kubelet[3450]: E0912 17:23:47.609467 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.609564 kubelet[3450]: E0912 17:23:47.609555 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.609564 kubelet[3450]: W0912 17:23:47.609561 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.609612 kubelet[3450]: E0912 17:23:47.609568 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.609665 kubelet[3450]: E0912 17:23:47.609656 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.609665 kubelet[3450]: W0912 17:23:47.609662 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.609708 kubelet[3450]: E0912 17:23:47.609667 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.609817 kubelet[3450]: E0912 17:23:47.609807 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.609817 kubelet[3450]: W0912 17:23:47.609814 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.609861 kubelet[3450]: E0912 17:23:47.609821 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.609928 kubelet[3450]: E0912 17:23:47.609919 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.609928 kubelet[3450]: W0912 17:23:47.609925 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.610016 kubelet[3450]: E0912 17:23:47.609931 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.610100 kubelet[3450]: E0912 17:23:47.610088 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.610100 kubelet[3450]: W0912 17:23:47.610098 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.610161 kubelet[3450]: E0912 17:23:47.610107 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.610366 kubelet[3450]: E0912 17:23:47.610233 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.610388 kubelet[3450]: W0912 17:23:47.610368 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.610388 kubelet[3450]: E0912 17:23:47.610380 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.610754 kubelet[3450]: E0912 17:23:47.610740 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.610754 kubelet[3450]: W0912 17:23:47.610752 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.610797 kubelet[3450]: E0912 17:23:47.610761 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.611281 kubelet[3450]: E0912 17:23:47.611265 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.611281 kubelet[3450]: W0912 17:23:47.611278 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.611373 kubelet[3450]: E0912 17:23:47.611288 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.611809 kubelet[3450]: E0912 17:23:47.611792 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.611809 kubelet[3450]: W0912 17:23:47.611805 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.611855 kubelet[3450]: E0912 17:23:47.611815 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.611961 kubelet[3450]: E0912 17:23:47.611950 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.611961 kubelet[3450]: W0912 17:23:47.611958 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.612011 kubelet[3450]: E0912 17:23:47.611965 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.612201 kubelet[3450]: E0912 17:23:47.612186 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.612222 kubelet[3450]: W0912 17:23:47.612199 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.612222 kubelet[3450]: E0912 17:23:47.612211 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.616485 kubelet[3450]: E0912 17:23:47.616379 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.616485 kubelet[3450]: W0912 17:23:47.616391 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.616485 kubelet[3450]: E0912 17:23:47.616402 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.616485 kubelet[3450]: I0912 17:23:47.616421 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae-registration-dir\") pod \"csi-node-driver-76j79\" (UID: \"8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae\") " pod="calico-system/csi-node-driver-76j79" Sep 12 17:23:47.616646 kubelet[3450]: E0912 17:23:47.616635 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.616744 kubelet[3450]: W0912 17:23:47.616690 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.616744 kubelet[3450]: E0912 17:23:47.616703 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.616744 kubelet[3450]: I0912 17:23:47.616723 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b42zm\" (UniqueName: \"kubernetes.io/projected/8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae-kube-api-access-b42zm\") pod \"csi-node-driver-76j79\" (UID: \"8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae\") " pod="calico-system/csi-node-driver-76j79" Sep 12 17:23:47.616917 kubelet[3450]: E0912 17:23:47.616896 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.616917 kubelet[3450]: W0912 17:23:47.616910 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.616967 kubelet[3450]: E0912 17:23:47.616920 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.617034 kubelet[3450]: E0912 17:23:47.617021 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.617034 kubelet[3450]: W0912 17:23:47.617030 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.617167 kubelet[3450]: E0912 17:23:47.617036 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.617321 kubelet[3450]: E0912 17:23:47.617305 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.617321 kubelet[3450]: W0912 17:23:47.617317 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.617362 kubelet[3450]: E0912 17:23:47.617328 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:47.617362 kubelet[3450]: I0912 17:23:47.617350 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae-socket-dir\") pod \"csi-node-driver-76j79\" (UID: \"8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae\") " pod="calico-system/csi-node-driver-76j79" Sep 12 17:23:47.617511 kubelet[3450]: E0912 17:23:47.617498 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.617511 kubelet[3450]: W0912 17:23:47.617508 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.617563 kubelet[3450]: E0912 17:23:47.617516 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.617563 kubelet[3450]: I0912 17:23:47.617531 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae-kubelet-dir\") pod \"csi-node-driver-76j79\" (UID: \"8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae\") " pod="calico-system/csi-node-driver-76j79" Sep 12 17:23:47.617650 kubelet[3450]: E0912 17:23:47.617636 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:47.617650 kubelet[3450]: W0912 17:23:47.617646 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:47.617797 kubelet[3450]: E0912 17:23:47.617653 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:47.617797 kubelet[3450]: I0912 17:23:47.617664 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae-varrun\") pod \"csi-node-driver-76j79\" (UID: \"8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae\") " pod="calico-system/csi-node-driver-76j79" [... repeated FlexVolume "driver call failed" / "Error dynamically probing plugins" messages omitted ...] Sep 12 17:23:47.655442 containerd[1886]: time="2025-09-12T17:23:47.655357119Z" level=info msg="connecting to shim be18bf845556d368811dd2a87c83bf1a9ba5527f1ad25bbc049a1db6662e46e4" address="unix:///run/containerd/s/af802cf80556150ec25174fcc0ec89dd0ccc375ca98f7709b73784627659531e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:23:47.677346 systemd[1]: Started cri-containerd-be18bf845556d368811dd2a87c83bf1a9ba5527f1ad25bbc049a1db6662e46e4.scope - libcontainer container be18bf845556d368811dd2a87c83bf1a9ba5527f1ad25bbc049a1db6662e46e4.
Sep 12 17:23:47.689560 containerd[1886]: time="2025-09-12T17:23:47.689366145Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6t99w,Uid:8e50b0d7-9159-4c8d-8657-baa12974b8dd,Namespace:calico-system,Attempt:0,}" [... repeated FlexVolume "driver call failed" / "Error dynamically probing plugins" messages omitted ...] Sep 12 17:23:47.728025 containerd[1886]: time="2025-09-12T17:23:47.727953254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-d78fcb9c9-4xlz7,Uid:e6b5cd69-4815-481e-bd5f-d872a621ffbd,Namespace:calico-system,Attempt:0,} returns sandbox id \"be18bf845556d368811dd2a87c83bf1a9ba5527f1ad25bbc049a1db6662e46e4\"" Sep 12 17:23:47.729832 containerd[1886]: time="2025-09-12T17:23:47.729726853Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" [... repeated FlexVolume "driver call failed" / "Error dynamically probing plugins" messages omitted ...] Sep 12 17:23:47.772952 containerd[1886]: time="2025-09-12T17:23:47.772648893Z" level=info msg="connecting to shim 031dfa125bb1d2bbe003521fa53fdce60a14d2e5f0220628ca04ea7016e53c40" address="unix:///run/containerd/s/3590e8a22d239fd08e8fdce5b83814aadfbf6bfd7f48fcf4556929b881fccb80" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:23:47.795258 systemd[1]: Started cri-containerd-031dfa125bb1d2bbe003521fa53fdce60a14d2e5f0220628ca04ea7016e53c40.scope - libcontainer container 031dfa125bb1d2bbe003521fa53fdce60a14d2e5f0220628ca04ea7016e53c40. Sep 12 17:23:47.821670 containerd[1886]: time="2025-09-12T17:23:47.821636703Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6t99w,Uid:8e50b0d7-9159-4c8d-8657-baa12974b8dd,Namespace:calico-system,Attempt:0,} returns sandbox id \"031dfa125bb1d2bbe003521fa53fdce60a14d2e5f0220628ca04ea7016e53c40\"" Sep 12 17:23:48.642743 kubelet[3450]: E0912 17:23:48.642696 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-76j79" podUID="8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae" Sep 12 17:23:49.256631 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3376275258.mount: Deactivated successfully.
Sep 12 17:23:49.709248 containerd[1886]: time="2025-09-12T17:23:49.709212936Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:23:49.715392 containerd[1886]: time="2025-09-12T17:23:49.715361515Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 12 17:23:49.719848 containerd[1886]: time="2025-09-12T17:23:49.719798778Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:23:49.727241 containerd[1886]: time="2025-09-12T17:23:49.727179334Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:23:49.728865 containerd[1886]: time="2025-09-12T17:23:49.727613010Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.997863037s" Sep 12 17:23:49.728865 containerd[1886]: time="2025-09-12T17:23:49.728798250Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 12 17:23:49.732601 containerd[1886]: time="2025-09-12T17:23:49.732447187Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 17:23:49.749569 containerd[1886]: time="2025-09-12T17:23:49.749550267Z" level=info msg="CreateContainer within sandbox \"be18bf845556d368811dd2a87c83bf1a9ba5527f1ad25bbc049a1db6662e46e4\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 17:23:49.774813 containerd[1886]: time="2025-09-12T17:23:49.774268182Z" level=info msg="Container 1f0398b9001109a87d35e8bb32a21abdd1d99c456f5011f2e38febbe5f04a8df: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:23:49.796875 containerd[1886]: time="2025-09-12T17:23:49.796790550Z" level=info msg="CreateContainer within sandbox \"be18bf845556d368811dd2a87c83bf1a9ba5527f1ad25bbc049a1db6662e46e4\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"1f0398b9001109a87d35e8bb32a21abdd1d99c456f5011f2e38febbe5f04a8df\"" Sep 12 17:23:49.797252 containerd[1886]: time="2025-09-12T17:23:49.797229930Z" level=info msg="StartContainer for \"1f0398b9001109a87d35e8bb32a21abdd1d99c456f5011f2e38febbe5f04a8df\"" Sep 12 17:23:49.798047 containerd[1886]: time="2025-09-12T17:23:49.797966565Z" level=info msg="connecting to shim 1f0398b9001109a87d35e8bb32a21abdd1d99c456f5011f2e38febbe5f04a8df" address="unix:///run/containerd/s/af802cf80556150ec25174fcc0ec89dd0ccc375ca98f7709b73784627659531e" protocol=ttrpc version=3 Sep 12 17:23:49.818270 systemd[1]: Started cri-containerd-1f0398b9001109a87d35e8bb32a21abdd1d99c456f5011f2e38febbe5f04a8df.scope - libcontainer container 1f0398b9001109a87d35e8bb32a21abdd1d99c456f5011f2e38febbe5f04a8df. 
Sep 12 17:23:49.856513 containerd[1886]: time="2025-09-12T17:23:49.856487957Z" level=info msg="StartContainer for \"1f0398b9001109a87d35e8bb32a21abdd1d99c456f5011f2e38febbe5f04a8df\" returns successfully" Sep 12 17:23:50.642698 kubelet[3450]: E0912 17:23:50.642651 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-76j79" podUID="8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae" Sep 12 17:23:50.717225 kubelet[3450]: I0912 17:23:50.716685 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-d78fcb9c9-4xlz7" podStartSLOduration=1.715367796 podStartE2EDuration="3.716673597s" podCreationTimestamp="2025-09-12 17:23:47 +0000 UTC" firstStartedPulling="2025-09-12 17:23:47.729517823 +0000 UTC m=+18.500264949" lastFinishedPulling="2025-09-12 17:23:49.730823624 +0000 UTC m=+20.501570750" observedRunningTime="2025-09-12 17:23:50.715941617 +0000 UTC m=+21.486688743" watchObservedRunningTime="2025-09-12 17:23:50.716673597 +0000 UTC m=+21.487420723" Sep 12 17:23:50.734249 kubelet[3450]: E0912 17:23:50.734232 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:50.734383 kubelet[3450]: W0912 17:23:50.734295 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:50.734383 kubelet[3450]: E0912 17:23:50.734311 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:50.734602 kubelet[3450]: E0912 17:23:50.734572 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:50.734685 kubelet[3450]: W0912 17:23:50.734582 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:50.734738 kubelet[3450]: E0912 17:23:50.734728 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:23:50.734965 kubelet[3450]: E0912 17:23:50.734920 3450 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:23:50.734965 kubelet[3450]: W0912 17:23:50.734930 3450 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:23:50.734965 kubelet[3450]: E0912 17:23:50.734938 3450 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:23:51.355009 containerd[1886]: time="2025-09-12T17:23:51.354538061Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:23:51.358784 containerd[1886]: time="2025-09-12T17:23:51.358762035Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 12 17:23:51.363974 containerd[1886]: time="2025-09-12T17:23:51.363929562Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:23:51.368991 containerd[1886]: time="2025-09-12T17:23:51.368966326Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:23:51.369488 containerd[1886]: time="2025-09-12T17:23:51.369333764Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.636863017s" Sep 12 17:23:51.369488 containerd[1886]: time="2025-09-12T17:23:51.369432846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 12 17:23:51.378878 containerd[1886]: time="2025-09-12T17:23:51.378559615Z" level=info msg="CreateContainer within sandbox \"031dfa125bb1d2bbe003521fa53fdce60a14d2e5f0220628ca04ea7016e53c40\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 17:23:51.406694 containerd[1886]: time="2025-09-12T17:23:51.406669669Z" level=info msg="Container b1bb50e35656ea5e65aff774556d6ac321b5546f714c042a960d2e82bbde90bb: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:23:51.427725 containerd[1886]: time="2025-09-12T17:23:51.427694157Z" level=info msg="CreateContainer within sandbox \"031dfa125bb1d2bbe003521fa53fdce60a14d2e5f0220628ca04ea7016e53c40\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"b1bb50e35656ea5e65aff774556d6ac321b5546f714c042a960d2e82bbde90bb\"" Sep 12 17:23:51.428792 containerd[1886]: time="2025-09-12T17:23:51.428766847Z" level=info msg="StartContainer for \"b1bb50e35656ea5e65aff774556d6ac321b5546f714c042a960d2e82bbde90bb\"" Sep 12 17:23:51.431134 containerd[1886]: time="2025-09-12T17:23:51.431054165Z" level=info msg="connecting to shim b1bb50e35656ea5e65aff774556d6ac321b5546f714c042a960d2e82bbde90bb" address="unix:///run/containerd/s/3590e8a22d239fd08e8fdce5b83814aadfbf6bfd7f48fcf4556929b881fccb80" protocol=ttrpc version=3 Sep 12 17:23:51.451285 systemd[1]: Started cri-containerd-b1bb50e35656ea5e65aff774556d6ac321b5546f714c042a960d2e82bbde90bb.scope - libcontainer container b1bb50e35656ea5e65aff774556d6ac321b5546f714c042a960d2e82bbde90bb. Sep 12 17:23:51.493364 containerd[1886]: time="2025-09-12T17:23:51.493322351Z" level=info msg="StartContainer for \"b1bb50e35656ea5e65aff774556d6ac321b5546f714c042a960d2e82bbde90bb\" returns successfully" Sep 12 17:23:51.496977 systemd[1]: cri-containerd-b1bb50e35656ea5e65aff774556d6ac321b5546f714c042a960d2e82bbde90bb.scope: Deactivated successfully. 
Sep 12 17:23:51.498271 containerd[1886]: time="2025-09-12T17:23:51.498233953Z" level=info msg="received exit event container_id:\"b1bb50e35656ea5e65aff774556d6ac321b5546f714c042a960d2e82bbde90bb\" id:\"b1bb50e35656ea5e65aff774556d6ac321b5546f714c042a960d2e82bbde90bb\" pid:4116 exited_at:{seconds:1757697831 nanos:497742809}" Sep 12 17:23:51.499331 containerd[1886]: time="2025-09-12T17:23:51.499298211Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b1bb50e35656ea5e65aff774556d6ac321b5546f714c042a960d2e82bbde90bb\" id:\"b1bb50e35656ea5e65aff774556d6ac321b5546f714c042a960d2e82bbde90bb\" pid:4116 exited_at:{seconds:1757697831 nanos:497742809}" Sep 12 17:23:51.517145 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b1bb50e35656ea5e65aff774556d6ac321b5546f714c042a960d2e82bbde90bb-rootfs.mount: Deactivated successfully. Sep 12 17:23:51.705977 kubelet[3450]: I0912 17:23:51.705911 3450 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:23:52.642721 kubelet[3450]: E0912 17:23:52.642669 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-76j79" podUID="8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae" Sep 12 17:23:52.713161 containerd[1886]: time="2025-09-12T17:23:52.712433699Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 17:23:54.642932 kubelet[3450]: E0912 17:23:54.642884 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-76j79" podUID="8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae" Sep 12 17:23:55.481883 containerd[1886]: time="2025-09-12T17:23:55.481430737Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:23:55.488390 containerd[1886]: time="2025-09-12T17:23:55.488361096Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 12 17:23:55.492772 containerd[1886]: time="2025-09-12T17:23:55.492751299Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:23:55.499421 containerd[1886]: time="2025-09-12T17:23:55.499389411Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:23:55.499825 containerd[1886]: time="2025-09-12T17:23:55.499804286Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.787337218s" Sep 12 17:23:55.499900 containerd[1886]: time="2025-09-12T17:23:55.499888512Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 12 17:23:55.508063 containerd[1886]: time="2025-09-12T17:23:55.508041959Z" level=info msg="CreateContainer within sandbox \"031dfa125bb1d2bbe003521fa53fdce60a14d2e5f0220628ca04ea7016e53c40\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 17:23:55.536439 containerd[1886]: time="2025-09-12T17:23:55.536415508Z" level=info msg="Container 2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e: CDI devices from CRI 
Config.CDIDevices: []" Sep 12 17:23:55.557727 containerd[1886]: time="2025-09-12T17:23:55.557608140Z" level=info msg="CreateContainer within sandbox \"031dfa125bb1d2bbe003521fa53fdce60a14d2e5f0220628ca04ea7016e53c40\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e\"" Sep 12 17:23:55.558044 containerd[1886]: time="2025-09-12T17:23:55.557938884Z" level=info msg="StartContainer for \"2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e\"" Sep 12 17:23:55.559340 containerd[1886]: time="2025-09-12T17:23:55.559321001Z" level=info msg="connecting to shim 2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e" address="unix:///run/containerd/s/3590e8a22d239fd08e8fdce5b83814aadfbf6bfd7f48fcf4556929b881fccb80" protocol=ttrpc version=3 Sep 12 17:23:55.577262 systemd[1]: Started cri-containerd-2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e.scope - libcontainer container 2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e. 
Sep 12 17:23:55.608755 containerd[1886]: time="2025-09-12T17:23:55.608684416Z" level=info msg="StartContainer for \"2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e\" returns successfully" Sep 12 17:23:56.642797 kubelet[3450]: E0912 17:23:56.642746 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-76j79" podUID="8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae" Sep 12 17:23:56.734562 containerd[1886]: time="2025-09-12T17:23:56.734468304Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:23:56.737835 containerd[1886]: time="2025-09-12T17:23:56.737804408Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e\" id:\"2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e\" pid:4176 exited_at:{seconds:1757697836 nanos:737564753}" Sep 12 17:23:56.737917 containerd[1886]: time="2025-09-12T17:23:56.737857257Z" level=info msg="received exit event container_id:\"2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e\" id:\"2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e\" pid:4176 exited_at:{seconds:1757697836 nanos:737564753}" Sep 12 17:23:56.738059 systemd[1]: cri-containerd-2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e.scope: Deactivated successfully. Sep 12 17:23:56.738711 systemd[1]: cri-containerd-2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e.scope: Consumed 314ms CPU time, 185.1M memory peak, 165.8M written to disk. 
Sep 12 17:23:56.754782 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2f5d7a24a0bda0d010714fda8317579431547cb018df81a944e9067e44c92a8e-rootfs.mount: Deactivated successfully. Sep 12 17:23:56.820699 kubelet[3450]: I0912 17:23:56.820499 3450 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 17:23:57.592741 systemd[1]: Created slice kubepods-burstable-pod9a5be5b5_06ad_4760_958c_44bbde584588.slice - libcontainer container kubepods-burstable-pod9a5be5b5_06ad_4760_958c_44bbde584588.slice. Sep 12 17:23:57.612470 systemd[1]: Created slice kubepods-burstable-podccd2bbe7_dc74_41c7_8369_074eb02b14c4.slice - libcontainer container kubepods-burstable-podccd2bbe7_dc74_41c7_8369_074eb02b14c4.slice. Sep 12 17:23:57.623479 systemd[1]: Created slice kubepods-besteffort-pod5c1c8cbf_c9bc_4bb5_9f61_27755b4a4152.slice - libcontainer container kubepods-besteffort-pod5c1c8cbf_c9bc_4bb5_9f61_27755b4a4152.slice. Sep 12 17:23:57.631386 systemd[1]: Created slice kubepods-besteffort-pod22a9e9f7_3d62_4a87_b321_c5550b85dec3.slice - libcontainer container kubepods-besteffort-pod22a9e9f7_3d62_4a87_b321_c5550b85dec3.slice. Sep 12 17:23:57.637203 systemd[1]: Created slice kubepods-besteffort-pod22071108_d5a7_42cd_8910_06a64e029d36.slice - libcontainer container kubepods-besteffort-pod22071108_d5a7_42cd_8910_06a64e029d36.slice. Sep 12 17:23:57.649783 systemd[1]: Created slice kubepods-besteffort-pod7a27628a_082d_4c3d_904c_c73dd3edd18e.slice - libcontainer container kubepods-besteffort-pod7a27628a_082d_4c3d_904c_c73dd3edd18e.slice. Sep 12 17:23:57.655665 systemd[1]: Created slice kubepods-besteffort-pod012d6e27_14a7_44d7_b5f3_972242703ab4.slice - libcontainer container kubepods-besteffort-pod012d6e27_14a7_44d7_b5f3_972242703ab4.slice. 
Sep 12 17:23:57.684226 kubelet[3450]: I0912 17:23:57.684191 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/22071108-d5a7-42cd-8910-06a64e029d36-whisker-backend-key-pair\") pod \"whisker-77f74795d-r56qb\" (UID: \"22071108-d5a7-42cd-8910-06a64e029d36\") " pod="calico-system/whisker-77f74795d-r56qb"
Sep 12 17:23:57.684226 kubelet[3450]: I0912 17:23:57.684228 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr4s5\" (UniqueName: \"kubernetes.io/projected/ccd2bbe7-dc74-41c7-8369-074eb02b14c4-kube-api-access-cr4s5\") pod \"coredns-674b8bbfcf-q58t7\" (UID: \"ccd2bbe7-dc74-41c7-8369-074eb02b14c4\") " pod="kube-system/coredns-674b8bbfcf-q58t7"
Sep 12 17:23:57.684510 kubelet[3450]: I0912 17:23:57.684240 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a9e9f7-3d62-4a87-b321-c5550b85dec3-config\") pod \"goldmane-54d579b49d-m5hn4\" (UID: \"22a9e9f7-3d62-4a87-b321-c5550b85dec3\") " pod="calico-system/goldmane-54d579b49d-m5hn4"
Sep 12 17:23:57.684510 kubelet[3450]: I0912 17:23:57.684257 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8ddn\" (UniqueName: \"kubernetes.io/projected/22a9e9f7-3d62-4a87-b321-c5550b85dec3-kube-api-access-n8ddn\") pod \"goldmane-54d579b49d-m5hn4\" (UID: \"22a9e9f7-3d62-4a87-b321-c5550b85dec3\") " pod="calico-system/goldmane-54d579b49d-m5hn4"
Sep 12 17:23:57.684510 kubelet[3450]: I0912 17:23:57.684269 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7a27628a-082d-4c3d-904c-c73dd3edd18e-calico-apiserver-certs\") pod \"calico-apiserver-6b99b7b99b-gplt4\" (UID: \"7a27628a-082d-4c3d-904c-c73dd3edd18e\") " pod="calico-apiserver/calico-apiserver-6b99b7b99b-gplt4"
Sep 12 17:23:57.684510 kubelet[3450]: I0912 17:23:57.684278 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/012d6e27-14a7-44d7-b5f3-972242703ab4-tigera-ca-bundle\") pod \"calico-kube-controllers-996756887-nfmh2\" (UID: \"012d6e27-14a7-44d7-b5f3-972242703ab4\") " pod="calico-system/calico-kube-controllers-996756887-nfmh2"
Sep 12 17:23:57.684510 kubelet[3450]: I0912 17:23:57.684287 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a5be5b5-06ad-4760-958c-44bbde584588-config-volume\") pod \"coredns-674b8bbfcf-l8q56\" (UID: \"9a5be5b5-06ad-4760-958c-44bbde584588\") " pod="kube-system/coredns-674b8bbfcf-l8q56"
Sep 12 17:23:57.684592 kubelet[3450]: I0912 17:23:57.684300 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22071108-d5a7-42cd-8910-06a64e029d36-whisker-ca-bundle\") pod \"whisker-77f74795d-r56qb\" (UID: \"22071108-d5a7-42cd-8910-06a64e029d36\") " pod="calico-system/whisker-77f74795d-r56qb"
Sep 12 17:23:57.684592 kubelet[3450]: I0912 17:23:57.684312 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152-calico-apiserver-certs\") pod \"calico-apiserver-6b99b7b99b-58b54\" (UID: \"5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152\") " pod="calico-apiserver/calico-apiserver-6b99b7b99b-58b54"
Sep 12 17:23:57.684592 kubelet[3450]: I0912 17:23:57.684322 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tql4\" (UniqueName: \"kubernetes.io/projected/9a5be5b5-06ad-4760-958c-44bbde584588-kube-api-access-7tql4\") pod \"coredns-674b8bbfcf-l8q56\" (UID: \"9a5be5b5-06ad-4760-958c-44bbde584588\") " pod="kube-system/coredns-674b8bbfcf-l8q56"
Sep 12 17:23:57.684592 kubelet[3450]: I0912 17:23:57.684331 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22a9e9f7-3d62-4a87-b321-c5550b85dec3-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-m5hn4\" (UID: \"22a9e9f7-3d62-4a87-b321-c5550b85dec3\") " pod="calico-system/goldmane-54d579b49d-m5hn4"
Sep 12 17:23:57.684592 kubelet[3450]: I0912 17:23:57.684342 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxzcq\" (UniqueName: \"kubernetes.io/projected/012d6e27-14a7-44d7-b5f3-972242703ab4-kube-api-access-mxzcq\") pod \"calico-kube-controllers-996756887-nfmh2\" (UID: \"012d6e27-14a7-44d7-b5f3-972242703ab4\") " pod="calico-system/calico-kube-controllers-996756887-nfmh2"
Sep 12 17:23:57.684668 kubelet[3450]: I0912 17:23:57.684351 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ccd2bbe7-dc74-41c7-8369-074eb02b14c4-config-volume\") pod \"coredns-674b8bbfcf-q58t7\" (UID: \"ccd2bbe7-dc74-41c7-8369-074eb02b14c4\") " pod="kube-system/coredns-674b8bbfcf-q58t7"
Sep 12 17:23:57.684668 kubelet[3450]: I0912 17:23:57.684360 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dsxs\" (UniqueName: \"kubernetes.io/projected/5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152-kube-api-access-9dsxs\") pod \"calico-apiserver-6b99b7b99b-58b54\" (UID: \"5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152\") " pod="calico-apiserver/calico-apiserver-6b99b7b99b-58b54"
Sep 12 17:23:57.684668 kubelet[3450]: I0912 17:23:57.684373 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddgjw\" (UniqueName: \"kubernetes.io/projected/22071108-d5a7-42cd-8910-06a64e029d36-kube-api-access-ddgjw\") pod \"whisker-77f74795d-r56qb\" (UID: \"22071108-d5a7-42cd-8910-06a64e029d36\") " pod="calico-system/whisker-77f74795d-r56qb"
Sep 12 17:23:57.684668 kubelet[3450]: I0912 17:23:57.684382 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgwbt\" (UniqueName: \"kubernetes.io/projected/7a27628a-082d-4c3d-904c-c73dd3edd18e-kube-api-access-kgwbt\") pod \"calico-apiserver-6b99b7b99b-gplt4\" (UID: \"7a27628a-082d-4c3d-904c-c73dd3edd18e\") " pod="calico-apiserver/calico-apiserver-6b99b7b99b-gplt4"
Sep 12 17:23:57.684668 kubelet[3450]: I0912 17:23:57.684392 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/22a9e9f7-3d62-4a87-b321-c5550b85dec3-goldmane-key-pair\") pod \"goldmane-54d579b49d-m5hn4\" (UID: \"22a9e9f7-3d62-4a87-b321-c5550b85dec3\") " pod="calico-system/goldmane-54d579b49d-m5hn4"
Sep 12 17:23:57.722825 containerd[1886]: time="2025-09-12T17:23:57.722779123Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\""
Sep 12 17:23:57.896568 containerd[1886]: time="2025-09-12T17:23:57.896531430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-l8q56,Uid:9a5be5b5-06ad-4760-958c-44bbde584588,Namespace:kube-system,Attempt:0,}"
Sep 12 17:23:57.899379 kubelet[3450]: I0912 17:23:57.899079 3450 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 17:23:57.922305 containerd[1886]: time="2025-09-12T17:23:57.922277326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-q58t7,Uid:ccd2bbe7-dc74-41c7-8369-074eb02b14c4,Namespace:kube-system,Attempt:0,}"
Sep 12 17:23:57.929901 containerd[1886]: time="2025-09-12T17:23:57.929879151Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b99b7b99b-58b54,Uid:5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152,Namespace:calico-apiserver,Attempt:0,}"
Sep 12 17:23:57.936714 containerd[1886]: time="2025-09-12T17:23:57.936686138Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-m5hn4,Uid:22a9e9f7-3d62-4a87-b321-c5550b85dec3,Namespace:calico-system,Attempt:0,}"
Sep 12 17:23:57.940681 containerd[1886]: time="2025-09-12T17:23:57.940652699Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-77f74795d-r56qb,Uid:22071108-d5a7-42cd-8910-06a64e029d36,Namespace:calico-system,Attempt:0,}"
Sep 12 17:23:57.953908 containerd[1886]: time="2025-09-12T17:23:57.953878376Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b99b7b99b-gplt4,Uid:7a27628a-082d-4c3d-904c-c73dd3edd18e,Namespace:calico-apiserver,Attempt:0,}"
Sep 12 17:23:57.962324 containerd[1886]: time="2025-09-12T17:23:57.962269790Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-996756887-nfmh2,Uid:012d6e27-14a7-44d7-b5f3-972242703ab4,Namespace:calico-system,Attempt:0,}"
Sep 12 17:23:57.966739 containerd[1886]: time="2025-09-12T17:23:57.966675210Z" level=error msg="Failed to destroy network for sandbox \"a9cb536f63e48ad39103eae7980c18896cc18f829a1a85584bb408022e6c679c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.011590 containerd[1886]: time="2025-09-12T17:23:58.011391006Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-l8q56,Uid:9a5be5b5-06ad-4760-958c-44bbde584588,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9cb536f63e48ad39103eae7980c18896cc18f829a1a85584bb408022e6c679c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.012245 kubelet[3450]: E0912 17:23:58.012203 3450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9cb536f63e48ad39103eae7980c18896cc18f829a1a85584bb408022e6c679c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.012606 kubelet[3450]: E0912 17:23:58.012269 3450 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9cb536f63e48ad39103eae7980c18896cc18f829a1a85584bb408022e6c679c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-l8q56"
Sep 12 17:23:58.012606 kubelet[3450]: E0912 17:23:58.012288 3450 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9cb536f63e48ad39103eae7980c18896cc18f829a1a85584bb408022e6c679c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-l8q56"
Sep 12 17:23:58.012606 kubelet[3450]: E0912 17:23:58.012356 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-l8q56_kube-system(9a5be5b5-06ad-4760-958c-44bbde584588)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-l8q56_kube-system(9a5be5b5-06ad-4760-958c-44bbde584588)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a9cb536f63e48ad39103eae7980c18896cc18f829a1a85584bb408022e6c679c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-l8q56" podUID="9a5be5b5-06ad-4760-958c-44bbde584588"
Sep 12 17:23:58.058620 containerd[1886]: time="2025-09-12T17:23:58.057353580Z" level=error msg="Failed to destroy network for sandbox \"4044d2c51cf759ef45502475b5a5c38973cf353ec904267f0f9ad34c0fd5bf78\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.066696 containerd[1886]: time="2025-09-12T17:23:58.066260623Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-q58t7,Uid:ccd2bbe7-dc74-41c7-8369-074eb02b14c4,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4044d2c51cf759ef45502475b5a5c38973cf353ec904267f0f9ad34c0fd5bf78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.067941 kubelet[3450]: E0912 17:23:58.067875 3450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4044d2c51cf759ef45502475b5a5c38973cf353ec904267f0f9ad34c0fd5bf78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.068517 kubelet[3450]: E0912 17:23:58.068373 3450 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4044d2c51cf759ef45502475b5a5c38973cf353ec904267f0f9ad34c0fd5bf78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-q58t7"
Sep 12 17:23:58.068517 kubelet[3450]: E0912 17:23:58.068398 3450 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4044d2c51cf759ef45502475b5a5c38973cf353ec904267f0f9ad34c0fd5bf78\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-q58t7"
Sep 12 17:23:58.068517 kubelet[3450]: E0912 17:23:58.068445 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-q58t7_kube-system(ccd2bbe7-dc74-41c7-8369-074eb02b14c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-q58t7_kube-system(ccd2bbe7-dc74-41c7-8369-074eb02b14c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4044d2c51cf759ef45502475b5a5c38973cf353ec904267f0f9ad34c0fd5bf78\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-q58t7" podUID="ccd2bbe7-dc74-41c7-8369-074eb02b14c4"
Sep 12 17:23:58.083445 containerd[1886]: time="2025-09-12T17:23:58.083418212Z" level=error msg="Failed to destroy network for sandbox \"d690d05403df906c9fc6c15059e0163dfd8678eecf1a7c5a51aebd4aeec91ca6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.089208 containerd[1886]: time="2025-09-12T17:23:58.088285364Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b99b7b99b-58b54,Uid:5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d690d05403df906c9fc6c15059e0163dfd8678eecf1a7c5a51aebd4aeec91ca6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.089529 kubelet[3450]: E0912 17:23:58.089453 3450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d690d05403df906c9fc6c15059e0163dfd8678eecf1a7c5a51aebd4aeec91ca6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.089529 kubelet[3450]: E0912 17:23:58.089500 3450 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d690d05403df906c9fc6c15059e0163dfd8678eecf1a7c5a51aebd4aeec91ca6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b99b7b99b-58b54"
Sep 12 17:23:58.089529 kubelet[3450]: E0912 17:23:58.089514 3450 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d690d05403df906c9fc6c15059e0163dfd8678eecf1a7c5a51aebd4aeec91ca6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b99b7b99b-58b54"
Sep 12 17:23:58.089640 kubelet[3450]: E0912 17:23:58.089558 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b99b7b99b-58b54_calico-apiserver(5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b99b7b99b-58b54_calico-apiserver(5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d690d05403df906c9fc6c15059e0163dfd8678eecf1a7c5a51aebd4aeec91ca6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b99b7b99b-58b54" podUID="5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152"
Sep 12 17:23:58.098376 containerd[1886]: time="2025-09-12T17:23:58.098329454Z" level=error msg="Failed to destroy network for sandbox \"b7994fc697c1d8fb524ae16eb6f05d8b8336ecc4aa636a0d00b65ed0f5080b56\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.103537 containerd[1886]: time="2025-09-12T17:23:58.103435580Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-m5hn4,Uid:22a9e9f7-3d62-4a87-b321-c5550b85dec3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7994fc697c1d8fb524ae16eb6f05d8b8336ecc4aa636a0d00b65ed0f5080b56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.103896 kubelet[3450]: E0912 17:23:58.103637 3450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7994fc697c1d8fb524ae16eb6f05d8b8336ecc4aa636a0d00b65ed0f5080b56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.104034 kubelet[3450]: E0912 17:23:58.103907 3450 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7994fc697c1d8fb524ae16eb6f05d8b8336ecc4aa636a0d00b65ed0f5080b56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-m5hn4"
Sep 12 17:23:58.104034 kubelet[3450]: E0912 17:23:58.103922 3450 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7994fc697c1d8fb524ae16eb6f05d8b8336ecc4aa636a0d00b65ed0f5080b56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-m5hn4"
Sep 12 17:23:58.104034 kubelet[3450]: E0912 17:23:58.103964 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-m5hn4_calico-system(22a9e9f7-3d62-4a87-b321-c5550b85dec3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-m5hn4_calico-system(22a9e9f7-3d62-4a87-b321-c5550b85dec3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b7994fc697c1d8fb524ae16eb6f05d8b8336ecc4aa636a0d00b65ed0f5080b56\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-m5hn4" podUID="22a9e9f7-3d62-4a87-b321-c5550b85dec3"
Sep 12 17:23:58.114351 containerd[1886]: time="2025-09-12T17:23:58.114326444Z" level=error msg="Failed to destroy network for sandbox \"1dfce064f79b212acf070f74910f9f1eef5238b2413f5adf3df316afe3afbc67\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.116316 containerd[1886]: time="2025-09-12T17:23:58.116137308Z" level=error msg="Failed to destroy network for sandbox \"21e7560442e3754212b9e54c37bfef9191235ed0563e700a3dbc7f18087ab0d9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.130533 containerd[1886]: time="2025-09-12T17:23:58.130499487Z" level=error msg="Failed to destroy network for sandbox \"5f10107030caa382c4e76c899fb854570003c5fa93f7a600dfc1ca2db99e4c65\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.140982 containerd[1886]: time="2025-09-12T17:23:58.140954603Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-996756887-nfmh2,Uid:012d6e27-14a7-44d7-b5f3-972242703ab4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1dfce064f79b212acf070f74910f9f1eef5238b2413f5adf3df316afe3afbc67\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.141354 kubelet[3450]: E0912 17:23:58.141328 3450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1dfce064f79b212acf070f74910f9f1eef5238b2413f5adf3df316afe3afbc67\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.141433 kubelet[3450]: E0912 17:23:58.141365 3450 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1dfce064f79b212acf070f74910f9f1eef5238b2413f5adf3df316afe3afbc67\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-996756887-nfmh2"
Sep 12 17:23:58.141433 kubelet[3450]: E0912 17:23:58.141382 3450 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1dfce064f79b212acf070f74910f9f1eef5238b2413f5adf3df316afe3afbc67\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-996756887-nfmh2"
Sep 12 17:23:58.141433 kubelet[3450]: E0912 17:23:58.141411 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-996756887-nfmh2_calico-system(012d6e27-14a7-44d7-b5f3-972242703ab4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-996756887-nfmh2_calico-system(012d6e27-14a7-44d7-b5f3-972242703ab4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1dfce064f79b212acf070f74910f9f1eef5238b2413f5adf3df316afe3afbc67\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-996756887-nfmh2" podUID="012d6e27-14a7-44d7-b5f3-972242703ab4"
Sep 12 17:23:58.149788 containerd[1886]: time="2025-09-12T17:23:58.149710242Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-77f74795d-r56qb,Uid:22071108-d5a7-42cd-8910-06a64e029d36,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"21e7560442e3754212b9e54c37bfef9191235ed0563e700a3dbc7f18087ab0d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.150917 kubelet[3450]: E0912 17:23:58.150402 3450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21e7560442e3754212b9e54c37bfef9191235ed0563e700a3dbc7f18087ab0d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.150917 kubelet[3450]: E0912 17:23:58.150433 3450 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21e7560442e3754212b9e54c37bfef9191235ed0563e700a3dbc7f18087ab0d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-77f74795d-r56qb"
Sep 12 17:23:58.150917 kubelet[3450]: E0912 17:23:58.150457 3450 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21e7560442e3754212b9e54c37bfef9191235ed0563e700a3dbc7f18087ab0d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-77f74795d-r56qb"
Sep 12 17:23:58.151025 kubelet[3450]: E0912 17:23:58.150485 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-77f74795d-r56qb_calico-system(22071108-d5a7-42cd-8910-06a64e029d36)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-77f74795d-r56qb_calico-system(22071108-d5a7-42cd-8910-06a64e029d36)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"21e7560442e3754212b9e54c37bfef9191235ed0563e700a3dbc7f18087ab0d9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-77f74795d-r56qb" podUID="22071108-d5a7-42cd-8910-06a64e029d36"
Sep 12 17:23:58.154102 containerd[1886]: time="2025-09-12T17:23:58.154072693Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b99b7b99b-gplt4,Uid:7a27628a-082d-4c3d-904c-c73dd3edd18e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f10107030caa382c4e76c899fb854570003c5fa93f7a600dfc1ca2db99e4c65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.154358 kubelet[3450]: E0912 17:23:58.154340 3450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f10107030caa382c4e76c899fb854570003c5fa93f7a600dfc1ca2db99e4c65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.154461 kubelet[3450]: E0912 17:23:58.154448 3450 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f10107030caa382c4e76c899fb854570003c5fa93f7a600dfc1ca2db99e4c65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b99b7b99b-gplt4"
Sep 12 17:23:58.154545 kubelet[3450]: E0912 17:23:58.154515 3450 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f10107030caa382c4e76c899fb854570003c5fa93f7a600dfc1ca2db99e4c65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b99b7b99b-gplt4"
Sep 12 17:23:58.154623 kubelet[3450]: E0912 17:23:58.154604 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b99b7b99b-gplt4_calico-apiserver(7a27628a-082d-4c3d-904c-c73dd3edd18e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b99b7b99b-gplt4_calico-apiserver(7a27628a-082d-4c3d-904c-c73dd3edd18e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5f10107030caa382c4e76c899fb854570003c5fa93f7a600dfc1ca2db99e4c65\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b99b7b99b-gplt4" podUID="7a27628a-082d-4c3d-904c-c73dd3edd18e"
Sep 12 17:23:58.647200 systemd[1]: Created slice kubepods-besteffort-pod8f03c06d_da1b_4e3e_bc09_a87a7cbc90ae.slice - libcontainer container kubepods-besteffort-pod8f03c06d_da1b_4e3e_bc09_a87a7cbc90ae.slice.
Sep 12 17:23:58.649272 containerd[1886]: time="2025-09-12T17:23:58.649078816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-76j79,Uid:8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae,Namespace:calico-system,Attempt:0,}"
Sep 12 17:23:58.708707 containerd[1886]: time="2025-09-12T17:23:58.708676131Z" level=error msg="Failed to destroy network for sandbox \"5a44e07eab4a0287f2e98f5ffb905694c8aefade3a28c3a91a7ccde1229931ce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.712990 containerd[1886]: time="2025-09-12T17:23:58.712503634Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-76j79,Uid:8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a44e07eab4a0287f2e98f5ffb905694c8aefade3a28c3a91a7ccde1229931ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.713196 kubelet[3450]: E0912 17:23:58.712661 3450 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a44e07eab4a0287f2e98f5ffb905694c8aefade3a28c3a91a7ccde1229931ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:23:58.713196 kubelet[3450]: E0912 17:23:58.712701 3450 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a44e07eab4a0287f2e98f5ffb905694c8aefade3a28c3a91a7ccde1229931ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-76j79"
Sep 12 17:23:58.713196 kubelet[3450]: E0912 17:23:58.712736 3450 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a44e07eab4a0287f2e98f5ffb905694c8aefade3a28c3a91a7ccde1229931ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-76j79"
Sep 12 17:23:58.713508 kubelet[3450]: E0912 17:23:58.712769 3450 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-76j79_calico-system(8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-76j79_calico-system(8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5a44e07eab4a0287f2e98f5ffb905694c8aefade3a28c3a91a7ccde1229931ce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-76j79" podUID="8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae"
Sep 12 17:24:01.626058 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount259450972.mount: Deactivated successfully.
Sep 12 17:24:01.954477 containerd[1886]: time="2025-09-12T17:24:01.954346490Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:01.964190 containerd[1886]: time="2025-09-12T17:24:01.964148801Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457"
Sep 12 17:24:01.968711 containerd[1886]: time="2025-09-12T17:24:01.968666971Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:01.976573 containerd[1886]: time="2025-09-12T17:24:01.976528761Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:01.977262 containerd[1886]: time="2025-09-12T17:24:01.976868905Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 4.254049341s"
Sep 12 17:24:01.977262 containerd[1886]: time="2025-09-12T17:24:01.977220754Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\""
Sep 12 17:24:01.999971 containerd[1886]: time="2025-09-12T17:24:01.999940383Z" level=info msg="CreateContainer within sandbox \"031dfa125bb1d2bbe003521fa53fdce60a14d2e5f0220628ca04ea7016e53c40\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Sep 12 17:24:02.033010 containerd[1886]: time="2025-09-12T17:24:02.032979903Z" level=info msg="Container 5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:24:02.055786 containerd[1886]: time="2025-09-12T17:24:02.055748629Z" level=info msg="CreateContainer within sandbox \"031dfa125bb1d2bbe003521fa53fdce60a14d2e5f0220628ca04ea7016e53c40\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67\""
Sep 12 17:24:02.056140 containerd[1886]: time="2025-09-12T17:24:02.056107086Z" level=info msg="StartContainer for \"5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67\""
Sep 12 17:24:02.058024 containerd[1886]: time="2025-09-12T17:24:02.057996678Z" level=info msg="connecting to shim 5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67" address="unix:///run/containerd/s/3590e8a22d239fd08e8fdce5b83814aadfbf6bfd7f48fcf4556929b881fccb80" protocol=ttrpc version=3
Sep 12 17:24:02.075254 systemd[1]: Started cri-containerd-5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67.scope - libcontainer container 5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67.
Sep 12 17:24:02.111479 containerd[1886]: time="2025-09-12T17:24:02.111446032Z" level=info msg="StartContainer for \"5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67\" returns successfully"
Sep 12 17:24:02.441395 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
Sep 12 17:24:02.441512 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved.
Sep 12 17:24:02.610361 kubelet[3450]: I0912 17:24:02.610335 3450 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22071108-d5a7-42cd-8910-06a64e029d36-whisker-ca-bundle\") pod \"22071108-d5a7-42cd-8910-06a64e029d36\" (UID: \"22071108-d5a7-42cd-8910-06a64e029d36\") "
Sep 12 17:24:02.611773 kubelet[3450]: I0912 17:24:02.611449 3450 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddgjw\" (UniqueName: \"kubernetes.io/projected/22071108-d5a7-42cd-8910-06a64e029d36-kube-api-access-ddgjw\") pod \"22071108-d5a7-42cd-8910-06a64e029d36\" (UID: \"22071108-d5a7-42cd-8910-06a64e029d36\") "
Sep 12 17:24:02.611773 kubelet[3450]: I0912 17:24:02.611508 3450 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/22071108-d5a7-42cd-8910-06a64e029d36-whisker-backend-key-pair\") pod \"22071108-d5a7-42cd-8910-06a64e029d36\" (UID: \"22071108-d5a7-42cd-8910-06a64e029d36\") "
Sep 12 17:24:02.615774 kubelet[3450]: I0912 17:24:02.615581 3450 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22071108-d5a7-42cd-8910-06a64e029d36-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "22071108-d5a7-42cd-8910-06a64e029d36" (UID: "22071108-d5a7-42cd-8910-06a64e029d36"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Sep 12 17:24:02.618254 kubelet[3450]: I0912 17:24:02.618221 3450 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22071108-d5a7-42cd-8910-06a64e029d36-kube-api-access-ddgjw" (OuterVolumeSpecName: "kube-api-access-ddgjw") pod "22071108-d5a7-42cd-8910-06a64e029d36" (UID: "22071108-d5a7-42cd-8910-06a64e029d36"). InnerVolumeSpecName "kube-api-access-ddgjw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Sep 12 17:24:02.618420 kubelet[3450]: I0912 17:24:02.618329 3450 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22071108-d5a7-42cd-8910-06a64e029d36-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "22071108-d5a7-42cd-8910-06a64e029d36" (UID: "22071108-d5a7-42cd-8910-06a64e029d36"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Sep 12 17:24:02.626002 systemd[1]: var-lib-kubelet-pods-22071108\x2dd5a7\x2d42cd\x2d8910\x2d06a64e029d36-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dddgjw.mount: Deactivated successfully.
Sep 12 17:24:02.626215 systemd[1]: var-lib-kubelet-pods-22071108\x2dd5a7\x2d42cd\x2d8910\x2d06a64e029d36-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully.
Sep 12 17:24:02.712187 kubelet[3450]: I0912 17:24:02.712106 3450 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/22071108-d5a7-42cd-8910-06a64e029d36-whisker-backend-key-pair\") on node \"ci-4426.1.0-a-1fe763f55e\" DevicePath \"\""
Sep 12 17:24:02.712187 kubelet[3450]: I0912 17:24:02.712161 3450 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22071108-d5a7-42cd-8910-06a64e029d36-whisker-ca-bundle\") on node \"ci-4426.1.0-a-1fe763f55e\" DevicePath \"\""
Sep 12 17:24:02.712187 kubelet[3450]: I0912 17:24:02.712171 3450 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ddgjw\" (UniqueName: \"kubernetes.io/projected/22071108-d5a7-42cd-8910-06a64e029d36-kube-api-access-ddgjw\") on node \"ci-4426.1.0-a-1fe763f55e\" DevicePath \"\""
Sep 12 17:24:02.747440 systemd[1]: Removed slice kubepods-besteffort-pod22071108_d5a7_42cd_8910_06a64e029d36.slice - libcontainer container kubepods-besteffort-pod22071108_d5a7_42cd_8910_06a64e029d36.slice.
Sep 12 17:24:02.757298 kubelet[3450]: I0912 17:24:02.757236 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-6t99w" podStartSLOduration=1.6021485549999999 podStartE2EDuration="15.757225792s" podCreationTimestamp="2025-09-12 17:23:47 +0000 UTC" firstStartedPulling="2025-09-12 17:23:47.822804814 +0000 UTC m=+18.593551940" lastFinishedPulling="2025-09-12 17:24:01.977882051 +0000 UTC m=+32.748629177" observedRunningTime="2025-09-12 17:24:02.756687482 +0000 UTC m=+33.527434616" watchObservedRunningTime="2025-09-12 17:24:02.757225792 +0000 UTC m=+33.527972918"
Sep 12 17:24:02.835538 systemd[1]: Created slice kubepods-besteffort-podd0b9063d_b5f3_41c2_9b40_f3b2478277d4.slice - libcontainer container kubepods-besteffort-podd0b9063d_b5f3_41c2_9b40_f3b2478277d4.slice.
Sep 12 17:24:02.913122 kubelet[3450]: I0912 17:24:02.913084 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swgl5\" (UniqueName: \"kubernetes.io/projected/d0b9063d-b5f3-41c2-9b40-f3b2478277d4-kube-api-access-swgl5\") pod \"whisker-8cfdc47df-fwfd5\" (UID: \"d0b9063d-b5f3-41c2-9b40-f3b2478277d4\") " pod="calico-system/whisker-8cfdc47df-fwfd5"
Sep 12 17:24:02.913258 kubelet[3450]: I0912 17:24:02.913153 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d0b9063d-b5f3-41c2-9b40-f3b2478277d4-whisker-backend-key-pair\") pod \"whisker-8cfdc47df-fwfd5\" (UID: \"d0b9063d-b5f3-41c2-9b40-f3b2478277d4\") " pod="calico-system/whisker-8cfdc47df-fwfd5"
Sep 12 17:24:02.913258 kubelet[3450]: I0912 17:24:02.913173 3450 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0b9063d-b5f3-41c2-9b40-f3b2478277d4-whisker-ca-bundle\") pod \"whisker-8cfdc47df-fwfd5\" (UID: \"d0b9063d-b5f3-41c2-9b40-f3b2478277d4\") " pod="calico-system/whisker-8cfdc47df-fwfd5"
Sep 12 17:24:03.138995 containerd[1886]: time="2025-09-12T17:24:03.138720964Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-8cfdc47df-fwfd5,Uid:d0b9063d-b5f3-41c2-9b40-f3b2478277d4,Namespace:calico-system,Attempt:0,}"
Sep 12 17:24:03.259657 systemd-networkd[1697]: calid61632939e4: Link UP
Sep 12 17:24:03.259815 systemd-networkd[1697]: calid61632939e4: Gained carrier
Sep 12 17:24:03.278714 containerd[1886]: 2025-09-12 17:24:03.163 [INFO][4497] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Sep 12 17:24:03.278714 containerd[1886]: 2025-09-12 17:24:03.188 [INFO][4497] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-eth0 whisker-8cfdc47df- calico-system d0b9063d-b5f3-41c2-9b40-f3b2478277d4 877 0 2025-09-12 17:24:02 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:8cfdc47df projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4426.1.0-a-1fe763f55e whisker-8cfdc47df-fwfd5 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calid61632939e4 [] [] }} ContainerID="fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" Namespace="calico-system" Pod="whisker-8cfdc47df-fwfd5" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-"
Sep 12 17:24:03.278714 containerd[1886]: 2025-09-12 17:24:03.188 [INFO][4497] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" Namespace="calico-system" Pod="whisker-8cfdc47df-fwfd5" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-eth0"
Sep 12 17:24:03.278714 containerd[1886]: 2025-09-12 17:24:03.205 [INFO][4509] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" HandleID="k8s-pod-network.fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" Workload="ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-eth0"
Sep 12 17:24:03.278925 containerd[1886]: 2025-09-12 17:24:03.207 [INFO][4509] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" HandleID="k8s-pod-network.fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" Workload="ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b610), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-1fe763f55e", "pod":"whisker-8cfdc47df-fwfd5", "timestamp":"2025-09-12 17:24:03.204986626 +0000 UTC"}, Hostname:"ci-4426.1.0-a-1fe763f55e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 12 17:24:03.278925 containerd[1886]: 2025-09-12 17:24:03.207 [INFO][4509] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 12 17:24:03.278925 containerd[1886]: 2025-09-12 17:24:03.207 [INFO][4509] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 12 17:24:03.278925 containerd[1886]: 2025-09-12 17:24:03.207 [INFO][4509] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-1fe763f55e'
Sep 12 17:24:03.278925 containerd[1886]: 2025-09-12 17:24:03.213 [INFO][4509] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:03.278925 containerd[1886]: 2025-09-12 17:24:03.217 [INFO][4509] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:03.278925 containerd[1886]: 2025-09-12 17:24:03.220 [INFO][4509] ipam/ipam.go 511: Trying affinity for 192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:03.278925 containerd[1886]: 2025-09-12 17:24:03.222 [INFO][4509] ipam/ipam.go 158: Attempting to load block cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:03.278925 containerd[1886]: 2025-09-12 17:24:03.223 [INFO][4509] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:03.279303 containerd[1886]: 2025-09-12 17:24:03.223 [INFO][4509] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.97.64/26 handle="k8s-pod-network.fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:03.279303 containerd[1886]: 2025-09-12 17:24:03.225 [INFO][4509] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599
Sep 12 17:24:03.279303 containerd[1886]: 2025-09-12 17:24:03.229 [INFO][4509] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.97.64/26 handle="k8s-pod-network.fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:03.279303 containerd[1886]: 2025-09-12 17:24:03.237 [INFO][4509] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.97.65/26] block=192.168.97.64/26 handle="k8s-pod-network.fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:03.279303 containerd[1886]: 2025-09-12 17:24:03.237 [INFO][4509] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.97.65/26] handle="k8s-pod-network.fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:03.279303 containerd[1886]: 2025-09-12 17:24:03.237 [INFO][4509] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 12 17:24:03.279303 containerd[1886]: 2025-09-12 17:24:03.237 [INFO][4509] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.97.65/26] IPv6=[] ContainerID="fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" HandleID="k8s-pod-network.fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" Workload="ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-eth0"
Sep 12 17:24:03.279421 containerd[1886]: 2025-09-12 17:24:03.239 [INFO][4497] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" Namespace="calico-system" Pod="whisker-8cfdc47df-fwfd5" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-eth0", GenerateName:"whisker-8cfdc47df-", Namespace:"calico-system", SelfLink:"", UID:"d0b9063d-b5f3-41c2-9b40-f3b2478277d4", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"8cfdc47df", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"", Pod:"whisker-8cfdc47df-fwfd5", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.97.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid61632939e4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 12 17:24:03.279421 containerd[1886]: 2025-09-12 17:24:03.239 [INFO][4497] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.65/32] ContainerID="fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" Namespace="calico-system" Pod="whisker-8cfdc47df-fwfd5" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-eth0"
Sep 12 17:24:03.279488 containerd[1886]: 2025-09-12 17:24:03.240 [INFO][4497] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid61632939e4 ContainerID="fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" Namespace="calico-system" Pod="whisker-8cfdc47df-fwfd5" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-eth0"
Sep 12 17:24:03.279488 containerd[1886]: 2025-09-12 17:24:03.259 [INFO][4497] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" Namespace="calico-system" Pod="whisker-8cfdc47df-fwfd5" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-eth0"
Sep 12 17:24:03.279518 containerd[1886]: 2025-09-12 17:24:03.259 [INFO][4497] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" Namespace="calico-system" Pod="whisker-8cfdc47df-fwfd5" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-eth0", GenerateName:"whisker-8cfdc47df-", Namespace:"calico-system", SelfLink:"", UID:"d0b9063d-b5f3-41c2-9b40-f3b2478277d4", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 24, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"8cfdc47df", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599", Pod:"whisker-8cfdc47df-fwfd5", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.97.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid61632939e4", MAC:"1e:43:fb:ab:a2:7d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 12 17:24:03.279581 containerd[1886]: 2025-09-12 17:24:03.276 [INFO][4497] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" Namespace="calico-system" Pod="whisker-8cfdc47df-fwfd5" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-whisker--8cfdc47df--fwfd5-eth0"
Sep 12 17:24:03.324479 containerd[1886]: time="2025-09-12T17:24:03.324204806Z" level=info msg="connecting to shim fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599" address="unix:///run/containerd/s/253630d90ff0d4dbacaf458384933e828db06334a8737931f063063026fa6a49" namespace=k8s.io protocol=ttrpc version=3
Sep 12 17:24:03.340238 systemd[1]: Started cri-containerd-fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599.scope - libcontainer container fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599.
Sep 12 17:24:03.367662 containerd[1886]: time="2025-09-12T17:24:03.367563746Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-8cfdc47df-fwfd5,Uid:d0b9063d-b5f3-41c2-9b40-f3b2478277d4,Namespace:calico-system,Attempt:0,} returns sandbox id \"fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599\""
Sep 12 17:24:03.368933 containerd[1886]: time="2025-09-12T17:24:03.368916260Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\""
Sep 12 17:24:03.644516 kubelet[3450]: I0912 17:24:03.644329 3450 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22071108-d5a7-42cd-8910-06a64e029d36" path="/var/lib/kubelet/pods/22071108-d5a7-42cd-8910-06a64e029d36/volumes"
Sep 12 17:24:04.190973 systemd-networkd[1697]: vxlan.calico: Link UP
Sep 12 17:24:04.190979 systemd-networkd[1697]: vxlan.calico: Gained carrier
Sep 12 17:24:04.417809 systemd-networkd[1697]: calid61632939e4: Gained IPv6LL
Sep 12 17:24:04.828552 containerd[1886]: time="2025-09-12T17:24:04.828504334Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:04.832089 containerd[1886]: time="2025-09-12T17:24:04.832056871Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606"
Sep 12 17:24:04.840785 containerd[1886]: time="2025-09-12T17:24:04.840739178Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:04.846013 containerd[1886]: time="2025-09-12T17:24:04.845539579Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:04.846013 containerd[1886]: time="2025-09-12T17:24:04.845888907Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.476865876s"
Sep 12 17:24:04.846013 containerd[1886]: time="2025-09-12T17:24:04.845913708Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\""
Sep 12 17:24:04.853967 containerd[1886]: time="2025-09-12T17:24:04.853931238Z" level=info msg="CreateContainer within sandbox \"fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599\" for container &ContainerMetadata{Name:whisker,Attempt:0,}"
Sep 12 17:24:04.884941 containerd[1886]: time="2025-09-12T17:24:04.884893642Z" level=info msg="Container afe0cf0cb19ae330804d18f6b51712a01685ac623742e87cae1e1bccb125a338: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:24:04.888641 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1988193291.mount: Deactivated successfully.
Sep 12 17:24:04.912095 containerd[1886]: time="2025-09-12T17:24:04.912062871Z" level=info msg="CreateContainer within sandbox \"fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"afe0cf0cb19ae330804d18f6b51712a01685ac623742e87cae1e1bccb125a338\""
Sep 12 17:24:04.912419 containerd[1886]: time="2025-09-12T17:24:04.912380895Z" level=info msg="StartContainer for \"afe0cf0cb19ae330804d18f6b51712a01685ac623742e87cae1e1bccb125a338\""
Sep 12 17:24:04.913113 containerd[1886]: time="2025-09-12T17:24:04.913090433Z" level=info msg="connecting to shim afe0cf0cb19ae330804d18f6b51712a01685ac623742e87cae1e1bccb125a338" address="unix:///run/containerd/s/253630d90ff0d4dbacaf458384933e828db06334a8737931f063063026fa6a49" protocol=ttrpc version=3
Sep 12 17:24:04.931241 systemd[1]: Started cri-containerd-afe0cf0cb19ae330804d18f6b51712a01685ac623742e87cae1e1bccb125a338.scope - libcontainer container afe0cf0cb19ae330804d18f6b51712a01685ac623742e87cae1e1bccb125a338.
Sep 12 17:24:04.970046 containerd[1886]: time="2025-09-12T17:24:04.970022763Z" level=info msg="StartContainer for \"afe0cf0cb19ae330804d18f6b51712a01685ac623742e87cae1e1bccb125a338\" returns successfully"
Sep 12 17:24:04.971944 containerd[1886]: time="2025-09-12T17:24:04.971472024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\""
Sep 12 17:24:05.377267 systemd-networkd[1697]: vxlan.calico: Gained IPv6LL
Sep 12 17:24:06.667959 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3439725957.mount: Deactivated successfully.
Sep 12 17:24:07.211095 containerd[1886]: time="2025-09-12T17:24:07.211039365Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:07.216908 containerd[1886]: time="2025-09-12T17:24:07.216765104Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700"
Sep 12 17:24:07.221880 containerd[1886]: time="2025-09-12T17:24:07.221852505Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:07.228137 containerd[1886]: time="2025-09-12T17:24:07.228104997Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:07.228589 containerd[1886]: time="2025-09-12T17:24:07.228561357Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 2.256507151s"
Sep 12 17:24:07.228620 containerd[1886]: time="2025-09-12T17:24:07.228591030Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\""
Sep 12 17:24:07.236837 containerd[1886]: time="2025-09-12T17:24:07.236814620Z" level=info msg="CreateContainer within sandbox \"fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}"
Sep 12 17:24:07.266218 containerd[1886]: time="2025-09-12T17:24:07.266189658Z" level=info msg="Container 545984eaad240c8631cba4177976de5713ffc43b7b1cc1151944b402ddbb3d87: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:24:07.292323 containerd[1886]: time="2025-09-12T17:24:07.292293598Z" level=info msg="CreateContainer within sandbox \"fe315ddec6d895e52d13fba63f4ddc9cf55d88f71a6458ec84f35e67a8413599\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"545984eaad240c8631cba4177976de5713ffc43b7b1cc1151944b402ddbb3d87\""
Sep 12 17:24:07.292827 containerd[1886]: time="2025-09-12T17:24:07.292746230Z" level=info msg="StartContainer for \"545984eaad240c8631cba4177976de5713ffc43b7b1cc1151944b402ddbb3d87\""
Sep 12 17:24:07.293690 containerd[1886]: time="2025-09-12T17:24:07.293663414Z" level=info msg="connecting to shim 545984eaad240c8631cba4177976de5713ffc43b7b1cc1151944b402ddbb3d87" address="unix:///run/containerd/s/253630d90ff0d4dbacaf458384933e828db06334a8737931f063063026fa6a49" protocol=ttrpc version=3
Sep 12 17:24:07.319249 systemd[1]: Started cri-containerd-545984eaad240c8631cba4177976de5713ffc43b7b1cc1151944b402ddbb3d87.scope - libcontainer container 545984eaad240c8631cba4177976de5713ffc43b7b1cc1151944b402ddbb3d87.
Sep 12 17:24:07.352328 containerd[1886]: time="2025-09-12T17:24:07.352297311Z" level=info msg="StartContainer for \"545984eaad240c8631cba4177976de5713ffc43b7b1cc1151944b402ddbb3d87\" returns successfully"
Sep 12 17:24:08.643085 containerd[1886]: time="2025-09-12T17:24:08.643047428Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b99b7b99b-58b54,Uid:5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152,Namespace:calico-apiserver,Attempt:0,}"
Sep 12 17:24:08.771216 systemd-networkd[1697]: calib54c6bbb382: Link UP
Sep 12 17:24:08.771838 systemd-networkd[1697]: calib54c6bbb382: Gained carrier
Sep 12 17:24:08.785543 kubelet[3450]: I0912 17:24:08.784445 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-8cfdc47df-fwfd5" podStartSLOduration=2.92359388 podStartE2EDuration="6.784428008s" podCreationTimestamp="2025-09-12 17:24:02 +0000 UTC" firstStartedPulling="2025-09-12 17:24:03.368535883 +0000 UTC m=+34.139283009" lastFinishedPulling="2025-09-12 17:24:07.229370003 +0000 UTC m=+38.000117137" observedRunningTime="2025-09-12 17:24:07.774926079 +0000 UTC m=+38.545673213" watchObservedRunningTime="2025-09-12 17:24:08.784428008 +0000 UTC m=+39.555175134"
Sep 12 17:24:08.786242 containerd[1886]: 2025-09-12 17:24:08.716 [INFO][4845] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-eth0 calico-apiserver-6b99b7b99b- calico-apiserver 5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152 807 0 2025-09-12 17:23:44 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b99b7b99b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.1.0-a-1fe763f55e calico-apiserver-6b99b7b99b-58b54 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib54c6bbb382 [] [] }} ContainerID="a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-58b54" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-"
Sep 12 17:24:08.786242 containerd[1886]: 2025-09-12 17:24:08.716 [INFO][4845] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-58b54" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-eth0"
Sep 12 17:24:08.786242 containerd[1886]: 2025-09-12 17:24:08.737 [INFO][4857] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" HandleID="k8s-pod-network.a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" Workload="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-eth0"
Sep 12 17:24:08.786361 containerd[1886]: 2025-09-12 17:24:08.737 [INFO][4857] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" HandleID="k8s-pod-network.a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" Workload="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002aa150), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.1.0-a-1fe763f55e", "pod":"calico-apiserver-6b99b7b99b-58b54", "timestamp":"2025-09-12 17:24:08.737546307 +0000 UTC"}, Hostname:"ci-4426.1.0-a-1fe763f55e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 12 17:24:08.786361 containerd[1886]: 2025-09-12 17:24:08.737 [INFO][4857] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 12 17:24:08.786361 containerd[1886]: 2025-09-12 17:24:08.737 [INFO][4857] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 12 17:24:08.786361 containerd[1886]: 2025-09-12 17:24:08.737 [INFO][4857] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-1fe763f55e'
Sep 12 17:24:08.786361 containerd[1886]: 2025-09-12 17:24:08.743 [INFO][4857] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:08.786361 containerd[1886]: 2025-09-12 17:24:08.747 [INFO][4857] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:08.786361 containerd[1886]: 2025-09-12 17:24:08.750 [INFO][4857] ipam/ipam.go 511: Trying affinity for 192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:08.786361 containerd[1886]: 2025-09-12 17:24:08.752 [INFO][4857] ipam/ipam.go 158: Attempting to load block cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:08.786361 containerd[1886]: 2025-09-12 17:24:08.754 [INFO][4857] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:08.786495 containerd[1886]: 2025-09-12 17:24:08.754 [INFO][4857] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.97.64/26 handle="k8s-pod-network.a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:08.786495 containerd[1886]: 2025-09-12 17:24:08.755 [INFO][4857] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691
Sep 12 17:24:08.786495 containerd[1886]: 2025-09-12 17:24:08.759 [INFO][4857] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.97.64/26 handle="k8s-pod-network.a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:08.786495 containerd[1886]: 2025-09-12 17:24:08.767 [INFO][4857] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.97.66/26] block=192.168.97.64/26 handle="k8s-pod-network.a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:08.786495 containerd[1886]: 2025-09-12 17:24:08.767 [INFO][4857] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.97.66/26] handle="k8s-pod-network.a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" host="ci-4426.1.0-a-1fe763f55e"
Sep 12 17:24:08.786495 containerd[1886]: 2025-09-12 17:24:08.767 [INFO][4857] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 12 17:24:08.786495 containerd[1886]: 2025-09-12 17:24:08.767 [INFO][4857] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.97.66/26] IPv6=[] ContainerID="a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" HandleID="k8s-pod-network.a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" Workload="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-eth0"
Sep 12 17:24:08.786589 containerd[1886]: 2025-09-12 17:24:08.768 [INFO][4845] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-58b54" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-eth0", GenerateName:"calico-apiserver-6b99b7b99b-", Namespace:"calico-apiserver", SelfLink:"", UID:"5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152", ResourceVersion:"807",
Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b99b7b99b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"", Pod:"calico-apiserver-6b99b7b99b-58b54", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.97.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib54c6bbb382", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:08.786660 containerd[1886]: 2025-09-12 17:24:08.768 [INFO][4845] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.66/32] ContainerID="a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-58b54" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-eth0" Sep 12 17:24:08.786660 containerd[1886]: 2025-09-12 17:24:08.768 [INFO][4845] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib54c6bbb382 ContainerID="a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-58b54" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-eth0" Sep 12 17:24:08.786660 containerd[1886]: 2025-09-12 
17:24:08.772 [INFO][4845] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-58b54" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-eth0" Sep 12 17:24:08.786722 containerd[1886]: 2025-09-12 17:24:08.772 [INFO][4845] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-58b54" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-eth0", GenerateName:"calico-apiserver-6b99b7b99b-", Namespace:"calico-apiserver", SelfLink:"", UID:"5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b99b7b99b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691", Pod:"calico-apiserver-6b99b7b99b-58b54", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.97.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib54c6bbb382", MAC:"f2:9f:96:4e:b8:6b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:08.786757 containerd[1886]: 2025-09-12 17:24:08.782 [INFO][4845] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-58b54" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--58b54-eth0" Sep 12 17:24:08.839531 containerd[1886]: time="2025-09-12T17:24:08.839499715Z" level=info msg="connecting to shim a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691" address="unix:///run/containerd/s/4cc98fc36bb219c40bef515c6894f783a00e5fca5d819f0c4d09905b3dbdeb1c" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:08.864253 systemd[1]: Started cri-containerd-a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691.scope - libcontainer container a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691. 
Sep 12 17:24:09.643667 containerd[1886]: time="2025-09-12T17:24:09.643623371Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-m5hn4,Uid:22a9e9f7-3d62-4a87-b321-c5550b85dec3,Namespace:calico-system,Attempt:0,}" Sep 12 17:24:09.644330 containerd[1886]: time="2025-09-12T17:24:09.644219502Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b99b7b99b-gplt4,Uid:7a27628a-082d-4c3d-904c-c73dd3edd18e,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:24:10.642960 containerd[1886]: time="2025-09-12T17:24:10.642782649Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-996756887-nfmh2,Uid:012d6e27-14a7-44d7-b5f3-972242703ab4,Namespace:calico-system,Attempt:0,}" Sep 12 17:24:10.689310 systemd-networkd[1697]: calib54c6bbb382: Gained IPv6LL Sep 12 17:24:11.193536 containerd[1886]: time="2025-09-12T17:24:11.193450097Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b99b7b99b-58b54,Uid:5c1c8cbf-c9bc-4bb5-9f61-27755b4a4152,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691\"" Sep 12 17:24:11.195519 containerd[1886]: time="2025-09-12T17:24:11.195403272Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 17:24:11.526999 systemd-networkd[1697]: cali6549a118439: Link UP Sep 12 17:24:11.527993 systemd-networkd[1697]: cali6549a118439: Gained carrier Sep 12 17:24:11.549949 containerd[1886]: 2025-09-12 17:24:11.471 [INFO][4928] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-eth0 goldmane-54d579b49d- calico-system 22a9e9f7-3d62-4a87-b321-c5550b85dec3 809 0 2025-09-12 17:23:47 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4426.1.0-a-1fe763f55e goldmane-54d579b49d-m5hn4 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali6549a118439 [] [] }} ContainerID="b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" Namespace="calico-system" Pod="goldmane-54d579b49d-m5hn4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-" Sep 12 17:24:11.549949 containerd[1886]: 2025-09-12 17:24:11.471 [INFO][4928] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" Namespace="calico-system" Pod="goldmane-54d579b49d-m5hn4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-eth0" Sep 12 17:24:11.549949 containerd[1886]: 2025-09-12 17:24:11.488 [INFO][4942] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" HandleID="k8s-pod-network.b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" Workload="ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-eth0" Sep 12 17:24:11.550092 containerd[1886]: 2025-09-12 17:24:11.490 [INFO][4942] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" HandleID="k8s-pod-network.b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" Workload="ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024afc0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-1fe763f55e", "pod":"goldmane-54d579b49d-m5hn4", "timestamp":"2025-09-12 17:24:11.488761431 +0000 UTC"}, Hostname:"ci-4426.1.0-a-1fe763f55e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:24:11.550092 containerd[1886]: 2025-09-12 17:24:11.494 [INFO][4942] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:24:11.550092 containerd[1886]: 2025-09-12 17:24:11.494 [INFO][4942] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:24:11.550092 containerd[1886]: 2025-09-12 17:24:11.494 [INFO][4942] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-1fe763f55e' Sep 12 17:24:11.550092 containerd[1886]: 2025-09-12 17:24:11.499 [INFO][4942] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.550092 containerd[1886]: 2025-09-12 17:24:11.504 [INFO][4942] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.550092 containerd[1886]: 2025-09-12 17:24:11.507 [INFO][4942] ipam/ipam.go 511: Trying affinity for 192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.550092 containerd[1886]: 2025-09-12 17:24:11.508 [INFO][4942] ipam/ipam.go 158: Attempting to load block cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.550092 containerd[1886]: 2025-09-12 17:24:11.510 [INFO][4942] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.550502 containerd[1886]: 2025-09-12 17:24:11.510 [INFO][4942] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.97.64/26 handle="k8s-pod-network.b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.550502 containerd[1886]: 2025-09-12 17:24:11.511 [INFO][4942] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286 Sep 12 17:24:11.550502 
containerd[1886]: 2025-09-12 17:24:11.516 [INFO][4942] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.97.64/26 handle="k8s-pod-network.b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.550502 containerd[1886]: 2025-09-12 17:24:11.521 [INFO][4942] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.97.67/26] block=192.168.97.64/26 handle="k8s-pod-network.b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.550502 containerd[1886]: 2025-09-12 17:24:11.521 [INFO][4942] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.97.67/26] handle="k8s-pod-network.b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.550502 containerd[1886]: 2025-09-12 17:24:11.521 [INFO][4942] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:24:11.550502 containerd[1886]: 2025-09-12 17:24:11.521 [INFO][4942] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.97.67/26] IPv6=[] ContainerID="b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" HandleID="k8s-pod-network.b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" Workload="ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-eth0" Sep 12 17:24:11.550599 containerd[1886]: 2025-09-12 17:24:11.523 [INFO][4928] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" Namespace="calico-system" Pod="goldmane-54d579b49d-m5hn4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", 
UID:"22a9e9f7-3d62-4a87-b321-c5550b85dec3", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"", Pod:"goldmane-54d579b49d-m5hn4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.97.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali6549a118439", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:11.550640 containerd[1886]: 2025-09-12 17:24:11.523 [INFO][4928] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.67/32] ContainerID="b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" Namespace="calico-system" Pod="goldmane-54d579b49d-m5hn4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-eth0" Sep 12 17:24:11.550640 containerd[1886]: 2025-09-12 17:24:11.523 [INFO][4928] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6549a118439 ContainerID="b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" Namespace="calico-system" Pod="goldmane-54d579b49d-m5hn4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-eth0" Sep 12 17:24:11.550640 containerd[1886]: 2025-09-12 17:24:11.528 [INFO][4928] cni-plugin/dataplane_linux.go 
508: Disabling IPv4 forwarding ContainerID="b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" Namespace="calico-system" Pod="goldmane-54d579b49d-m5hn4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-eth0" Sep 12 17:24:11.550684 containerd[1886]: 2025-09-12 17:24:11.529 [INFO][4928] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" Namespace="calico-system" Pod="goldmane-54d579b49d-m5hn4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"22a9e9f7-3d62-4a87-b321-c5550b85dec3", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286", Pod:"goldmane-54d579b49d-m5hn4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.97.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, 
InterfaceName:"cali6549a118439", MAC:"e6:49:09:79:7c:6d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:11.550717 containerd[1886]: 2025-09-12 17:24:11.546 [INFO][4928] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" Namespace="calico-system" Pod="goldmane-54d579b49d-m5hn4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-goldmane--54d579b49d--m5hn4-eth0" Sep 12 17:24:11.643856 containerd[1886]: time="2025-09-12T17:24:11.643645576Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-q58t7,Uid:ccd2bbe7-dc74-41c7-8369-074eb02b14c4,Namespace:kube-system,Attempt:0,}" Sep 12 17:24:11.643856 containerd[1886]: time="2025-09-12T17:24:11.643650592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-l8q56,Uid:9a5be5b5-06ad-4760-958c-44bbde584588,Namespace:kube-system,Attempt:0,}" Sep 12 17:24:11.661740 systemd-networkd[1697]: cali4db9d496e59: Link UP Sep 12 17:24:11.662399 systemd-networkd[1697]: cali4db9d496e59: Gained carrier Sep 12 17:24:11.682023 containerd[1886]: 2025-09-12 17:24:11.572 [INFO][4950] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-eth0 calico-apiserver-6b99b7b99b- calico-apiserver 7a27628a-082d-4c3d-904c-c73dd3edd18e 811 0 2025-09-12 17:23:44 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b99b7b99b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426.1.0-a-1fe763f55e calico-apiserver-6b99b7b99b-gplt4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali4db9d496e59 [] [] }} 
ContainerID="76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-gplt4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-" Sep 12 17:24:11.682023 containerd[1886]: 2025-09-12 17:24:11.572 [INFO][4950] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-gplt4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-eth0" Sep 12 17:24:11.682023 containerd[1886]: 2025-09-12 17:24:11.588 [INFO][4972] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" HandleID="k8s-pod-network.76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" Workload="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-eth0" Sep 12 17:24:11.682360 containerd[1886]: 2025-09-12 17:24:11.589 [INFO][4972] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" HandleID="k8s-pod-network.76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" Workload="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b1a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426.1.0-a-1fe763f55e", "pod":"calico-apiserver-6b99b7b99b-gplt4", "timestamp":"2025-09-12 17:24:11.58893685 +0000 UTC"}, Hostname:"ci-4426.1.0-a-1fe763f55e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:24:11.682360 containerd[1886]: 2025-09-12 17:24:11.589 [INFO][4972] ipam/ipam_plugin.go 
353: About to acquire host-wide IPAM lock. Sep 12 17:24:11.682360 containerd[1886]: 2025-09-12 17:24:11.589 [INFO][4972] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:24:11.682360 containerd[1886]: 2025-09-12 17:24:11.589 [INFO][4972] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-1fe763f55e' Sep 12 17:24:11.682360 containerd[1886]: 2025-09-12 17:24:11.600 [INFO][4972] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.682360 containerd[1886]: 2025-09-12 17:24:11.607 [INFO][4972] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.682360 containerd[1886]: 2025-09-12 17:24:11.616 [INFO][4972] ipam/ipam.go 511: Trying affinity for 192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.682360 containerd[1886]: 2025-09-12 17:24:11.617 [INFO][4972] ipam/ipam.go 158: Attempting to load block cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.682360 containerd[1886]: 2025-09-12 17:24:11.620 [INFO][4972] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.682527 containerd[1886]: 2025-09-12 17:24:11.620 [INFO][4972] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.97.64/26 handle="k8s-pod-network.76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.682527 containerd[1886]: 2025-09-12 17:24:11.621 [INFO][4972] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a Sep 12 17:24:11.682527 containerd[1886]: 2025-09-12 17:24:11.629 [INFO][4972] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.97.64/26 
handle="k8s-pod-network.76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.682527 containerd[1886]: 2025-09-12 17:24:11.654 [INFO][4972] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.97.68/26] block=192.168.97.64/26 handle="k8s-pod-network.76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.682527 containerd[1886]: 2025-09-12 17:24:11.654 [INFO][4972] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.97.68/26] handle="k8s-pod-network.76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.682527 containerd[1886]: 2025-09-12 17:24:11.654 [INFO][4972] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:24:11.682527 containerd[1886]: 2025-09-12 17:24:11.654 [INFO][4972] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.97.68/26] IPv6=[] ContainerID="76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" HandleID="k8s-pod-network.76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" Workload="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-eth0" Sep 12 17:24:11.682892 containerd[1886]: 2025-09-12 17:24:11.655 [INFO][4950] cni-plugin/k8s.go 418: Populated endpoint ContainerID="76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-gplt4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-eth0", GenerateName:"calico-apiserver-6b99b7b99b-", Namespace:"calico-apiserver", SelfLink:"", UID:"7a27628a-082d-4c3d-904c-c73dd3edd18e", ResourceVersion:"811", Generation:0, 
CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b99b7b99b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"", Pod:"calico-apiserver-6b99b7b99b-gplt4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.97.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4db9d496e59", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:11.683598 containerd[1886]: 2025-09-12 17:24:11.656 [INFO][4950] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.68/32] ContainerID="76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-gplt4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-eth0" Sep 12 17:24:11.683598 containerd[1886]: 2025-09-12 17:24:11.656 [INFO][4950] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4db9d496e59 ContainerID="76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-gplt4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-eth0" Sep 12 17:24:11.683598 containerd[1886]: 2025-09-12 17:24:11.662 
[INFO][4950] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-gplt4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-eth0" Sep 12 17:24:11.683668 containerd[1886]: 2025-09-12 17:24:11.664 [INFO][4950] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-gplt4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-eth0", GenerateName:"calico-apiserver-6b99b7b99b-", Namespace:"calico-apiserver", SelfLink:"", UID:"7a27628a-082d-4c3d-904c-c73dd3edd18e", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b99b7b99b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a", Pod:"calico-apiserver-6b99b7b99b-gplt4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.97.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4db9d496e59", MAC:"ea:f3:72:49:7f:cc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:11.683711 containerd[1886]: 2025-09-12 17:24:11.679 [INFO][4950] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" Namespace="calico-apiserver" Pod="calico-apiserver-6b99b7b99b-gplt4" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--apiserver--6b99b7b99b--gplt4-eth0" Sep 12 17:24:11.740092 systemd-networkd[1697]: cali6010c20b0ba: Link UP Sep 12 17:24:11.742086 systemd-networkd[1697]: cali6010c20b0ba: Gained carrier Sep 12 17:24:11.767446 containerd[1886]: 2025-09-12 17:24:11.620 [INFO][4979] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-eth0 calico-kube-controllers-996756887- calico-system 012d6e27-14a7-44d7-b5f3-972242703ab4 810 0 2025-09-12 17:23:47 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:996756887 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4426.1.0-a-1fe763f55e calico-kube-controllers-996756887-nfmh2 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali6010c20b0ba [] [] }} ContainerID="96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" Namespace="calico-system" Pod="calico-kube-controllers-996756887-nfmh2" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-" Sep 12 17:24:11.767446 
containerd[1886]: 2025-09-12 17:24:11.620 [INFO][4979] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" Namespace="calico-system" Pod="calico-kube-controllers-996756887-nfmh2" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-eth0" Sep 12 17:24:11.767446 containerd[1886]: 2025-09-12 17:24:11.641 [INFO][4991] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" HandleID="k8s-pod-network.96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" Workload="ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-eth0" Sep 12 17:24:11.768375 containerd[1886]: 2025-09-12 17:24:11.641 [INFO][4991] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" HandleID="k8s-pod-network.96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" Workload="ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b180), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-1fe763f55e", "pod":"calico-kube-controllers-996756887-nfmh2", "timestamp":"2025-09-12 17:24:11.641471851 +0000 UTC"}, Hostname:"ci-4426.1.0-a-1fe763f55e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:24:11.768375 containerd[1886]: 2025-09-12 17:24:11.641 [INFO][4991] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:24:11.768375 containerd[1886]: 2025-09-12 17:24:11.654 [INFO][4991] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:24:11.768375 containerd[1886]: 2025-09-12 17:24:11.654 [INFO][4991] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-1fe763f55e' Sep 12 17:24:11.768375 containerd[1886]: 2025-09-12 17:24:11.701 [INFO][4991] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.768375 containerd[1886]: 2025-09-12 17:24:11.707 [INFO][4991] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.768375 containerd[1886]: 2025-09-12 17:24:11.712 [INFO][4991] ipam/ipam.go 511: Trying affinity for 192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.768375 containerd[1886]: 2025-09-12 17:24:11.714 [INFO][4991] ipam/ipam.go 158: Attempting to load block cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.768375 containerd[1886]: 2025-09-12 17:24:11.716 [INFO][4991] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.768722 containerd[1886]: 2025-09-12 17:24:11.716 [INFO][4991] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.97.64/26 handle="k8s-pod-network.96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.768722 containerd[1886]: 2025-09-12 17:24:11.717 [INFO][4991] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6 Sep 12 17:24:11.768722 containerd[1886]: 2025-09-12 17:24:11.723 [INFO][4991] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.97.64/26 handle="k8s-pod-network.96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.768722 containerd[1886]: 2025-09-12 17:24:11.735 [INFO][4991] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.97.69/26] block=192.168.97.64/26 handle="k8s-pod-network.96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.768722 containerd[1886]: 2025-09-12 17:24:11.735 [INFO][4991] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.97.69/26] handle="k8s-pod-network.96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:11.768722 containerd[1886]: 2025-09-12 17:24:11.735 [INFO][4991] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:24:11.768722 containerd[1886]: 2025-09-12 17:24:11.735 [INFO][4991] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.97.69/26] IPv6=[] ContainerID="96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" HandleID="k8s-pod-network.96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" Workload="ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-eth0" Sep 12 17:24:11.768827 containerd[1886]: 2025-09-12 17:24:11.737 [INFO][4979] cni-plugin/k8s.go 418: Populated endpoint ContainerID="96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" Namespace="calico-system" Pod="calico-kube-controllers-996756887-nfmh2" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-eth0", GenerateName:"calico-kube-controllers-996756887-", Namespace:"calico-system", SelfLink:"", UID:"012d6e27-14a7-44d7-b5f3-972242703ab4", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"996756887", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"", Pod:"calico-kube-controllers-996756887-nfmh2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.97.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6010c20b0ba", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:11.768869 containerd[1886]: 2025-09-12 17:24:11.737 [INFO][4979] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.69/32] ContainerID="96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" Namespace="calico-system" Pod="calico-kube-controllers-996756887-nfmh2" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-eth0" Sep 12 17:24:11.768869 containerd[1886]: 2025-09-12 17:24:11.737 [INFO][4979] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6010c20b0ba ContainerID="96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" Namespace="calico-system" Pod="calico-kube-controllers-996756887-nfmh2" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-eth0" Sep 12 17:24:11.768869 containerd[1886]: 2025-09-12 17:24:11.739 [INFO][4979] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" Namespace="calico-system" Pod="calico-kube-controllers-996756887-nfmh2" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-eth0" Sep 12 17:24:11.768943 containerd[1886]: 2025-09-12 17:24:11.741 [INFO][4979] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" Namespace="calico-system" Pod="calico-kube-controllers-996756887-nfmh2" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-eth0", GenerateName:"calico-kube-controllers-996756887-", Namespace:"calico-system", SelfLink:"", UID:"012d6e27-14a7-44d7-b5f3-972242703ab4", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"996756887", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6", Pod:"calico-kube-controllers-996756887-nfmh2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.97.69/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6010c20b0ba", MAC:"46:f6:3b:c2:e4:67", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:11.768979 containerd[1886]: 2025-09-12 17:24:11.764 [INFO][4979] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" Namespace="calico-system" Pod="calico-kube-controllers-996756887-nfmh2" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-calico--kube--controllers--996756887--nfmh2-eth0" Sep 12 17:24:12.171591 systemd-networkd[1697]: caliae03d262ba8: Link UP Sep 12 17:24:12.172828 systemd-networkd[1697]: caliae03d262ba8: Gained carrier Sep 12 17:24:12.192852 containerd[1886]: 2025-09-12 17:24:12.110 [INFO][5015] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-eth0 coredns-674b8bbfcf- kube-system ccd2bbe7-dc74-41c7-8369-074eb02b14c4 806 0 2025-09-12 17:23:35 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.1.0-a-1fe763f55e coredns-674b8bbfcf-q58t7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliae03d262ba8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" Namespace="kube-system" Pod="coredns-674b8bbfcf-q58t7" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-" Sep 12 17:24:12.192852 containerd[1886]: 2025-09-12 17:24:12.111 [INFO][5015] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" Namespace="kube-system" Pod="coredns-674b8bbfcf-q58t7" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-eth0" Sep 12 17:24:12.192852 containerd[1886]: 2025-09-12 17:24:12.127 [INFO][5027] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" HandleID="k8s-pod-network.77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" Workload="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-eth0" Sep 12 17:24:12.193193 containerd[1886]: 2025-09-12 17:24:12.127 [INFO][5027] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" HandleID="k8s-pod-network.77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" Workload="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002aa5b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.1.0-a-1fe763f55e", "pod":"coredns-674b8bbfcf-q58t7", "timestamp":"2025-09-12 17:24:12.127192801 +0000 UTC"}, Hostname:"ci-4426.1.0-a-1fe763f55e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:24:12.193193 containerd[1886]: 2025-09-12 17:24:12.127 [INFO][5027] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:24:12.193193 containerd[1886]: 2025-09-12 17:24:12.127 [INFO][5027] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:24:12.193193 containerd[1886]: 2025-09-12 17:24:12.127 [INFO][5027] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-1fe763f55e' Sep 12 17:24:12.193193 containerd[1886]: 2025-09-12 17:24:12.133 [INFO][5027] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.193193 containerd[1886]: 2025-09-12 17:24:12.136 [INFO][5027] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.193193 containerd[1886]: 2025-09-12 17:24:12.140 [INFO][5027] ipam/ipam.go 511: Trying affinity for 192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.193193 containerd[1886]: 2025-09-12 17:24:12.142 [INFO][5027] ipam/ipam.go 158: Attempting to load block cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.193193 containerd[1886]: 2025-09-12 17:24:12.145 [INFO][5027] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.194249 containerd[1886]: 2025-09-12 17:24:12.145 [INFO][5027] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.97.64/26 handle="k8s-pod-network.77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.194249 containerd[1886]: 2025-09-12 17:24:12.147 [INFO][5027] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02 Sep 12 17:24:12.194249 containerd[1886]: 2025-09-12 17:24:12.151 [INFO][5027] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.97.64/26 handle="k8s-pod-network.77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.194249 containerd[1886]: 2025-09-12 17:24:12.163 [INFO][5027] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.97.70/26] block=192.168.97.64/26 handle="k8s-pod-network.77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.194249 containerd[1886]: 2025-09-12 17:24:12.163 [INFO][5027] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.97.70/26] handle="k8s-pod-network.77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.194249 containerd[1886]: 2025-09-12 17:24:12.164 [INFO][5027] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:24:12.194249 containerd[1886]: 2025-09-12 17:24:12.164 [INFO][5027] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.97.70/26] IPv6=[] ContainerID="77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" HandleID="k8s-pod-network.77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" Workload="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-eth0" Sep 12 17:24:12.194366 containerd[1886]: 2025-09-12 17:24:12.166 [INFO][5015] cni-plugin/k8s.go 418: Populated endpoint ContainerID="77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" Namespace="kube-system" Pod="coredns-674b8bbfcf-q58t7" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"ccd2bbe7-dc74-41c7-8369-074eb02b14c4", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"", Pod:"coredns-674b8bbfcf-q58t7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.97.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliae03d262ba8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:12.194366 containerd[1886]: 2025-09-12 17:24:12.166 [INFO][5015] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.70/32] ContainerID="77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" Namespace="kube-system" Pod="coredns-674b8bbfcf-q58t7" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-eth0" Sep 12 17:24:12.194366 containerd[1886]: 2025-09-12 17:24:12.166 [INFO][5015] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliae03d262ba8 ContainerID="77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" Namespace="kube-system" Pod="coredns-674b8bbfcf-q58t7" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-eth0" Sep 12 17:24:12.194366 containerd[1886]: 2025-09-12 17:24:12.173 [INFO][5015] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" Namespace="kube-system" Pod="coredns-674b8bbfcf-q58t7" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-eth0" Sep 12 17:24:12.194366 containerd[1886]: 2025-09-12 17:24:12.175 [INFO][5015] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" Namespace="kube-system" Pod="coredns-674b8bbfcf-q58t7" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"ccd2bbe7-dc74-41c7-8369-074eb02b14c4", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02", Pod:"coredns-674b8bbfcf-q58t7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.97.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliae03d262ba8", 
MAC:"76:e2:83:1d:5e:39", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:12.194366 containerd[1886]: 2025-09-12 17:24:12.190 [INFO][5015] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" Namespace="kube-system" Pod="coredns-674b8bbfcf-q58t7" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--q58t7-eth0" Sep 12 17:24:12.271454 systemd-networkd[1697]: cali8fc9c31dbb6: Link UP Sep 12 17:24:12.272380 systemd-networkd[1697]: cali8fc9c31dbb6: Gained carrier Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.183 [INFO][5033] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-eth0 coredns-674b8bbfcf- kube-system 9a5be5b5-06ad-4760-958c-44bbde584588 805 0 2025-09-12 17:23:35 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426.1.0-a-1fe763f55e coredns-674b8bbfcf-l8q56 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8fc9c31dbb6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" Namespace="kube-system" Pod="coredns-674b8bbfcf-l8q56" 
WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-" Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.183 [INFO][5033] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" Namespace="kube-system" Pod="coredns-674b8bbfcf-l8q56" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-eth0" Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.214 [INFO][5049] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" HandleID="k8s-pod-network.c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" Workload="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-eth0" Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.214 [INFO][5049] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" HandleID="k8s-pod-network.c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" Workload="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b610), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426.1.0-a-1fe763f55e", "pod":"coredns-674b8bbfcf-l8q56", "timestamp":"2025-09-12 17:24:12.214013123 +0000 UTC"}, Hostname:"ci-4426.1.0-a-1fe763f55e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.214 [INFO][5049] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.214 [INFO][5049] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.214 [INFO][5049] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-1fe763f55e' Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.233 [INFO][5049] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.236 [INFO][5049] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.240 [INFO][5049] ipam/ipam.go 511: Trying affinity for 192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.242 [INFO][5049] ipam/ipam.go 158: Attempting to load block cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.244 [INFO][5049] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.244 [INFO][5049] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.97.64/26 handle="k8s-pod-network.c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.246 [INFO][5049] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9 Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.257 [INFO][5049] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.97.64/26 handle="k8s-pod-network.c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.266 [INFO][5049] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.97.71/26] block=192.168.97.64/26 handle="k8s-pod-network.c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.266 [INFO][5049] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.97.71/26] handle="k8s-pod-network.c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.266 [INFO][5049] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:24:12.295479 containerd[1886]: 2025-09-12 17:24:12.266 [INFO][5049] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.97.71/26] IPv6=[] ContainerID="c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" HandleID="k8s-pod-network.c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" Workload="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-eth0" Sep 12 17:24:12.295841 containerd[1886]: 2025-09-12 17:24:12.268 [INFO][5033] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" Namespace="kube-system" Pod="coredns-674b8bbfcf-l8q56" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9a5be5b5-06ad-4760-958c-44bbde584588", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"", Pod:"coredns-674b8bbfcf-l8q56", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.97.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8fc9c31dbb6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:12.295841 containerd[1886]: 2025-09-12 17:24:12.268 [INFO][5033] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.71/32] ContainerID="c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" Namespace="kube-system" Pod="coredns-674b8bbfcf-l8q56" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-eth0" Sep 12 17:24:12.295841 containerd[1886]: 2025-09-12 17:24:12.268 [INFO][5033] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8fc9c31dbb6 ContainerID="c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" Namespace="kube-system" Pod="coredns-674b8bbfcf-l8q56" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-eth0" Sep 12 17:24:12.295841 containerd[1886]: 2025-09-12 17:24:12.272 [INFO][5033] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" Namespace="kube-system" Pod="coredns-674b8bbfcf-l8q56" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-eth0" Sep 12 17:24:12.295841 containerd[1886]: 2025-09-12 17:24:12.273 [INFO][5033] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" Namespace="kube-system" Pod="coredns-674b8bbfcf-l8q56" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9a5be5b5-06ad-4760-958c-44bbde584588", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9", Pod:"coredns-674b8bbfcf-l8q56", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.97.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8fc9c31dbb6", 
MAC:"da:f3:d6:82:05:69", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:12.295841 containerd[1886]: 2025-09-12 17:24:12.291 [INFO][5033] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" Namespace="kube-system" Pod="coredns-674b8bbfcf-l8q56" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-coredns--674b8bbfcf--l8q56-eth0" Sep 12 17:24:12.801343 systemd-networkd[1697]: cali4db9d496e59: Gained IPv6LL Sep 12 17:24:12.865251 systemd-networkd[1697]: cali6549a118439: Gained IPv6LL Sep 12 17:24:13.002321 containerd[1886]: time="2025-09-12T17:24:13.002277004Z" level=info msg="connecting to shim b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286" address="unix:///run/containerd/s/80e6ad67b969f1ac9b29ad221b0f40ea6405af8515600fa709f09f43aa7d5dd6" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:13.017248 systemd[1]: Started cri-containerd-b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286.scope - libcontainer container b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286. 
Sep 12 17:24:13.377334 systemd-networkd[1697]: caliae03d262ba8: Gained IPv6LL Sep 12 17:24:13.387443 containerd[1886]: time="2025-09-12T17:24:13.387397564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-m5hn4,Uid:22a9e9f7-3d62-4a87-b321-c5550b85dec3,Namespace:calico-system,Attempt:0,} returns sandbox id \"b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286\"" Sep 12 17:24:13.633404 systemd-networkd[1697]: cali8fc9c31dbb6: Gained IPv6LL Sep 12 17:24:13.644292 containerd[1886]: time="2025-09-12T17:24:13.644170161Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-76j79,Uid:8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae,Namespace:calico-system,Attempt:0,}" Sep 12 17:24:13.754741 containerd[1886]: time="2025-09-12T17:24:13.754666469Z" level=info msg="connecting to shim 76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a" address="unix:///run/containerd/s/f19bfc397a0ac57ff2d96d775e90fe43e4247d8484fd28957725811f418f31a9" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:13.761274 systemd-networkd[1697]: cali6010c20b0ba: Gained IPv6LL Sep 12 17:24:13.776248 systemd[1]: Started cri-containerd-76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a.scope - libcontainer container 76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a. Sep 12 17:24:13.902920 containerd[1886]: time="2025-09-12T17:24:13.902834052Z" level=info msg="connecting to shim 96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6" address="unix:///run/containerd/s/585ae5eee4c79e4017acf7fb07f7a8bf41619d693d57ea41dc35436da6c7f354" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:13.920251 systemd[1]: Started cri-containerd-96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6.scope - libcontainer container 96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6. 
Sep 12 17:24:14.094519 containerd[1886]: time="2025-09-12T17:24:14.094428369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b99b7b99b-gplt4,Uid:7a27628a-082d-4c3d-904c-c73dd3edd18e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a\"" Sep 12 17:24:14.106227 containerd[1886]: time="2025-09-12T17:24:14.106165731Z" level=info msg="connecting to shim 77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02" address="unix:///run/containerd/s/f96a0badd38fa45d15a228f7188aa7435a56bfd1c4535cd4e4e2975ed8b86f3e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:14.125243 systemd[1]: Started cri-containerd-77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02.scope - libcontainer container 77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02. Sep 12 17:24:14.252585 systemd-networkd[1697]: cali1bbbe5b6467: Link UP Sep 12 17:24:14.253431 systemd-networkd[1697]: cali1bbbe5b6467: Gained carrier Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.166 [INFO][5249] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-eth0 csi-node-driver- calico-system 8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae 699 0 2025-09-12 17:23:47 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4426.1.0-a-1fe763f55e csi-node-driver-76j79 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali1bbbe5b6467 [] [] }} ContainerID="8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" Namespace="calico-system" Pod="csi-node-driver-76j79" 
WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-" Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.166 [INFO][5249] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" Namespace="calico-system" Pod="csi-node-driver-76j79" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-eth0" Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.185 [INFO][5266] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" HandleID="k8s-pod-network.8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" Workload="ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-eth0" Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.185 [INFO][5266] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" HandleID="k8s-pod-network.8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" Workload="ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024afa0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426.1.0-a-1fe763f55e", "pod":"csi-node-driver-76j79", "timestamp":"2025-09-12 17:24:14.185387431 +0000 UTC"}, Hostname:"ci-4426.1.0-a-1fe763f55e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.185 [INFO][5266] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.185 [INFO][5266] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.185 [INFO][5266] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426.1.0-a-1fe763f55e' Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.190 [INFO][5266] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.200 [INFO][5266] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.210 [INFO][5266] ipam/ipam.go 511: Trying affinity for 192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.213 [INFO][5266] ipam/ipam.go 158: Attempting to load block cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.215 [INFO][5266] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.97.64/26 host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.215 [INFO][5266] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.97.64/26 handle="k8s-pod-network.8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.217 [INFO][5266] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00 Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.224 [INFO][5266] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.97.64/26 handle="k8s-pod-network.8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.240 [INFO][5266] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.97.72/26] block=192.168.97.64/26 handle="k8s-pod-network.8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.240 [INFO][5266] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.97.72/26] handle="k8s-pod-network.8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" host="ci-4426.1.0-a-1fe763f55e" Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.240 [INFO][5266] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:24:14.271085 containerd[1886]: 2025-09-12 17:24:14.240 [INFO][5266] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.97.72/26] IPv6=[] ContainerID="8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" HandleID="k8s-pod-network.8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" Workload="ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-eth0" Sep 12 17:24:14.272039 containerd[1886]: 2025-09-12 17:24:14.244 [INFO][5249] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" Namespace="calico-system" Pod="csi-node-driver-76j79" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae", ResourceVersion:"699", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"", Pod:"csi-node-driver-76j79", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.97.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1bbbe5b6467", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:14.272039 containerd[1886]: 2025-09-12 17:24:14.244 [INFO][5249] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.72/32] ContainerID="8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" Namespace="calico-system" Pod="csi-node-driver-76j79" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-eth0" Sep 12 17:24:14.272039 containerd[1886]: 2025-09-12 17:24:14.244 [INFO][5249] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1bbbe5b6467 ContainerID="8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" Namespace="calico-system" Pod="csi-node-driver-76j79" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-eth0" Sep 12 17:24:14.272039 containerd[1886]: 2025-09-12 17:24:14.254 [INFO][5249] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" Namespace="calico-system" Pod="csi-node-driver-76j79" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-eth0" Sep 12 17:24:14.272039 
containerd[1886]: 2025-09-12 17:24:14.254 [INFO][5249] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" Namespace="calico-system" Pod="csi-node-driver-76j79" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae", ResourceVersion:"699", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 23, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426.1.0-a-1fe763f55e", ContainerID:"8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00", Pod:"csi-node-driver-76j79", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.97.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1bbbe5b6467", MAC:"52:3b:f6:ee:e8:c6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:24:14.272039 containerd[1886]: 
2025-09-12 17:24:14.269 [INFO][5249] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" Namespace="calico-system" Pod="csi-node-driver-76j79" WorkloadEndpoint="ci--4426.1.0--a--1fe763f55e-k8s-csi--node--driver--76j79-eth0" Sep 12 17:24:15.147570 containerd[1886]: time="2025-09-12T17:24:15.147529524Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-996756887-nfmh2,Uid:012d6e27-14a7-44d7-b5f3-972242703ab4,Namespace:calico-system,Attempt:0,} returns sandbox id \"96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6\"" Sep 12 17:24:15.937361 systemd-networkd[1697]: cali1bbbe5b6467: Gained IPv6LL Sep 12 17:24:16.433554 kubelet[3450]: I0912 17:24:16.433514 3450 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:24:16.493055 containerd[1886]: time="2025-09-12T17:24:16.493008377Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67\" id:\"b1e62c58c3bab2fa5d1de45d89dac41f1c942a0962acf486d27f1b1395d28853\" pid:5301 exited_at:{seconds:1757697856 nanos:492539158}" Sep 12 17:24:16.561689 containerd[1886]: time="2025-09-12T17:24:16.561562317Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67\" id:\"2c15c537eddb5e4f2688323407e10ae13244071a91657d506432ee345efa320b\" pid:5326 exited_at:{seconds:1757697856 nanos:561346040}" Sep 12 17:24:17.403258 containerd[1886]: time="2025-09-12T17:24:17.403218752Z" level=info msg="connecting to shim c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9" address="unix:///run/containerd/s/ce1512076c4badc009eaa56fae5e4214ec89b5c48eaaea7ab7881b06a40cb8d0" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:17.420255 systemd[1]: Started 
cri-containerd-c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9.scope - libcontainer container c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9. Sep 12 17:24:17.448826 containerd[1886]: time="2025-09-12T17:24:17.448787272Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-q58t7,Uid:ccd2bbe7-dc74-41c7-8369-074eb02b14c4,Namespace:kube-system,Attempt:0,} returns sandbox id \"77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02\"" Sep 12 17:24:17.544546 containerd[1886]: time="2025-09-12T17:24:17.544501404Z" level=info msg="CreateContainer within sandbox \"77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:24:17.648472 containerd[1886]: time="2025-09-12T17:24:17.648372284Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-l8q56,Uid:9a5be5b5-06ad-4760-958c-44bbde584588,Namespace:kube-system,Attempt:0,} returns sandbox id \"c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9\"" Sep 12 17:24:17.741109 containerd[1886]: time="2025-09-12T17:24:17.740612774Z" level=info msg="CreateContainer within sandbox \"c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:24:18.343139 containerd[1886]: time="2025-09-12T17:24:18.342903110Z" level=info msg="Container 589b9679fe185dff8ec0adcb607c23c6bf6e7e1beaf7cce13c3b27c01f82c591: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:24:18.395660 containerd[1886]: time="2025-09-12T17:24:18.395631633Z" level=info msg="Container bd230ce6a52c2b98a65d7df2d23acce524d15f817d3fc80e1f50814772f83a37: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:24:18.698149 containerd[1886]: time="2025-09-12T17:24:18.698089213Z" level=info msg="connecting to shim 8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00" 
address="unix:///run/containerd/s/0b1cc6a01282ae348d8efb12b0247e9873672abe7640be9e56649cf407ee789f" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:24:18.720339 systemd[1]: Started cri-containerd-8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00.scope - libcontainer container 8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00. Sep 12 17:24:18.845366 containerd[1886]: time="2025-09-12T17:24:18.845264582Z" level=info msg="CreateContainer within sandbox \"77b9a175d0c12a00a17714bb112c6a180552b5382cd34e9fb5ae01c3d16e2c02\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"589b9679fe185dff8ec0adcb607c23c6bf6e7e1beaf7cce13c3b27c01f82c591\"" Sep 12 17:24:18.846177 containerd[1886]: time="2025-09-12T17:24:18.846124914Z" level=info msg="StartContainer for \"589b9679fe185dff8ec0adcb607c23c6bf6e7e1beaf7cce13c3b27c01f82c591\"" Sep 12 17:24:18.889280 containerd[1886]: time="2025-09-12T17:24:18.889251240Z" level=info msg="connecting to shim 589b9679fe185dff8ec0adcb607c23c6bf6e7e1beaf7cce13c3b27c01f82c591" address="unix:///run/containerd/s/f96a0badd38fa45d15a228f7188aa7435a56bfd1c4535cd4e4e2975ed8b86f3e" protocol=ttrpc version=3 Sep 12 17:24:18.893628 containerd[1886]: time="2025-09-12T17:24:18.893599823Z" level=info msg="CreateContainer within sandbox \"c6f0a2a453c4a4af50ca479d43493be55b997f22eb99487824454da2441a67f9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"bd230ce6a52c2b98a65d7df2d23acce524d15f817d3fc80e1f50814772f83a37\"" Sep 12 17:24:18.895892 containerd[1886]: time="2025-09-12T17:24:18.895705394Z" level=info msg="StartContainer for \"bd230ce6a52c2b98a65d7df2d23acce524d15f817d3fc80e1f50814772f83a37\"" Sep 12 17:24:18.897076 containerd[1886]: time="2025-09-12T17:24:18.896992872Z" level=info msg="connecting to shim bd230ce6a52c2b98a65d7df2d23acce524d15f817d3fc80e1f50814772f83a37" address="unix:///run/containerd/s/ce1512076c4badc009eaa56fae5e4214ec89b5c48eaaea7ab7881b06a40cb8d0" protocol=ttrpc 
version=3 Sep 12 17:24:18.915230 systemd[1]: Started cri-containerd-589b9679fe185dff8ec0adcb607c23c6bf6e7e1beaf7cce13c3b27c01f82c591.scope - libcontainer container 589b9679fe185dff8ec0adcb607c23c6bf6e7e1beaf7cce13c3b27c01f82c591. Sep 12 17:24:18.918378 systemd[1]: Started cri-containerd-bd230ce6a52c2b98a65d7df2d23acce524d15f817d3fc80e1f50814772f83a37.scope - libcontainer container bd230ce6a52c2b98a65d7df2d23acce524d15f817d3fc80e1f50814772f83a37. Sep 12 17:24:18.944642 containerd[1886]: time="2025-09-12T17:24:18.944601601Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-76j79,Uid:8f03c06d-da1b-4e3e-bc09-a87a7cbc90ae,Namespace:calico-system,Attempt:0,} returns sandbox id \"8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00\"" Sep 12 17:24:19.097627 containerd[1886]: time="2025-09-12T17:24:19.097105761Z" level=info msg="StartContainer for \"bd230ce6a52c2b98a65d7df2d23acce524d15f817d3fc80e1f50814772f83a37\" returns successfully" Sep 12 17:24:19.097627 containerd[1886]: time="2025-09-12T17:24:19.097144546Z" level=info msg="StartContainer for \"589b9679fe185dff8ec0adcb607c23c6bf6e7e1beaf7cce13c3b27c01f82c591\" returns successfully" Sep 12 17:24:19.866823 kubelet[3450]: I0912 17:24:19.866675 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-l8q56" podStartSLOduration=44.866659106 podStartE2EDuration="44.866659106s" podCreationTimestamp="2025-09-12 17:23:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:24:19.832484835 +0000 UTC m=+50.603231961" watchObservedRunningTime="2025-09-12 17:24:19.866659106 +0000 UTC m=+50.637406232" Sep 12 17:24:24.441888 containerd[1886]: time="2025-09-12T17:24:24.441782730Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:24.446980 
containerd[1886]: time="2025-09-12T17:24:24.446951778Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 12 17:24:24.491914 containerd[1886]: time="2025-09-12T17:24:24.491865397Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:24.540886 containerd[1886]: time="2025-09-12T17:24:24.540834952Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:24.541949 containerd[1886]: time="2025-09-12T17:24:24.541845895Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 13.346413638s" Sep 12 17:24:24.541949 containerd[1886]: time="2025-09-12T17:24:24.541872144Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 17:24:24.542935 containerd[1886]: time="2025-09-12T17:24:24.542893047Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 17:24:24.647730 containerd[1886]: time="2025-09-12T17:24:24.647284241Z" level=info msg="CreateContainer within sandbox \"a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:24:24.795751 containerd[1886]: time="2025-09-12T17:24:24.795672266Z" level=info msg="Container 
37e178120d6799715d1fb0e5a332defd05251669b83c39013898fda0590a9dde: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:24:24.995265 containerd[1886]: time="2025-09-12T17:24:24.995221711Z" level=info msg="CreateContainer within sandbox \"a10cdec282e69dd71dcb363a579db3e1c7b7b23ef6998b082776d1fd0e4cf691\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"37e178120d6799715d1fb0e5a332defd05251669b83c39013898fda0590a9dde\"" Sep 12 17:24:24.996084 containerd[1886]: time="2025-09-12T17:24:24.995993785Z" level=info msg="StartContainer for \"37e178120d6799715d1fb0e5a332defd05251669b83c39013898fda0590a9dde\"" Sep 12 17:24:24.997657 containerd[1886]: time="2025-09-12T17:24:24.997632703Z" level=info msg="connecting to shim 37e178120d6799715d1fb0e5a332defd05251669b83c39013898fda0590a9dde" address="unix:///run/containerd/s/4cc98fc36bb219c40bef515c6894f783a00e5fca5d819f0c4d09905b3dbdeb1c" protocol=ttrpc version=3 Sep 12 17:24:25.020252 systemd[1]: Started cri-containerd-37e178120d6799715d1fb0e5a332defd05251669b83c39013898fda0590a9dde.scope - libcontainer container 37e178120d6799715d1fb0e5a332defd05251669b83c39013898fda0590a9dde. 
Sep 12 17:24:25.092268 containerd[1886]: time="2025-09-12T17:24:25.092181812Z" level=info msg="StartContainer for \"37e178120d6799715d1fb0e5a332defd05251669b83c39013898fda0590a9dde\" returns successfully" Sep 12 17:24:25.835268 kubelet[3450]: I0912 17:24:25.835215 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-q58t7" podStartSLOduration=50.835198863 podStartE2EDuration="50.835198863s" podCreationTimestamp="2025-09-12 17:23:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:24:19.9020844 +0000 UTC m=+50.672831526" watchObservedRunningTime="2025-09-12 17:24:25.835198863 +0000 UTC m=+56.605945989" Sep 12 17:24:26.824629 kubelet[3450]: I0912 17:24:26.824596 3450 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:24:27.499437 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1805378745.mount: Deactivated successfully. 
Sep 12 17:24:28.092150 containerd[1886]: time="2025-09-12T17:24:28.091756128Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:28.095553 containerd[1886]: time="2025-09-12T17:24:28.095518079Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 12 17:24:28.101198 containerd[1886]: time="2025-09-12T17:24:28.101043719Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:28.111157 containerd[1886]: time="2025-09-12T17:24:28.110920805Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:24:28.112508 containerd[1886]: time="2025-09-12T17:24:28.112230859Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 3.569316499s" Sep 12 17:24:28.112587 containerd[1886]: time="2025-09-12T17:24:28.112512098Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 12 17:24:28.113572 containerd[1886]: time="2025-09-12T17:24:28.113545938Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 17:24:28.122028 containerd[1886]: time="2025-09-12T17:24:28.122002022Z" level=info msg="CreateContainer within sandbox 
\"b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 17:24:28.144335 containerd[1886]: time="2025-09-12T17:24:28.143819873Z" level=info msg="Container 80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:24:28.164717 containerd[1886]: time="2025-09-12T17:24:28.164678958Z" level=info msg="CreateContainer within sandbox \"b718b60b081d8ad32ca419c14e2e164a490f6c9c964c42d9b0d054f9eab24286\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\"" Sep 12 17:24:28.166149 containerd[1886]: time="2025-09-12T17:24:28.165065775Z" level=info msg="StartContainer for \"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\"" Sep 12 17:24:28.166862 containerd[1886]: time="2025-09-12T17:24:28.166835320Z" level=info msg="connecting to shim 80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403" address="unix:///run/containerd/s/80e6ad67b969f1ac9b29ad221b0f40ea6405af8515600fa709f09f43aa7d5dd6" protocol=ttrpc version=3 Sep 12 17:24:28.205247 systemd[1]: Started cri-containerd-80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403.scope - libcontainer container 80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403. 
Sep 12 17:24:28.243378 containerd[1886]: time="2025-09-12T17:24:28.243302921Z" level=info msg="StartContainer for \"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\" returns successfully"
Sep 12 17:24:28.478342 containerd[1886]: time="2025-09-12T17:24:28.478295614Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:28.482964 containerd[1886]: time="2025-09-12T17:24:28.482934314Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 12 17:24:28.483780 containerd[1886]: time="2025-09-12T17:24:28.483751237Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 370.181163ms"
Sep 12 17:24:28.483877 containerd[1886]: time="2025-09-12T17:24:28.483782910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 12 17:24:28.485003 containerd[1886]: time="2025-09-12T17:24:28.484981258Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\""
Sep 12 17:24:28.495600 containerd[1886]: time="2025-09-12T17:24:28.495571936Z" level=info msg="CreateContainer within sandbox \"76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 12 17:24:28.520728 containerd[1886]: time="2025-09-12T17:24:28.519901637Z" level=info msg="Container 37bf7a1d8f64d601a2d5b5fa53190aa43c4d164d169cc812a3fc85eedf0a6b87: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:24:28.521096 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount859042088.mount: Deactivated successfully.
Sep 12 17:24:28.545230 containerd[1886]: time="2025-09-12T17:24:28.545122775Z" level=info msg="CreateContainer within sandbox \"76eecb081cf5d5744f03a9a033e81922db27978ef76c9030a401a6b7f753026a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"37bf7a1d8f64d601a2d5b5fa53190aa43c4d164d169cc812a3fc85eedf0a6b87\""
Sep 12 17:24:28.546154 containerd[1886]: time="2025-09-12T17:24:28.545848640Z" level=info msg="StartContainer for \"37bf7a1d8f64d601a2d5b5fa53190aa43c4d164d169cc812a3fc85eedf0a6b87\""
Sep 12 17:24:28.546749 containerd[1886]: time="2025-09-12T17:24:28.546730173Z" level=info msg="connecting to shim 37bf7a1d8f64d601a2d5b5fa53190aa43c4d164d169cc812a3fc85eedf0a6b87" address="unix:///run/containerd/s/f19bfc397a0ac57ff2d96d775e90fe43e4247d8484fd28957725811f418f31a9" protocol=ttrpc version=3
Sep 12 17:24:28.571254 systemd[1]: Started cri-containerd-37bf7a1d8f64d601a2d5b5fa53190aa43c4d164d169cc812a3fc85eedf0a6b87.scope - libcontainer container 37bf7a1d8f64d601a2d5b5fa53190aa43c4d164d169cc812a3fc85eedf0a6b87.
Sep 12 17:24:28.612121 containerd[1886]: time="2025-09-12T17:24:28.612088212Z" level=info msg="StartContainer for \"37bf7a1d8f64d601a2d5b5fa53190aa43c4d164d169cc812a3fc85eedf0a6b87\" returns successfully"
Sep 12 17:24:28.850346 kubelet[3450]: I0912 17:24:28.850130 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6b99b7b99b-58b54" podStartSLOduration=31.502458723 podStartE2EDuration="44.850108559s" podCreationTimestamp="2025-09-12 17:23:44 +0000 UTC" firstStartedPulling="2025-09-12 17:24:11.194960309 +0000 UTC m=+41.965707435" lastFinishedPulling="2025-09-12 17:24:24.542610145 +0000 UTC m=+55.313357271" observedRunningTime="2025-09-12 17:24:25.835979321 +0000 UTC m=+56.606726463" watchObservedRunningTime="2025-09-12 17:24:28.850108559 +0000 UTC m=+59.620855685"
Sep 12 17:24:28.852012 kubelet[3450]: I0912 17:24:28.851827 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6b99b7b99b-gplt4" podStartSLOduration=30.463263341 podStartE2EDuration="44.851816695s" podCreationTimestamp="2025-09-12 17:23:44 +0000 UTC" firstStartedPulling="2025-09-12 17:24:14.095949309 +0000 UTC m=+44.866696435" lastFinishedPulling="2025-09-12 17:24:28.484502663 +0000 UTC m=+59.255249789" observedRunningTime="2025-09-12 17:24:28.84835743 +0000 UTC m=+59.619104556" watchObservedRunningTime="2025-09-12 17:24:28.851816695 +0000 UTC m=+59.622563917"
Sep 12 17:24:29.139974 containerd[1886]: time="2025-09-12T17:24:29.139916310Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\" id:\"db7a8e4b2225603e2a06548fd624fd6e2fa5138655f4fe968559794f7982912f\" pid:5653 exit_status:1 exited_at:{seconds:1757697869 nanos:138185501}"
Sep 12 17:24:29.729777 kubelet[3450]: I0912 17:24:29.729660 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-m5hn4" podStartSLOduration=28.004965739 podStartE2EDuration="42.729646982s" podCreationTimestamp="2025-09-12 17:23:47 +0000 UTC" firstStartedPulling="2025-09-12 17:24:13.388602121 +0000 UTC m=+44.159349247" lastFinishedPulling="2025-09-12 17:24:28.113283364 +0000 UTC m=+58.884030490" observedRunningTime="2025-09-12 17:24:28.865295864 +0000 UTC m=+59.636042990" watchObservedRunningTime="2025-09-12 17:24:29.729646982 +0000 UTC m=+60.500394108"
Sep 12 17:24:29.894855 containerd[1886]: time="2025-09-12T17:24:29.894720051Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\" id:\"e381524034a1b26b2537d672b7e4d419f4f8bdc55a4ef2e8b2c07c183b1e2073\" pid:5682 exit_status:1 exited_at:{seconds:1757697869 nanos:894551495}"
Sep 12 17:24:32.817836 containerd[1886]: time="2025-09-12T17:24:32.817650149Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:32.821539 containerd[1886]: time="2025-09-12T17:24:32.821510263Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957"
Sep 12 17:24:32.825272 containerd[1886]: time="2025-09-12T17:24:32.825245158Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:32.831368 containerd[1886]: time="2025-09-12T17:24:32.831333567Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 4.346327669s"
Sep 12 17:24:32.831368 containerd[1886]: time="2025-09-12T17:24:32.831366344Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\""
Sep 12 17:24:32.831673 containerd[1886]: time="2025-09-12T17:24:32.831473850Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:32.833793 containerd[1886]: time="2025-09-12T17:24:32.833541790Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\""
Sep 12 17:24:32.861096 containerd[1886]: time="2025-09-12T17:24:32.861068751Z" level=info msg="CreateContainer within sandbox \"96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Sep 12 17:24:32.895219 containerd[1886]: time="2025-09-12T17:24:32.895187411Z" level=info msg="Container 496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:24:32.896855 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2734091694.mount: Deactivated successfully.
Sep 12 17:24:32.917909 containerd[1886]: time="2025-09-12T17:24:32.917875221Z" level=info msg="CreateContainer within sandbox \"96aa2141fa3f6c4c4bba5a9065f264a106dec9509e371378b1bfc56af3a1abb6\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\""
Sep 12 17:24:32.919172 containerd[1886]: time="2025-09-12T17:24:32.918359487Z" level=info msg="StartContainer for \"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\""
Sep 12 17:24:32.919172 containerd[1886]: time="2025-09-12T17:24:32.919075422Z" level=info msg="connecting to shim 496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe" address="unix:///run/containerd/s/585ae5eee4c79e4017acf7fb07f7a8bf41619d693d57ea41dc35436da6c7f354" protocol=ttrpc version=3
Sep 12 17:24:32.943378 systemd[1]: Started cri-containerd-496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe.scope - libcontainer container 496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe.
Sep 12 17:24:32.986702 containerd[1886]: time="2025-09-12T17:24:32.986629065Z" level=info msg="StartContainer for \"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\" returns successfully"
Sep 12 17:24:33.901004 containerd[1886]: time="2025-09-12T17:24:33.900963567Z" level=info msg="TaskExit event in podsandbox handler container_id:\"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\" id:\"578a013265b251f6f49ee52fd31f0680c9a83cb0a2bb4e8bf7752048d02746e3\" pid:5751 exited_at:{seconds:1757697873 nanos:900606527}"
Sep 12 17:24:33.915631 kubelet[3450]: I0912 17:24:33.915574 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-996756887-nfmh2" podStartSLOduration=29.232525822 podStartE2EDuration="46.915559693s" podCreationTimestamp="2025-09-12 17:23:47 +0000 UTC" firstStartedPulling="2025-09-12 17:24:15.149032632 +0000 UTC m=+45.919779758" lastFinishedPulling="2025-09-12 17:24:32.832066503 +0000 UTC m=+63.602813629" observedRunningTime="2025-09-12 17:24:33.875716407 +0000 UTC m=+64.646463541" watchObservedRunningTime="2025-09-12 17:24:33.915559693 +0000 UTC m=+64.686306819"
Sep 12 17:24:34.756199 containerd[1886]: time="2025-09-12T17:24:34.756111164Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:34.760052 containerd[1886]: time="2025-09-12T17:24:34.760017167Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489"
Sep 12 17:24:34.764368 containerd[1886]: time="2025-09-12T17:24:34.764326971Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:34.769370 containerd[1886]: time="2025-09-12T17:24:34.769333277Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:34.769856 containerd[1886]: time="2025-09-12T17:24:34.769682061Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.936115357s"
Sep 12 17:24:34.769856 containerd[1886]: time="2025-09-12T17:24:34.769710309Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\""
Sep 12 17:24:34.778692 containerd[1886]: time="2025-09-12T17:24:34.778671387Z" level=info msg="CreateContainer within sandbox \"8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Sep 12 17:24:34.810446 containerd[1886]: time="2025-09-12T17:24:34.810278298Z" level=info msg="Container 21adf1de668cbf4f2ebedfdb4922c7c61d137fbaaa347e719276839372358c67: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:24:34.812539 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount73957344.mount: Deactivated successfully.
Sep 12 17:24:34.842054 containerd[1886]: time="2025-09-12T17:24:34.842017132Z" level=info msg="CreateContainer within sandbox \"8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"21adf1de668cbf4f2ebedfdb4922c7c61d137fbaaa347e719276839372358c67\""
Sep 12 17:24:34.843605 containerd[1886]: time="2025-09-12T17:24:34.843577333Z" level=info msg="StartContainer for \"21adf1de668cbf4f2ebedfdb4922c7c61d137fbaaa347e719276839372358c67\""
Sep 12 17:24:34.845829 containerd[1886]: time="2025-09-12T17:24:34.845737147Z" level=info msg="connecting to shim 21adf1de668cbf4f2ebedfdb4922c7c61d137fbaaa347e719276839372358c67" address="unix:///run/containerd/s/0b1cc6a01282ae348d8efb12b0247e9873672abe7640be9e56649cf407ee789f" protocol=ttrpc version=3
Sep 12 17:24:34.873249 systemd[1]: Started cri-containerd-21adf1de668cbf4f2ebedfdb4922c7c61d137fbaaa347e719276839372358c67.scope - libcontainer container 21adf1de668cbf4f2ebedfdb4922c7c61d137fbaaa347e719276839372358c67.
Sep 12 17:24:34.930412 containerd[1886]: time="2025-09-12T17:24:34.930381273Z" level=info msg="StartContainer for \"21adf1de668cbf4f2ebedfdb4922c7c61d137fbaaa347e719276839372358c67\" returns successfully"
Sep 12 17:24:34.932326 containerd[1886]: time="2025-09-12T17:24:34.932288345Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 12 17:24:36.961744 containerd[1886]: time="2025-09-12T17:24:36.961692443Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:36.965496 containerd[1886]: time="2025-09-12T17:24:36.965463195Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208"
Sep 12 17:24:36.969494 containerd[1886]: time="2025-09-12T17:24:36.969439336Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:36.974646 containerd[1886]: time="2025-09-12T17:24:36.974602313Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:24:36.975536 containerd[1886]: time="2025-09-12T17:24:36.975062363Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 2.042746553s"
Sep 12 17:24:36.975536 containerd[1886]: time="2025-09-12T17:24:36.975086908Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\""
Sep 12 17:24:36.985359 containerd[1886]: time="2025-09-12T17:24:36.985334507Z" level=info msg="CreateContainer within sandbox \"8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 12 17:24:37.011144 containerd[1886]: time="2025-09-12T17:24:37.011033020Z" level=info msg="Container d942a20b8216767ee66254bf21f1cc4fd48860532e250a6e5696291cf0625440: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:24:37.032931 containerd[1886]: time="2025-09-12T17:24:37.032907012Z" level=info msg="CreateContainer within sandbox \"8bed2e8a65a090827db6f89e7652b97959879cf46960db60d1849b76bcc9db00\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"d942a20b8216767ee66254bf21f1cc4fd48860532e250a6e5696291cf0625440\""
Sep 12 17:24:37.033849 containerd[1886]: time="2025-09-12T17:24:37.033437280Z" level=info msg="StartContainer for \"d942a20b8216767ee66254bf21f1cc4fd48860532e250a6e5696291cf0625440\""
Sep 12 17:24:37.034722 containerd[1886]: time="2025-09-12T17:24:37.034681461Z" level=info msg="connecting to shim d942a20b8216767ee66254bf21f1cc4fd48860532e250a6e5696291cf0625440" address="unix:///run/containerd/s/0b1cc6a01282ae348d8efb12b0247e9873672abe7640be9e56649cf407ee789f" protocol=ttrpc version=3
Sep 12 17:24:37.051246 systemd[1]: Started cri-containerd-d942a20b8216767ee66254bf21f1cc4fd48860532e250a6e5696291cf0625440.scope - libcontainer container d942a20b8216767ee66254bf21f1cc4fd48860532e250a6e5696291cf0625440.
Sep 12 17:24:37.092151 containerd[1886]: time="2025-09-12T17:24:37.091780580Z" level=info msg="StartContainer for \"d942a20b8216767ee66254bf21f1cc4fd48860532e250a6e5696291cf0625440\" returns successfully"
Sep 12 17:24:37.737230 kubelet[3450]: I0912 17:24:37.737198 3450 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 12 17:24:37.740982 kubelet[3450]: I0912 17:24:37.740964 3450 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 12 17:24:37.884263 kubelet[3450]: I0912 17:24:37.884209 3450 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-76j79" podStartSLOduration=32.856834261 podStartE2EDuration="50.884196285s" podCreationTimestamp="2025-09-12 17:23:47 +0000 UTC" firstStartedPulling="2025-09-12 17:24:18.948504438 +0000 UTC m=+49.719251572" lastFinishedPulling="2025-09-12 17:24:36.97586647 +0000 UTC m=+67.746613596" observedRunningTime="2025-09-12 17:24:37.883891702 +0000 UTC m=+68.654638868" watchObservedRunningTime="2025-09-12 17:24:37.884196285 +0000 UTC m=+68.654943411"
Sep 12 17:24:42.791360 kubelet[3450]: I0912 17:24:42.791260 3450 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 17:24:46.711310 containerd[1886]: time="2025-09-12T17:24:46.711090980Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67\" id:\"b87b316a7192241c30ab4d338aae794452c6dcd71b1183755a4b17823d188bb6\" pid:5856 exited_at:{seconds:1757697886 nanos:710567443}"
Sep 12 17:24:59.927559 containerd[1886]: time="2025-09-12T17:24:59.927484381Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\" id:\"a503b817618fc2371238dce736829f5ffded874fcdc70d15bd203768e85bb34b\" pid:5883 exited_at:{seconds:1757697899 nanos:926009460}"
Sep 12 17:25:03.875512 containerd[1886]: time="2025-09-12T17:25:03.875385539Z" level=info msg="TaskExit event in podsandbox handler container_id:\"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\" id:\"efd48dbe2bef0d5503b5486c7a4c21873f90694126c4070fc33fb4b8e05ee432\" pid:5909 exited_at:{seconds:1757697903 nanos:875059884}"
Sep 12 17:25:16.553867 containerd[1886]: time="2025-09-12T17:25:16.553828230Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67\" id:\"8af3435b9a54eb153e143c5cbf33521c2bde58a611daec46c46afd68f3f4dd8a\" pid:5934 exited_at:{seconds:1757697916 nanos:553602450}"
Sep 12 17:25:28.445895 containerd[1886]: time="2025-09-12T17:25:28.445852616Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\" id:\"675e1f889f376ca3de7a58082a0da2b49ba73b691b620b9adbf31544510593e7\" pid:5966 exited_at:{seconds:1757697928 nanos:445618875}"
Sep 12 17:25:29.890420 containerd[1886]: time="2025-09-12T17:25:29.890243696Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\" id:\"307399533dd0a8a944b5a13bfccf40595d16cdcc2de678596ac201603e1a1e55\" pid:5990 exited_at:{seconds:1757697929 nanos:889982698}"
Sep 12 17:25:33.871047 containerd[1886]: time="2025-09-12T17:25:33.871004333Z" level=info msg="TaskExit event in podsandbox handler container_id:\"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\" id:\"1630aad9870d208854fea64c41a54cd7b0840ac7c186efc4437325a94eb56fe1\" pid:6010 exited_at:{seconds:1757697933 nanos:870708231}"
Sep 12 17:25:40.904492 containerd[1886]: time="2025-09-12T17:25:40.904427138Z" level=info msg="TaskExit event in podsandbox handler container_id:\"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\" id:\"9619731167fc124f8cd3d1791c7d52e01660ed8d507cc6d3b1df9cd956788758\" pid:6043 exited_at:{seconds:1757697940 nanos:904043730}"
Sep 12 17:25:46.555712 containerd[1886]: time="2025-09-12T17:25:46.555670803Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67\" id:\"6ae7562b7ae210615a117bdf4ddbe639c14d227bed88d281afa4b5f999451c2b\" pid:6078 exited_at:{seconds:1757697946 nanos:555441167}"
Sep 12 17:25:59.890463 containerd[1886]: time="2025-09-12T17:25:59.890418247Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\" id:\"ff52d6ef69b5c1d1fa70cec86235e292a1444242492ced0753f1efae869396ab\" pid:6102 exited_at:{seconds:1757697959 nanos:889934477}"
Sep 12 17:26:03.870108 containerd[1886]: time="2025-09-12T17:26:03.870057553Z" level=info msg="TaskExit event in podsandbox handler container_id:\"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\" id:\"3c582fbcfa3b000c2c8dd4a2210971ecdbc26c6783c5c26b2f2cfed22268ed83\" pid:6125 exited_at:{seconds:1757697963 nanos:869884973}"
Sep 12 17:26:16.561046 containerd[1886]: time="2025-09-12T17:26:16.560954778Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67\" id:\"7540ebdf22a52d4dd39ff38c7c86e1e63dfa85ba67156997ed0a4223e1b4ed19\" pid:6150 exited_at:{seconds:1757697976 nanos:560763270}"
Sep 12 17:26:28.443371 containerd[1886]: time="2025-09-12T17:26:28.443329857Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\" id:\"9587bd09dd4cfe8cd9d148d1be41db7c6cdf5e540f8530ba5a6ca85b47153de1\" pid:6173 exited_at:{seconds:1757697988 nanos:443090332}"
Sep 12 17:26:29.886992 containerd[1886]: time="2025-09-12T17:26:29.886823614Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\" id:\"c920f28922d9ae172cd502e540dfba7eb7344bf9720927f960ece44aa865d74c\" pid:6196 exited_at:{seconds:1757697989 nanos:886420205}"
Sep 12 17:26:33.872312 containerd[1886]: time="2025-09-12T17:26:33.872274318Z" level=info msg="TaskExit event in podsandbox handler container_id:\"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\" id:\"c0445cac9a797f028535944c3a1e7eaa92ff8c1c26da2488d7be3169bbaeb254\" pid:6218 exited_at:{seconds:1757697993 nanos:872001440}"
Sep 12 17:26:40.901300 containerd[1886]: time="2025-09-12T17:26:40.901260443Z" level=info msg="TaskExit event in podsandbox handler container_id:\"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\" id:\"9ac7f9556605049ee2614320124bda947aa9814046152daaf8507e0a15df273f\" pid:6240 exited_at:{seconds:1757698000 nanos:901054247}"
Sep 12 17:26:46.557694 containerd[1886]: time="2025-09-12T17:26:46.557649091Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67\" id:\"50f056ce976bb6fa0fb736d6328a6ce6bc31df0b178cc5ad04799c1fd86d8100\" pid:6270 exited_at:{seconds:1757698006 nanos:557253755}"
Sep 12 17:26:48.456591 systemd[1]: Started sshd@7-10.200.20.44:22-10.200.16.10:52288.service - OpenSSH per-connection server daemon (10.200.16.10:52288).
Sep 12 17:26:48.916306 sshd[6286]: Accepted publickey for core from 10.200.16.10 port 52288 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:26:48.918458 sshd-session[6286]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:26:48.923007 systemd-logind[1855]: New session 10 of user core.
Sep 12 17:26:48.927239 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 12 17:26:49.294789 sshd[6290]: Connection closed by 10.200.16.10 port 52288
Sep 12 17:26:49.295460 sshd-session[6286]: pam_unix(sshd:session): session closed for user core
Sep 12 17:26:49.298838 systemd-logind[1855]: Session 10 logged out. Waiting for processes to exit.
Sep 12 17:26:49.299418 systemd[1]: sshd@7-10.200.20.44:22-10.200.16.10:52288.service: Deactivated successfully.
Sep 12 17:26:49.301658 systemd[1]: session-10.scope: Deactivated successfully.
Sep 12 17:26:49.304187 systemd-logind[1855]: Removed session 10.
Sep 12 17:26:54.389185 systemd[1]: Started sshd@8-10.200.20.44:22-10.200.16.10:40380.service - OpenSSH per-connection server daemon (10.200.16.10:40380).
Sep 12 17:26:54.841936 sshd[6305]: Accepted publickey for core from 10.200.16.10 port 40380 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:26:54.842772 sshd-session[6305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:26:54.851759 systemd-logind[1855]: New session 11 of user core.
Sep 12 17:26:54.854264 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 12 17:26:55.222256 sshd[6308]: Connection closed by 10.200.16.10 port 40380
Sep 12 17:26:55.223267 sshd-session[6305]: pam_unix(sshd:session): session closed for user core
Sep 12 17:26:55.226824 systemd[1]: sshd@8-10.200.20.44:22-10.200.16.10:40380.service: Deactivated successfully.
Sep 12 17:26:55.228540 systemd[1]: session-11.scope: Deactivated successfully.
Sep 12 17:26:55.229360 systemd-logind[1855]: Session 11 logged out. Waiting for processes to exit.
Sep 12 17:26:55.230596 systemd-logind[1855]: Removed session 11.
Sep 12 17:26:59.885199 containerd[1886]: time="2025-09-12T17:26:59.885153969Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\" id:\"43e996f81d137c83b1851df1ca72acd72092e962ebf463f21d1c9cda2b558895\" pid:6332 exited_at:{seconds:1757698019 nanos:884911717}"
Sep 12 17:27:00.301332 systemd[1]: Started sshd@9-10.200.20.44:22-10.200.16.10:34516.service - OpenSSH per-connection server daemon (10.200.16.10:34516).
Sep 12 17:27:00.718688 sshd[6342]: Accepted publickey for core from 10.200.16.10 port 34516 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:00.719791 sshd-session[6342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:00.723152 systemd-logind[1855]: New session 12 of user core.
Sep 12 17:27:00.727237 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 12 17:27:01.075420 sshd[6345]: Connection closed by 10.200.16.10 port 34516
Sep 12 17:27:01.075871 sshd-session[6342]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:01.078952 systemd[1]: sshd@9-10.200.20.44:22-10.200.16.10:34516.service: Deactivated successfully.
Sep 12 17:27:01.080509 systemd[1]: session-12.scope: Deactivated successfully.
Sep 12 17:27:01.081090 systemd-logind[1855]: Session 12 logged out. Waiting for processes to exit.
Sep 12 17:27:01.082482 systemd-logind[1855]: Removed session 12.
Sep 12 17:27:01.149767 systemd[1]: Started sshd@10-10.200.20.44:22-10.200.16.10:34518.service - OpenSSH per-connection server daemon (10.200.16.10:34518).
Sep 12 17:27:01.570261 sshd[6358]: Accepted publickey for core from 10.200.16.10 port 34518 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:01.571368 sshd-session[6358]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:01.574984 systemd-logind[1855]: New session 13 of user core.
Sep 12 17:27:01.586248 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 12 17:27:01.950437 sshd[6361]: Connection closed by 10.200.16.10 port 34518
Sep 12 17:27:01.949919 sshd-session[6358]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:01.953520 systemd-logind[1855]: Session 13 logged out. Waiting for processes to exit.
Sep 12 17:27:01.953695 systemd[1]: sshd@10-10.200.20.44:22-10.200.16.10:34518.service: Deactivated successfully.
Sep 12 17:27:01.955232 systemd[1]: session-13.scope: Deactivated successfully.
Sep 12 17:27:01.956547 systemd-logind[1855]: Removed session 13.
Sep 12 17:27:02.037657 systemd[1]: Started sshd@11-10.200.20.44:22-10.200.16.10:34530.service - OpenSSH per-connection server daemon (10.200.16.10:34530).
Sep 12 17:27:02.489236 sshd[6370]: Accepted publickey for core from 10.200.16.10 port 34530 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:02.490350 sshd-session[6370]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:02.494273 systemd-logind[1855]: New session 14 of user core.
Sep 12 17:27:02.502254 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 12 17:27:02.857232 sshd[6373]: Connection closed by 10.200.16.10 port 34530
Sep 12 17:27:02.857834 sshd-session[6370]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:02.861048 systemd[1]: sshd@11-10.200.20.44:22-10.200.16.10:34530.service: Deactivated successfully.
Sep 12 17:27:02.862870 systemd[1]: session-14.scope: Deactivated successfully.
Sep 12 17:27:02.863606 systemd-logind[1855]: Session 14 logged out. Waiting for processes to exit.
Sep 12 17:27:02.864949 systemd-logind[1855]: Removed session 14.
Sep 12 17:27:03.872474 containerd[1886]: time="2025-09-12T17:27:03.872432559Z" level=info msg="TaskExit event in podsandbox handler container_id:\"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\" id:\"8420759415eb387f411bd78b162d488078b17bb4b497369e497880b952cf145a\" pid:6397 exited_at:{seconds:1757698023 nanos:872201555}"
Sep 12 17:27:07.936300 systemd[1]: Started sshd@12-10.200.20.44:22-10.200.16.10:34532.service - OpenSSH per-connection server daemon (10.200.16.10:34532).
Sep 12 17:27:08.350590 sshd[6413]: Accepted publickey for core from 10.200.16.10 port 34532 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:08.351522 sshd-session[6413]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:08.354978 systemd-logind[1855]: New session 15 of user core.
Sep 12 17:27:08.362267 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 12 17:27:08.698435 sshd[6420]: Connection closed by 10.200.16.10 port 34532
Sep 12 17:27:08.698969 sshd-session[6413]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:08.701690 systemd-logind[1855]: Session 15 logged out. Waiting for processes to exit.
Sep 12 17:27:08.702586 systemd[1]: sshd@12-10.200.20.44:22-10.200.16.10:34532.service: Deactivated successfully.
Sep 12 17:27:08.705104 systemd[1]: session-15.scope: Deactivated successfully.
Sep 12 17:27:08.707652 systemd-logind[1855]: Removed session 15.
Sep 12 17:27:13.785281 systemd[1]: Started sshd@13-10.200.20.44:22-10.200.16.10:57462.service - OpenSSH per-connection server daemon (10.200.16.10:57462).
Sep 12 17:27:14.242218 sshd[6433]: Accepted publickey for core from 10.200.16.10 port 57462 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:14.243308 sshd-session[6433]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:14.248205 systemd-logind[1855]: New session 16 of user core.
Sep 12 17:27:14.257267 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 12 17:27:14.610926 sshd[6436]: Connection closed by 10.200.16.10 port 57462
Sep 12 17:27:14.610764 sshd-session[6433]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:14.614051 systemd[1]: sshd@13-10.200.20.44:22-10.200.16.10:57462.service: Deactivated successfully.
Sep 12 17:27:14.617109 systemd[1]: session-16.scope: Deactivated successfully.
Sep 12 17:27:14.617820 systemd-logind[1855]: Session 16 logged out. Waiting for processes to exit.
Sep 12 17:27:14.618871 systemd-logind[1855]: Removed session 16.
Sep 12 17:27:16.565736 containerd[1886]: time="2025-09-12T17:27:16.565697396Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67\" id:\"4d9b7edcc54a4e880fac5efa1b16a74c775ad20cdbc617c61bf400278a7430d0\" pid:6480 exited_at:{seconds:1757698036 nanos:565240619}"
Sep 12 17:27:19.694463 systemd[1]: Started sshd@14-10.200.20.44:22-10.200.16.10:57474.service - OpenSSH per-connection server daemon (10.200.16.10:57474).
Sep 12 17:27:19.735237 update_engine[1865]: I20250912 17:27:19.735200 1865 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Sep 12 17:27:19.736698 update_engine[1865]: I20250912 17:27:19.736184 1865 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Sep 12 17:27:19.738207 update_engine[1865]: I20250912 17:27:19.737505 1865 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Sep 12 17:27:19.738933 update_engine[1865]: I20250912 17:27:19.738880 1865 omaha_request_params.cc:62] Current group set to beta
Sep 12 17:27:19.739709 update_engine[1865]: I20250912 17:27:19.739684 1865 update_attempter.cc:499] Already updated boot flags. Skipping.
Sep 12 17:27:19.742828 update_engine[1865]: I20250912 17:27:19.742169 1865 update_attempter.cc:643] Scheduling an action processor start.
Sep 12 17:27:19.742828 update_engine[1865]: I20250912 17:27:19.742211 1865 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Sep 12 17:27:19.745272 update_engine[1865]: I20250912 17:27:19.745251 1865 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Sep 12 17:27:19.745411 update_engine[1865]: I20250912 17:27:19.745394 1865 omaha_request_action.cc:271] Posting an Omaha request to disabled
Sep 12 17:27:19.746107 update_engine[1865]: I20250912 17:27:19.745472 1865 omaha_request_action.cc:272] Request:
Sep 12 17:27:19.746107 update_engine[1865]:
Sep 12 17:27:19.746107 update_engine[1865]:
Sep 12 17:27:19.746107 update_engine[1865]:
Sep 12 17:27:19.746107 update_engine[1865]:
Sep 12 17:27:19.746107 update_engine[1865]:
Sep 12 17:27:19.746107 update_engine[1865]:
Sep 12 17:27:19.746107 update_engine[1865]:
Sep 12 17:27:19.746107 update_engine[1865]:
Sep 12 17:27:19.746107 update_engine[1865]: I20250912 17:27:19.745487 1865 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:27:19.749338 update_engine[1865]: I20250912 17:27:19.749315 1865 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:27:19.750153 update_engine[1865]: I20250912 17:27:19.749634 1865 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:27:19.755595 locksmithd[1978]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Sep 12 17:27:19.822866 update_engine[1865]: E20250912 17:27:19.822804 1865 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:27:19.822964 update_engine[1865]: I20250912 17:27:19.822894 1865 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Sep 12 17:27:20.155953 sshd[6492]: Accepted publickey for core from 10.200.16.10 port 57474 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:20.157033 sshd-session[6492]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:20.160522 systemd-logind[1855]: New session 17 of user core.
Sep 12 17:27:20.166231 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 12 17:27:20.536233 sshd[6495]: Connection closed by 10.200.16.10 port 57474
Sep 12 17:27:20.536050 sshd-session[6492]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:20.539415 systemd[1]: sshd@14-10.200.20.44:22-10.200.16.10:57474.service: Deactivated successfully.
Sep 12 17:27:20.540941 systemd[1]: session-17.scope: Deactivated successfully.
Sep 12 17:27:20.541913 systemd-logind[1855]: Session 17 logged out. Waiting for processes to exit.
Sep 12 17:27:20.543416 systemd-logind[1855]: Removed session 17.
Sep 12 17:27:25.621938 systemd[1]: Started sshd@15-10.200.20.44:22-10.200.16.10:44768.service - OpenSSH per-connection server daemon (10.200.16.10:44768).
Sep 12 17:27:26.069675 sshd[6508]: Accepted publickey for core from 10.200.16.10 port 44768 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:26.071724 sshd-session[6508]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:26.077827 systemd-logind[1855]: New session 18 of user core.
Sep 12 17:27:26.082243 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 12 17:27:26.455932 sshd[6511]: Connection closed by 10.200.16.10 port 44768
Sep 12 17:27:26.457087 sshd-session[6508]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:26.460819 systemd[1]: sshd@15-10.200.20.44:22-10.200.16.10:44768.service: Deactivated successfully.
Sep 12 17:27:26.463004 systemd[1]: session-18.scope: Deactivated successfully.
Sep 12 17:27:26.464926 systemd-logind[1855]: Session 18 logged out. Waiting for processes to exit.
Sep 12 17:27:26.466335 systemd-logind[1855]: Removed session 18.
Sep 12 17:27:26.546628 systemd[1]: Started sshd@16-10.200.20.44:22-10.200.16.10:44774.service - OpenSSH per-connection server daemon (10.200.16.10:44774).
Sep 12 17:27:27.003156 sshd[6523]: Accepted publickey for core from 10.200.16.10 port 44774 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:27.004982 sshd-session[6523]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:27.008503 systemd-logind[1855]: New session 19 of user core.
Sep 12 17:27:27.019328 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 12 17:27:27.530599 sshd[6526]: Connection closed by 10.200.16.10 port 44774
Sep 12 17:27:27.531566 sshd-session[6523]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:27.534900 systemd[1]: sshd@16-10.200.20.44:22-10.200.16.10:44774.service: Deactivated successfully.
Sep 12 17:27:27.536895 systemd[1]: session-19.scope: Deactivated successfully.
Sep 12 17:27:27.537789 systemd-logind[1855]: Session 19 logged out. Waiting for processes to exit.
Sep 12 17:27:27.539704 systemd-logind[1855]: Removed session 19.
Sep 12 17:27:27.611057 systemd[1]: Started sshd@17-10.200.20.44:22-10.200.16.10:44786.service - OpenSSH per-connection server daemon (10.200.16.10:44786).
Sep 12 17:27:28.021171 sshd[6536]: Accepted publickey for core from 10.200.16.10 port 44786 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:28.022268 sshd-session[6536]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:28.025747 systemd-logind[1855]: New session 20 of user core.
Sep 12 17:27:28.039251 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 12 17:27:28.448216 containerd[1886]: time="2025-09-12T17:27:28.448173971Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\" id:\"d5c17872b4c9d31d907baa9d10316930d366004ad9006bffa5179f767f63e0d1\" pid:6557 exited_at:{seconds:1757698048 nanos:447746295}"
Sep 12 17:27:28.828788 sshd[6539]: Connection closed by 10.200.16.10 port 44786
Sep 12 17:27:28.829018 sshd-session[6536]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:28.832638 systemd[1]: sshd@17-10.200.20.44:22-10.200.16.10:44786.service: Deactivated successfully.
Sep 12 17:27:28.834852 systemd[1]: session-20.scope: Deactivated successfully.
Sep 12 17:27:28.837540 systemd-logind[1855]: Session 20 logged out. Waiting for processes to exit.
Sep 12 17:27:28.838874 systemd-logind[1855]: Removed session 20.
Sep 12 17:27:28.922879 systemd[1]: Started sshd@18-10.200.20.44:22-10.200.16.10:44796.service - OpenSSH per-connection server daemon (10.200.16.10:44796).
Sep 12 17:27:29.422384 sshd[6578]: Accepted publickey for core from 10.200.16.10 port 44796 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:29.423984 sshd-session[6578]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:29.427576 systemd-logind[1855]: New session 21 of user core.
Sep 12 17:27:29.435276 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 12 17:27:29.730508 update_engine[1865]: I20250912 17:27:29.730235 1865 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:27:29.731494 update_engine[1865]: I20250912 17:27:29.730770 1865 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:27:29.731494 update_engine[1865]: I20250912 17:27:29.730985 1865 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:27:29.806704 update_engine[1865]: E20250912 17:27:29.806607 1865 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:27:29.806704 update_engine[1865]: I20250912 17:27:29.806679 1865 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Sep 12 17:27:29.889335 containerd[1886]: time="2025-09-12T17:27:29.889297140Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\" id:\"b43cb4b52d5c9b8500be104d3ece1763e0df33a2e178cd450c650276634a3d20\" pid:6600 exited_at:{seconds:1757698049 nanos:889064368}"
Sep 12 17:27:29.910065 sshd[6581]: Connection closed by 10.200.16.10 port 44796
Sep 12 17:27:29.910391 sshd-session[6578]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:29.914312 systemd-logind[1855]: Session 21 logged out. Waiting for processes to exit.
Sep 12 17:27:29.914497 systemd[1]: sshd@18-10.200.20.44:22-10.200.16.10:44796.service: Deactivated successfully.
Sep 12 17:27:29.917434 systemd[1]: session-21.scope: Deactivated successfully.
Sep 12 17:27:29.919352 systemd-logind[1855]: Removed session 21.
Sep 12 17:27:29.985471 systemd[1]: Started sshd@19-10.200.20.44:22-10.200.16.10:48446.service - OpenSSH per-connection server daemon (10.200.16.10:48446).
Sep 12 17:27:30.397272 sshd[6614]: Accepted publickey for core from 10.200.16.10 port 48446 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:30.398367 sshd-session[6614]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:30.402384 systemd-logind[1855]: New session 22 of user core.
Sep 12 17:27:30.405267 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 12 17:27:30.750810 sshd[6617]: Connection closed by 10.200.16.10 port 48446
Sep 12 17:27:30.751435 sshd-session[6614]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:30.754470 systemd[1]: session-22.scope: Deactivated successfully.
Sep 12 17:27:30.754531 systemd-logind[1855]: Session 22 logged out. Waiting for processes to exit.
Sep 12 17:27:30.754972 systemd[1]: sshd@19-10.200.20.44:22-10.200.16.10:48446.service: Deactivated successfully.
Sep 12 17:27:30.758718 systemd-logind[1855]: Removed session 22.
Sep 12 17:27:33.870583 containerd[1886]: time="2025-09-12T17:27:33.870540846Z" level=info msg="TaskExit event in podsandbox handler container_id:\"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\" id:\"a32f9c26f78eb022db6f814723dc46a18b3b677d9bb047904c4b820e2c0bbcd7\" pid:6643 exited_at:{seconds:1757698053 nanos:870343490}"
Sep 12 17:27:35.828906 systemd[1]: Started sshd@20-10.200.20.44:22-10.200.16.10:48458.service - OpenSSH per-connection server daemon (10.200.16.10:48458).
Sep 12 17:27:36.240164 sshd[6652]: Accepted publickey for core from 10.200.16.10 port 48458 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:36.241257 sshd-session[6652]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:36.244935 systemd-logind[1855]: New session 23 of user core.
Sep 12 17:27:36.250249 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 12 17:27:36.578409 sshd[6657]: Connection closed by 10.200.16.10 port 48458
Sep 12 17:27:36.578666 sshd-session[6652]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:36.582443 systemd-logind[1855]: Session 23 logged out. Waiting for processes to exit.
Sep 12 17:27:36.583065 systemd[1]: sshd@20-10.200.20.44:22-10.200.16.10:48458.service: Deactivated successfully.
Sep 12 17:27:36.585119 systemd[1]: session-23.scope: Deactivated successfully.
Sep 12 17:27:36.586820 systemd-logind[1855]: Removed session 23.
Sep 12 17:27:39.733801 update_engine[1865]: I20250912 17:27:39.733261 1865 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:27:39.733801 update_engine[1865]: I20250912 17:27:39.733465 1865 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:27:39.733801 update_engine[1865]: I20250912 17:27:39.733690 1865 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:27:39.743272 update_engine[1865]: E20250912 17:27:39.743240 1865 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:27:39.743408 update_engine[1865]: I20250912 17:27:39.743379 1865 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Sep 12 17:27:40.903770 containerd[1886]: time="2025-09-12T17:27:40.903730512Z" level=info msg="TaskExit event in podsandbox handler container_id:\"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\" id:\"5418a17abee31398daa885bec81acb97aa13ea131d280db55a5a1c836c8bf82b\" pid:6680 exited_at:{seconds:1757698060 nanos:903363545}"
Sep 12 17:27:41.666335 systemd[1]: Started sshd@21-10.200.20.44:22-10.200.16.10:56038.service - OpenSSH per-connection server daemon (10.200.16.10:56038).
Sep 12 17:27:42.119646 sshd[6691]: Accepted publickey for core from 10.200.16.10 port 56038 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:42.120701 sshd-session[6691]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:42.124740 systemd-logind[1855]: New session 24 of user core.
Sep 12 17:27:42.130236 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 12 17:27:42.493894 sshd[6694]: Connection closed by 10.200.16.10 port 56038
Sep 12 17:27:42.494470 sshd-session[6691]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:42.498646 systemd-logind[1855]: Session 24 logged out. Waiting for processes to exit.
Sep 12 17:27:42.498930 systemd[1]: sshd@21-10.200.20.44:22-10.200.16.10:56038.service: Deactivated successfully.
Sep 12 17:27:42.500718 systemd[1]: session-24.scope: Deactivated successfully.
Sep 12 17:27:42.502760 systemd-logind[1855]: Removed session 24.
Sep 12 17:27:46.554245 containerd[1886]: time="2025-09-12T17:27:46.554203753Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5c45c67467f07c0751a9c139448df8357573caebf1e31f41ebecff726904cb67\" id:\"2d4eaae60a662aacdc662919e982d7ee77021bcfb2579a98cc371d7babaffb6e\" pid:6719 exited_at:{seconds:1757698066 nanos:553981829}"
Sep 12 17:27:47.571819 systemd[1]: Started sshd@22-10.200.20.44:22-10.200.16.10:56054.service - OpenSSH per-connection server daemon (10.200.16.10:56054).
Sep 12 17:27:47.998755 sshd[6731]: Accepted publickey for core from 10.200.16.10 port 56054 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:47.999924 sshd-session[6731]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:48.003875 systemd-logind[1855]: New session 25 of user core.
Sep 12 17:27:48.010242 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 12 17:27:48.361235 sshd[6734]: Connection closed by 10.200.16.10 port 56054
Sep 12 17:27:48.360900 sshd-session[6731]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:48.364247 systemd[1]: sshd@22-10.200.20.44:22-10.200.16.10:56054.service: Deactivated successfully.
Sep 12 17:27:48.365982 systemd[1]: session-25.scope: Deactivated successfully.
Sep 12 17:27:48.366899 systemd-logind[1855]: Session 25 logged out. Waiting for processes to exit.
Sep 12 17:27:48.368442 systemd-logind[1855]: Removed session 25.
Sep 12 17:27:49.731225 update_engine[1865]: I20250912 17:27:49.731163 1865 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:27:49.732200 update_engine[1865]: I20250912 17:27:49.731680 1865 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:27:49.732439 update_engine[1865]: I20250912 17:27:49.732343 1865 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:27:49.744494 update_engine[1865]: E20250912 17:27:49.744199 1865 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:27:49.744494 update_engine[1865]: I20250912 17:27:49.744246 1865 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Sep 12 17:27:49.744494 update_engine[1865]: I20250912 17:27:49.744252 1865 omaha_request_action.cc:617] Omaha request response:
Sep 12 17:27:49.744494 update_engine[1865]: E20250912 17:27:49.744328 1865 omaha_request_action.cc:636] Omaha request network transfer failed.
Sep 12 17:27:49.744494 update_engine[1865]: I20250912 17:27:49.744343 1865 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Sep 12 17:27:49.744494 update_engine[1865]: I20250912 17:27:49.744347 1865 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Sep 12 17:27:49.744494 update_engine[1865]: I20250912 17:27:49.744350 1865 update_attempter.cc:306] Processing Done.
Sep 12 17:27:49.744494 update_engine[1865]: E20250912 17:27:49.744374 1865 update_attempter.cc:619] Update failed.
Sep 12 17:27:49.744494 update_engine[1865]: I20250912 17:27:49.744378 1865 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Sep 12 17:27:49.744494 update_engine[1865]: I20250912 17:27:49.744386 1865 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Sep 12 17:27:49.744494 update_engine[1865]: I20250912 17:27:49.744390 1865 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Sep 12 17:27:49.744875 update_engine[1865]: I20250912 17:27:49.744853 1865 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Sep 12 17:27:49.745055 update_engine[1865]: I20250912 17:27:49.744933 1865 omaha_request_action.cc:271] Posting an Omaha request to disabled
Sep 12 17:27:49.745055 update_engine[1865]: I20250912 17:27:49.744946 1865 omaha_request_action.cc:272] Request:
Sep 12 17:27:49.745055 update_engine[1865]:
Sep 12 17:27:49.745055 update_engine[1865]:
Sep 12 17:27:49.745055 update_engine[1865]:
Sep 12 17:27:49.745055 update_engine[1865]:
Sep 12 17:27:49.745055 update_engine[1865]:
Sep 12 17:27:49.745055 update_engine[1865]:
Sep 12 17:27:49.745055 update_engine[1865]: I20250912 17:27:49.744951 1865 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:27:49.745582 locksmithd[1978]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Sep 12 17:27:49.745775 update_engine[1865]: I20250912 17:27:49.745241 1865 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:27:49.745775 update_engine[1865]: I20250912 17:27:49.745419 1865 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:27:49.783398 update_engine[1865]: E20250912 17:27:49.783245 1865 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:27:49.783398 update_engine[1865]: I20250912 17:27:49.783298 1865 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Sep 12 17:27:49.783398 update_engine[1865]: I20250912 17:27:49.783303 1865 omaha_request_action.cc:617] Omaha request response:
Sep 12 17:27:49.783398 update_engine[1865]: I20250912 17:27:49.783308 1865 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Sep 12 17:27:49.783398 update_engine[1865]: I20250912 17:27:49.783313 1865 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Sep 12 17:27:49.783398 update_engine[1865]: I20250912 17:27:49.783316 1865 update_attempter.cc:306] Processing Done.
Sep 12 17:27:49.783398 update_engine[1865]: I20250912 17:27:49.783320 1865 update_attempter.cc:310] Error event sent.
Sep 12 17:27:49.783398 update_engine[1865]: I20250912 17:27:49.783329 1865 update_check_scheduler.cc:74] Next update check in 49m35s
Sep 12 17:27:49.783857 locksmithd[1978]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Sep 12 17:27:53.443159 systemd[1]: Started sshd@23-10.200.20.44:22-10.200.16.10:36902.service - OpenSSH per-connection server daemon (10.200.16.10:36902).
Sep 12 17:27:53.900953 sshd[6745]: Accepted publickey for core from 10.200.16.10 port 36902 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:53.902051 sshd-session[6745]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:53.905639 systemd-logind[1855]: New session 26 of user core.
Sep 12 17:27:53.910385 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 12 17:27:54.270778 sshd[6748]: Connection closed by 10.200.16.10 port 36902
Sep 12 17:27:54.270614 sshd-session[6745]: pam_unix(sshd:session): session closed for user core
Sep 12 17:27:54.273958 systemd[1]: sshd@23-10.200.20.44:22-10.200.16.10:36902.service: Deactivated successfully.
Sep 12 17:27:54.275996 systemd[1]: session-26.scope: Deactivated successfully.
Sep 12 17:27:54.276973 systemd-logind[1855]: Session 26 logged out. Waiting for processes to exit.
Sep 12 17:27:54.278684 systemd-logind[1855]: Removed session 26.
Sep 12 17:27:59.346557 systemd[1]: Started sshd@24-10.200.20.44:22-10.200.16.10:36914.service - OpenSSH per-connection server daemon (10.200.16.10:36914).
Sep 12 17:27:59.759173 sshd[6760]: Accepted publickey for core from 10.200.16.10 port 36914 ssh2: RSA SHA256:+on2THTR/nRp7vYd/q00sis2kB6WPhgipQlvfvqeQ7E
Sep 12 17:27:59.760262 sshd-session[6760]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:27:59.764072 systemd-logind[1855]: New session 27 of user core.
Sep 12 17:27:59.769249 systemd[1]: Started session-27.scope - Session 27 of User core.
Sep 12 17:27:59.884079 containerd[1886]: time="2025-09-12T17:27:59.884042041Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80ce711fab9081f96308e95a8467c272070b466dbe4044e655998357af397403\" id:\"e40f9438665826ddb309337b4a3860b2ffed60591aadd53e513f489c5b15321f\" pid:6778 exited_at:{seconds:1757698079 nanos:883327595}"
Sep 12 17:28:00.098978 sshd[6763]: Connection closed by 10.200.16.10 port 36914
Sep 12 17:28:00.099645 sshd-session[6760]: pam_unix(sshd:session): session closed for user core
Sep 12 17:28:00.103054 systemd[1]: sshd@24-10.200.20.44:22-10.200.16.10:36914.service: Deactivated successfully.
Sep 12 17:28:00.105206 systemd[1]: session-27.scope: Deactivated successfully.
Sep 12 17:28:00.105844 systemd-logind[1855]: Session 27 logged out. Waiting for processes to exit.
Sep 12 17:28:00.107302 systemd-logind[1855]: Removed session 27.
Sep 12 17:28:03.873776 containerd[1886]: time="2025-09-12T17:28:03.873733019Z" level=info msg="TaskExit event in podsandbox handler container_id:\"496ad7900add400989d0ec393de402e1dea2cb5478991520aa4909401e77fffe\" id:\"41166609d73c8e09e69d2591bef98b57b84571fe7b8e9bd86caa053815e41cf7\" pid:6810 exited_at:{seconds:1757698083 nanos:873445854}"